Sample records for "requires efficient tools"

  1. Development of High Efficiency (14%) Solar Cell Array Module

    NASA Technical Reports Server (NTRS)

    Iles, P. A.; Khemthong, S.; Olah, S.; Sampson, W. J.; Ling, K. S.

    1979-01-01

    The high efficiency solar cells required for the low cost modules were developed, and the production tooling for the manufacture of the cells and modules was designed. The tooling consisted of: (1) a back contact soldering machine; (2) a vacuum pickup; (3) antireflective coating tooling; and (4) a test fixture.

  2. New Automotive Air Conditioning System Simulation Tool Developed in MATLAB/Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiss, T.; Chaney, L.; Meyer, J.

    Further improvements in vehicle fuel efficiency require accurate evaluation of the vehicle's transient total power requirement. When operated, the air conditioning (A/C) system is the largest auxiliary load on a vehicle; therefore, accurate evaluation of the load it places on the vehicle's engine and/or energy storage system is especially important. Vehicle simulation software, such as 'Autonomie,' has been used by OEMs to evaluate vehicles' energy performance. A transient A/C simulation tool incorporated into vehicle simulation models would also provide a tool for developing more efficient A/C systems through a thorough consideration of the transient A/C system performance. The dynamic system simulation software MATLAB/Simulink was used to develop new and more efficient vehicle energy system controls. The various modeling methods used for the new simulation tool are described in detail. Comparison with measured data is provided to demonstrate the validity of the model.
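
    Illustrative note: the record above describes a MATLAB/Simulink tool; as a minimal sketch of the kind of transient load calculation involved, the Python below integrates a one-node lumped-capacitance cabin with an on/off A/C load. Every parameter value is hypothetical and unrelated to the NREL tool.

      # Minimal transient cabin cool-down sketch (illustrative only; parameter
      # values are made up and are not taken from the Simulink tool above).
      def simulate_cooldown(t_cabin=45.0, t_ambient=35.0, t_set=22.0,
                            ua=120.0, capacity=4000.0, thermal_mass=8.0e4,
                            dt=1.0, duration=1800):
          """Explicit-Euler integration of C*dT/dt = UA*(T_amb - T) - Q_ac."""
          history = []
          for step in range(int(duration / dt)):
              q_ac = capacity if t_cabin > t_set else 0.0       # on/off control, W
              q_ambient = ua * (t_ambient - t_cabin)            # ambient heat gain, W
              t_cabin += dt * (q_ambient - q_ac) / thermal_mass # temperature update
              history.append((step * dt, t_cabin, q_ac))
          return history

      if __name__ == "__main__":
          for t, temp, load in simulate_cooldown()[::300]:
              print(f"t={t:6.0f} s  cabin={temp:5.1f} C  A/C load={load:4.0f} W")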

  3. Exploration of depth modeling mode one lossless wedgelets storage strategies for 3D-high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Sanchez, Gustavo; Marcon, César; Agostini, Luciano Volcan

    2018-01-01

    The 3D-high efficiency video coding standard has introduced tools to obtain higher efficiency in 3-D video coding, most of them related to depth map coding. Among these tools, depth modeling mode-1 (DMM-1) focuses on better encoding the edge regions of depth maps. The large memory required for storing all wedgelet patterns is one of the bottlenecks in DMM-1 hardware design for both the encoder and the decoder, since many patterns must be stored. Three algorithms to reduce the DMM-1 memory requirements, and a hardware design targeting the most efficient of these algorithms, are presented. Experimental results demonstrate that the proposed solutions surpass related works, reducing the wedgelet memory by up to 78.8% without degrading encoding efficiency. Synthesis results demonstrate that the proposed algorithm reduces power dissipation by almost 75% when compared to the standard approach.

  4. An end user evaluation of query formulation and results review tools in three medical meta-search engines.

    PubMed

    Leroy, Gondy; Xu, Jennifer; Chung, Wingyan; Eggers, Shauna; Chen, Hsinchun

    2007-01-01

    Retrieving sufficient relevant information online is difficult for many people because they use too few keywords to search and search engines do not provide many support tools. To further complicate the search, users often ignore support tools when available. Our goal is to evaluate in a realistic setting when users use support tools and how they perceive these tools. We compared three medical search engines with support tools that require more or less effort from users to form a query and evaluate results. We carried out an end user study with 23 users who were asked to find information, i.e., subtopics and supporting abstracts, for a given theme. We used a balanced within-subjects design and report on the effectiveness, efficiency and usability of the support tools from the end user perspective. We found significant differences in efficiency but did not find significant differences in effectiveness between the three search engines. Dynamic user support tools requiring less effort led to higher efficiency. Fewer searches were needed and more documents were found per search when both query reformulation and result review tools dynamically adjust to the user query. The query reformulation tool that provided a long list of keywords, dynamically adjusted to the user query, was used most often and led to more subtopics. As hypothesized, the dynamic result review tools were used more often and led to more subtopics than static ones. These results were corroborated by the usability questionnaires, which showed that support tools that dynamically optimize output were preferred.

  5. Knowledge management: An abstraction of knowledge base and database management systems

    NASA Technical Reports Server (NTRS)

    Riedesel, Joel D.

    1990-01-01

    Artificial intelligence application requirements demand powerful representation capabilities as well as efficiency for real-time domains. Many tools exist, the most prevalent being expert system tools such as ART, KEE, OPS5, and CLIPS. Other tools just emerging from the research environment are truth maintenance systems for representing non-monotonic knowledge, constraint systems, object-oriented programming, and qualitative reasoning. Unfortunately, as many knowledge engineers have experienced, simply applying a tool to an application requires a large amount of effort to bend the application to fit, and much supporting work goes into making the tool integrate effectively. A Knowledge Management Design System (KNOMAD) is described, which is a collection of tools built in layers. The layered architecture provides two major benefits: the ability to flexibly apply only those tools that are necessary for an application, and the ability to keep overhead, and thus inefficiency, to a minimum. KNOMAD is designed to manage many knowledge bases in a distributed environment, providing maximum flexibility and expressivity to the knowledge engineer while also providing support for efficiency.

  6. DooSo6: Easy Collaboration over Shared Projects

    NASA Astrophysics Data System (ADS)

    Ignat, Claudia-Lavinia; Oster, Gérald; Molli, Pascal

    Existing tools for supporting parallel work have some disadvantages that prevent them from being widely used. Very often they require a complex installation and the creation of accounts for all group members. Users need to learn and deal with complex commands to use these collaborative tools efficiently. Some tools require users to abandon their favourite editors and force them to use a particular co-authorship application. In this paper, we propose the DooSo6 collaboration tool, which offers support for parallel work, requires no installation and no creation of accounts, and is easy to use, users being able to continue working with their favourite editors. User authentication is achieved by means of a capability-based mechanism.

  7. 47 CFR 80.867 - Ship station tools, instruction books, circuit diagrams and testing equipment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 5 2013-10-01 2013-10-01 false Ship station tools, instruction books, circuit... Requirements for Cargo Vessels Not Subject to Subpart W § 80.867 Ship station tools, instruction books, circuit..., instruction books and circuit diagrams to enable the radiotelephone installation to be maintained in efficient...

  8. 47 CFR 80.867 - Ship station tools, instruction books, circuit diagrams and testing equipment.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Ship station tools, instruction books, circuit... Requirements for Cargo Vessels Not Subject to Subpart W § 80.867 Ship station tools, instruction books, circuit..., instruction books and circuit diagrams to enable the radiotelephone installation to be maintained in efficient...

  9. 47 CFR 80.867 - Ship station tools, instruction books, circuit diagrams and testing equipment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 5 2014-10-01 2014-10-01 false Ship station tools, instruction books, circuit... Requirements for Cargo Vessels Not Subject to Subpart W § 80.867 Ship station tools, instruction books, circuit..., instruction books and circuit diagrams to enable the radiotelephone installation to be maintained in efficient...

  10. 47 CFR 80.867 - Ship station tools, instruction books, circuit diagrams and testing equipment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Ship station tools, instruction books, circuit... Requirements for Cargo Vessels Not Subject to Subpart W § 80.867 Ship station tools, instruction books, circuit..., instruction books and circuit diagrams to enable the radiotelephone installation to be maintained in efficient...

  11. GAPIT version 2: an enhanced integrated tool for genomic association and prediction

    USDA-ARS?s Scientific Manuscript database

    Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...

  12. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  13. A robot end effector exchange mechanism for space applications

    NASA Technical Reports Server (NTRS)

    Gorin, Barney F.

    1990-01-01

    Efficient robot operation requires the use of specialized end effectors or tools for specific tasks. In spacecraft applications, the microgravity environment precludes the use of gravitational forces to retain the tools in a holding fixture. As a result, a retention mechanism which forms a part of the tool storage container is required. A unique approach to this problem has resulted in the development of an end effector exchange mechanism that meets the requirements for spaceflight applications while avoiding the complexity usually involved. This mechanism uses multiple latching cams, both on the manipulator and in the tool storage container, combined with a system of catch rings to provide retention in both locations and the required failure tolerance. Because of the cam configuration, the mechanism operates passively, requiring no electrical commands except those needed to move the manipulator into position. Similarly, it inherently provides interlocks to prevent the release of one cam before its opposite number is engaged.

  14. Live minimal path for interactive segmentation of medical images

    NASA Astrophysics Data System (ADS)

    Chartrand, Gabriel; Tang, An; Chav, Ramnada; Cresson, Thierry; Chantrel, Steeve; De Guise, Jacques A.

    2015-03-01

    Medical image segmentation is nowadays required for medical device development and in a growing number of clinical and research applications. Since dedicated automatic segmentation methods are not always available, generic and efficient interactive tools can alleviate the burden of manual segmentation. In this paper we propose an interactive segmentation tool based on image warping and minimal path segmentation that is efficient for a wide variety of segmentation tasks. While the user roughly delineates the desired organ's boundary, a narrow band along the cursor's path is straightened, providing an ideal subspace for feature-aligned filtering and the minimal path algorithm. Once the segmentation is performed on the narrow band, the path is warped back onto the original image, precisely delineating the desired structure. This tool was found to have a highly intuitive dynamic behavior. It is especially robust against misleading edges and requires only coarse interaction from the user to achieve good precision. The proposed segmentation method was tested on 10 difficult liver segmentations on CT and MRI images, and the resulting 2D overlap Dice coefficient was 99% on average.
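
    For illustration, the minimal-path step described above can be sketched as Dijkstra's algorithm on a 2-D pixel cost map. The Python below is a generic sketch only; it omits the paper's narrow-band warping and feature-aligned filtering, and the toy cost map is invented.

      import heapq
      import numpy as np

      def minimal_path(cost, start, end):
          """Dijkstra shortest path on a 2-D cost image (4-connected pixels)."""
          h, w = cost.shape
          dist = np.full((h, w), np.inf)
          prev = {}
          dist[start] = cost[start]
          heap = [(cost[start], start)]
          while heap:
              d, (r, c) = heapq.heappop(heap)
              if (r, c) == end:
                  break
              if d > dist[r, c]:
                  continue                      # stale heap entry
              for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                  if 0 <= nr < h and 0 <= nc < w:
                      nd = d + cost[nr, nc]
                      if nd < dist[nr, nc]:
                          dist[nr, nc] = nd
                          prev[(nr, nc)] = (r, c)
                          heapq.heappush(heap, (nd, (nr, nc)))
          path, node = [end], end               # backtrack end -> start
          while node != start:
              node = prev[node]
              path.append(node)
          return path[::-1]

      # Toy cost map: a low-cost "edge" along the middle row attracts the path.
      cost = np.ones((5, 8))
      cost[2, :] = 0.1
      print(minimal_path(cost, (2, 0), (2, 7)))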

  15. Research on the tool holder mode in high speed machining

    NASA Astrophysics Data System (ADS)

    Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao

    2018-03-01

    High speed machining technology can improve processing efficiency and precision while also reducing processing cost; the technology is therefore highly regarded in industry. With the extensive application of high-speed machining technology, high-speed tool systems place higher and higher requirements on the tool chuck. At present, several new kinds of chucks are used in high speed precision machining, including the heat shrinkage tool-holder, the high-precision spring chuck, the hydraulic tool-holder, and the three-rib deformation chuck. Among them, the heat shrinkage tool-holder has the advantages of high precision, high clamping force, high bending rigidity and good dynamic balance, and is therefore widely used. It is thus of great significance to research the new requirements placed on the machining tool system. In order to meet the requirements of high speed precision machining technology, this paper describes the common tool holder technologies for high precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics and existing problems of tool clamping systems are also analyzed.

  16. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continued use of AD tools for solving geophysical problems on modern computer architectures.
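
    As a toy illustration of the operator-overloading flavour of AD discussed above (the four compared tools target Fortran codes and are not reproduced here), the Python sketch below differentiates a scalar polynomial with forward-mode dual numbers.

      class Dual:
          """Forward-mode AD by operator overloading: carries (value, derivative)."""
          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv

          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value + other.value, self.deriv + other.deriv)

          __radd__ = __add__

          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value * other.value,
                          self.deriv * other.value + self.value * other.deriv)

          __rmul__ = __mul__

      def f(x):
          return 3.0 * x * x + 2.0 * x + 1.0   # f'(x) = 6x + 2

      x = Dual(2.0, 1.0)                       # seed the derivative dx/dx = 1
      y = f(x)
      print(y.value, y.deriv)                  # 17.0 14.0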

  17. Efficient monitoring of CRAB jobs at CMS

    NASA Astrophysics Data System (ADS)

    Silva, J. M. D.; Balcas, J.; Belforte, S.; Ciangottini, D.; Mascheroni, M.; Rupeika, E. A.; Ivanov, T. T.; Hernandez, J. M.; Vaandering, E.

    2017-10-01

    CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.

  18. Efficient Monitoring of CRAB Jobs at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, J. M.D.; Balcas, J.; Belforte, S.

    CRAB is a tool used for distributed analysis of CMS data. Users can submit sets of jobs with similar requirements (tasks) with a single request. CRAB uses a client-server architecture, where a lightweight client, a server, and ancillary services work together and are maintained by CMS operators at CERN. As with most complex software, good monitoring tools are crucial for efficient use and long-term maintainability. This work gives an overview of the monitoring tools developed to ensure the CRAB server and infrastructure are functional, help operators debug user problems, and minimize overhead and operating cost. This work also illustrates the design choices and gives a report on our experience with the tools we developed and the external ones we used.

  19. A Qualitative Analysis of Information Sharing in Hospice Interdisciplinary Group Meetings.

    PubMed

    Washington, Karla T; Demiris, George; Parker Oliver, Debra; Swarz, Jeffrey A; Lewis, Alexandria M; Backonja, Uba

    2017-12-01

    In the United States, hospice agencies are required to convene interdisciplinary group (IDG) meetings no less frequently than every 15 days to review patients' care plans. Challenges associated with information sharing during these meetings can impede efficiency and frustrate attendees. We sought to examine information sharing in the context of hospice IDG meetings as a first step toward developing an informatics tool to support interdisciplinary collaboration in this setting. Specifically, we wanted to better understand the purpose of information sharing in IDG meetings and determine the type(s) of information required to fulfill that purpose. Methods, Setting, and Participants: In this qualitative descriptive study, we analyzed video recordings of care plan discussions (n = 57) in hospice IDG meetings and individual interviews of hospice providers (n = 24). Data indicated that sharing physical, psychosocial, and spiritual information is intended to optimize hospice teams' ability to deliver whole-person care that is aligned with patient and family goals and that satisfies regulatory requirements. Information sharing is a key function of hospice teams in IDG meetings. Informatics tools may optimize IDG meeting efficiency by succinctly presenting well-organized and required information that is relevant to all team members. Such tools should highlight patient and family goals and ensure that teams are able to satisfy regulatory requirements.

  20. Automated observation scheduling for the VLT

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    It is becoming increasingly evident that, in order to optimize the observing efficiency of large telescopes, some changes will be required in the way observations are planned and executed. Not all observing programs require the presence of the astronomer at the telescope: for those programs which permit service observing, it is possible to better match planned observations to conditions at the telescope. This concept of flexible scheduling has been proposed for the VLT: based on current and predicted environmental and instrumental conditions, observations can be selected which make the most efficient possible use of valuable time. A similar kind of observation scheduling is already necessary for some space observatories, such as the Hubble Space Telescope (HST). The Space Telescope Science Institute is presently developing scheduling tools for HST based on the use of artificial intelligence software development techniques. These tools could be readily adapted for ground-based telescope scheduling since they address many of the same issues. The concepts on which the HST tools are based, their implementation, and what would be required to adapt them for use with the VLT and other ground-based observatories are described.

  1. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
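
    As a hedged sketch of the kind of Markov reliability model such tools would construct automatically, the Python below solves a two-unit parallel system with the matrix exponential; the failure rate and mission time are illustrative values, not taken from the paper.

      import numpy as np
      from scipy.linalg import expm

      # Hypothetical two-unit parallel system, states = (2 up, 1 up, 0 up),
      # per-unit failure rate lam [1/h], no repair.
      lam = 1.0e-4
      Q = np.array([[-2 * lam, 2 * lam, 0.0],
                    [0.0,      -lam,    lam],
                    [0.0,      0.0,     0.0]])   # CTMC generator (rows sum to 0)

      p0 = np.array([1.0, 0.0, 0.0])              # both units working at t = 0
      t = 1000.0                                  # mission time, hours
      p_t = p0 @ expm(Q * t)                      # state probabilities at time t
      print("P(system failed) =", p_t[2])         # probability both units have failed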

  2. GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis

    NASA Astrophysics Data System (ADS)

    Nass, Andrea; van Gasselt, Stephan

    2015-04-01

    Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications, dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such (basic) tools can be used to build complex analysis tools, e.g. in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or to store rover data, and geologic mapping data can also be stored and accessed more efficiently using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts that are of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on the extraction of landing site characteristics using established criteria. We provide working examples and particularly focus on the concept of terrain roughness as it is interpreted in geomorphology and engineering studies.
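
    As an illustrative sketch of one of the co-existing roughness definitions mentioned above (the moving-window standard deviation of elevation), the Python below computes roughness on an invented toy DEM; it is not the authors' GIS toolset.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def roughness_std(dem, window=5):
          """Terrain roughness as the moving-window standard deviation of elevation."""
          dem = dem.astype(float)
          mean = uniform_filter(dem, size=window)
          mean_sq = uniform_filter(dem ** 2, size=window)
          return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

      # Toy DEM: a smooth ramp plus random small-scale relief (metres).
      rng = np.random.default_rng(0)
      dem = np.linspace(0, 50, 100)[None, :] + rng.normal(0, 2, (100, 100))
      print("median roughness [m]:", np.median(roughness_std(dem, window=9)))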

  3. An Evaluation of the Predictive Validity of Confidence Ratings in Identifying Functional Behavioral Assessment Hypothesis Statements

    ERIC Educational Resources Information Center

    Borgmeier, Chris; Horner, Robert H.

    2006-01-01

    Faced with limited resources, schools require tools that increase the accuracy and efficiency of functional behavioral assessment. Yarbrough and Carr (2000) provided evidence that informant confidence ratings of the likelihood of problem behavior in specific situations offered a promising tool for predicting the accuracy of function-based…

  4. A Synthesized Coding Framework for Asynchronous Online Discussion Research

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa Michelle

    2014-01-01

    Online classroom discussion is ubiquitous in higher education today, with both online and hybrid courses. As a result, tools need to be created that enable an in-depth assessment of this medium, thereby facilitating the establishment and support of best practices in education. Such an assessment requires tools for consistent, efficient analysis of…

  5. Computer tools for systems engineering at LaRC

    NASA Technical Reports Server (NTRS)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper reviews the current configuration of these tools and provides information on future milestones and directions.

  6. Progressive Damage and Failure Analysis of Composite Laminates

    NASA Astrophysics Data System (ADS)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance and material property tailorability. To fully exploit the capability of composites, it is necessary to know the load carrying capacity of the parts made from them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them a hard problem to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon and component level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon level tests to fully characterize their behavior, which makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools which can predict the complex failure mechanisms accurately; this reduces the cost to the associated computational expenses, making for significant savings. Some of the most desired features in a virtual testing tool are: (1) accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must be the same as that observed in experiments, and a tool has to be assessed based on the mechanisms it can capture; (2) computational efficiency: the greatest advantages of a virtual tool are the savings in time and money, hence computational efficiency is one of the most needed features; (3) applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic and fatigue conditions, and a good virtual testing tool should be able to make good predictions for all of them. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis tool is validated by comparing simulations against experiments for a selected number of quasi-static loading cases.

  7. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in the dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time, memory requirements as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
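
    For illustration of the kind of region-set operation described above (overlap between two sets of genomic intervals), the Python sketch below uses a linear sweep over chromosome-sorted, non-overlapping regions; it is a simplified stand-in, not the GenomicTools C++ API or command-line interface, and the example regions are invented.

      def overlapping_pairs(a, b):
          """Report all overlapping pairs between two region lists (chrom, start, end).

          Assumes each list is sorted by (chrom, start) and contains no internally
          overlapping regions (e.g. already merged)."""
          i = j = 0
          out = []
          while i < len(a) and j < len(b):
              ca, sa, ea = a[i]
              cb, sb, eb = b[j]
              if ca != cb:                  # advance whichever chromosome sorts first
                  if ca < cb:
                      i += 1
                  else:
                      j += 1
              elif ea <= sb:                # a[i] lies entirely before b[j]
                  i += 1
              elif eb <= sa:                # b[j] lies entirely before a[i]
                  j += 1
              else:                         # the intervals overlap
                  out.append((a[i], b[j]))
                  if ea < eb:               # advance the interval that ends first
                      i += 1
                  else:
                      j += 1
          return out

      peaks = [("chr1", 100, 200), ("chr1", 500, 650), ("chr2", 10, 60)]
      genes = [("chr1", 150, 400), ("chr2", 5, 20), ("chr2", 100, 200)]
      print(overlapping_pairs(peaks, genes))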

  8. Re-tooling critical care to become a better intensivist: something old and something new.

    PubMed

    Marini, John J

    2015-01-01

    Developments in recent years have placed powerful new tools of diagnosis, therapy, and communication at the disposal of medicine in general, and of critical care in particular. The art of healing requires not only technical proficiency, but also personal connection, multidisciplinary teamwork, and commitment to the venerable traditions of our profession. The latter often seem to be under assault by today's high-pressure, high-efficiency, and increasingly business-driven hospital environments. Re-tooling critical care for the future generations of caregivers requires something old--empathetic connection--as well as the exciting newer technologies of our science and practice.

  9. Re-tooling critical care to become a better intensivist: something old and something new

    PubMed Central

    2015-01-01

    Developments in recent years have placed powerful new tools of diagnosis, therapy, and communication at the disposal of medicine in general, and of critical care in particular. The art of healing requires not only technical proficiency, but also personal connection, multidisciplinary teamwork, and commitment to the venerable traditions of our profession. The latter often seem to be under assault by today's high-pressure, high-efficiency, and increasingly business-driven hospital environments. Re-tooling critical care for the future generations of caregivers requires something old--empathetic connection--as well as the exciting newer technologies of our science and practice. PMID:26728560

  10. Reflective Writing for a Better Understanding of Scientific Concepts in High School

    ERIC Educational Resources Information Center

    El-Helou, Joseph; Kalman, Calvin S.

    2018-01-01

    Science teachers can always benefit from efficient tools that help students to engage with the subject and understand it better without significantly adding to the teacher's workload nor requiring too much of class time to manage. Reflective writing is such a low-impact, high-return tool. What follows is an introduction to reflective writing, and…

  11. English Digital Dictionaries as Valuable Blended Learning Tools for Palestinian College Students

    ERIC Educational Resources Information Center

    Dwaik, Raghad A. A.

    2015-01-01

    Digital technology has become an indispensable aspect of foreign language learning around the globe especially in the case of college students who are often required to finish extensive reading assignments within a limited time period. Such pressure calls for the use of efficient tools such as digital dictionaries to help them achieve their…

  12. NEUROSCIENCE. Natural light-gated anion channels: A family of microbial rhodopsins for advanced optogenetics.

    PubMed

    Govorunova, Elena G; Sineshchekov, Oleg A; Janz, Roger; Liu, Xiaoqin; Spudich, John L

    2015-08-07

    Light-gated rhodopsin cation channels from chlorophyte algae have transformed neuroscience research through their use as membrane-depolarizing optogenetic tools for targeted photoactivation of neuron firing. Photosuppression of neuronal action potentials has been limited by the lack of equally efficient tools for membrane hyperpolarization. We describe anion channel rhodopsins (ACRs), a family of light-gated anion channels from cryptophyte algae that provide highly sensitive and efficient membrane hyperpolarization and neuronal silencing through light-gated chloride conduction. ACRs strictly conducted anions, completely excluding protons and larger cations, and hyperpolarized the membrane of cultured animal cells with much faster kinetics at less than one-thousandth of the light intensity required by the most efficient currently available optogenetic proteins. Natural ACRs provide optogenetic inhibition tools with unprecedented light sensitivity and temporal precision. Copyright © 2015, American Association for the Advancement of Science.

  13. Machine tools and fixtures: A compilation

    NASA Technical Reports Server (NTRS)

    1971-01-01

    As part of NASA's Technology Utilization Program, a compilation was made of technological developments regarding machine tools, jigs, and fixtures that have been produced, modified, or adapted to meet requirements of the aerospace program. The compilation is divided into three sections that include: (1) a variety of machine tool applications that offer easier and more efficient production techniques; (2) methods, techniques, and hardware that aid in the setup, alignment, and control of machines and machine tools to further quality assurance in finished products; and (3) jigs, fixtures, and adapters that are ancillary to basic machine tools and aid in realizing their greatest potential.

  14. Front panel engineering with CAD simulation tool

    NASA Astrophysics Data System (ADS)

    Delacour, Jacques; Ungar, Serge; Mathieu, Gilles; Hasna, Guenther; Martinez, Pascal; Roche, Jean-Christophe

    1999-04-01

    The progress made recently in display technology covers many fields of application. The specification of radiance, colorimetry and lighting efficiency creates some new challenges for designers. Photometric design is limited by the capability of correctly predicting the result of a lighting system, so as to save on the costs and time taken to build multiple prototypes or breadboard benches. The second step of the research carried out by the company OPTIS is to propose an optimization method to be applied to the lighting system, developed in the software SPEOS. The main features the tool requires include a CAD interface, to enable fast and efficient transfer between mechanical and light design software, source modeling, a light transfer model and an optimization tool. The CAD interface is mainly a prototype of transfer, which is not the subject here. Photometric simulation is efficiently achieved by using the measured source encoding and a simulation by the Monte Carlo method. Today, the advantages and the limitations of the Monte Carlo method are well known: noise reduction requires a long calculation time, which increases with the complexity of the display panel. A successful optimization is difficult to achieve, due to the long calculation time required for each optimization pass including a Monte Carlo simulation. The problem was initially defined as an engineering method of study. Experience shows that good understanding and mastering of the phenomenon of light transfer is limited by the complexity of non-sequential propagation; the engineer must call for the help of a simulation and optimization tool. The main point needed to perform an efficient optimization is a quick method for simulating light transfer. Much work has been done in this area and some interesting results can be observed. It must be said that the Monte Carlo method wastes time calculating results and information which are not required for the needs of the simulation, and low-efficiency transfer systems cost a great deal of calculation time. More generally, light transfer simulation can be treated efficiently when the integrated result is composed of elementary sub-results that involve quickly calculated analytical intersections. Two axes of research thus appear: quick integration, and quick calculation of geometric intersections. The first axis brings some general solutions that are also valid for multi-reflection systems. The second axis requires some deep thinking on the intersection calculation; an interesting approach is the subdivision of space into voxels, an adapted method of 3D division of space according to the objects and their locations. An experimental software package has been developed to provide a validation of the method. The gain is particularly high in complex systems. An important reduction in the calculation time has been achieved.
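
    A toy Python sketch of the Monte Carlo noise issue raised above: for a low-efficiency transfer path the statistical error of the estimated flux fraction falls only as 1/sqrt(N), so halving the noise quadruples the ray count. The hit probability below is made up and is not a SPEOS result.

      import math
      import random

      def mc_transfer_efficiency(n_rays, hit_prob=0.02, seed=1):
          """Estimate the fraction of source rays reaching a small detector."""
          rng = random.Random(seed)
          hits = sum(1 for _ in range(n_rays) if rng.random() < hit_prob)
          estimate = hits / n_rays
          std_error = math.sqrt(estimate * (1 - estimate) / n_rays)
          return estimate, std_error

      for n in (1_000, 10_000, 100_000, 1_000_000):
          est, err = mc_transfer_efficiency(n)
          print(f"N={n:>9,d}  efficiency = {est:.4f} +/- {err:.4f}")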

  15. Evaluation of the efficiency and fault density of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while others allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, a comparison with in-house, manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed; in-house verification is therefore warranted.

  16. A user's guide to coping with estuarine management bureaucracy: An Estuarine Planning Support System (EPSS) tool.

    PubMed

    Lonsdale, Jemma; Nicholson, Rose; Weston, Keith; Elliott, Michael; Birchenough, Andrew; Sühring, Roxana

    2018-02-01

    Estuaries are amongst the most socio-economically and ecologically important environments; however, due to competing and conflicting demands, their management is often challenging, with a complex legislative framework managed by multiple agencies. To facilitate the understanding of this legislative framework, we have developed a GIS-based Estuarine Planning Support System tool. The tool integrates the requirements of the relevant legislation and provides a basis for assessing the current environmental state of an estuary as well as informing and assessing new plans to ensure a healthy estuarine state. The tool ensures that the information is easily accessible for regulators, managers, developers and the public. The tool is intended to be adaptable, but is assessed using the Humber Estuary, United Kingdom, as a case study area. The successful application of the tool to complex socio-economic and environmental systems demonstrates that it can efficiently guide users through the complex requirements needed to support sustainable development. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  17. MIMO: an efficient tool for molecular interaction maps overlap

    PubMed Central

    2013-01-01

    Background: Molecular pathways represent an ensemble of interactions occurring among molecules within the cell and between cells. The identification of similarities between molecular pathways across organisms and functions has a critical role in understanding complex biological processes. For the inference of such novel information, the comparison of molecular pathways requires accounting for imperfect matches (flexibility) and efficiently handling complex network topologies. To date, these characteristics are only partially available in tools designed to compare molecular interaction maps. Results: Our approach, MIMO (Molecular Interaction Maps Overlap), addresses the first problem by allowing the introduction of gaps and mismatches between query and template pathways and permits, when necessary, supervised queries incorporating a priori biological information. It then addresses the second issue by relying directly on the rich graph topology described in the Systems Biology Markup Language (SBML) standard, and uses multidigraphs to efficiently handle multiple queries on biological graph databases. The algorithm has been successfully used here to highlight the contact points between various human pathways in the Reactome database. Conclusions: MIMO offers a flexible and efficient graph-matching tool for comparing complex biological pathways. PMID:23672344
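
    For orientation only, the Python sketch below performs rigid (exact) subgraph matching of a toy pathway against a larger network with networkx; unlike MIMO it tolerates no gaps or mismatches and matches topology only, ignoring molecule identity. The pathway edges are invented.

      import networkx as nx
      from networkx.algorithms import isomorphism

      # Toy template pathway and a larger query network.
      template = nx.DiGraph([("EGF", "EGFR"), ("EGFR", "RAS"), ("RAS", "ERK")])
      network = nx.DiGraph([("EGF", "EGFR"), ("EGFR", "RAS"), ("RAS", "ERK"),
                            ("ERK", "MYC"), ("TNF", "TNFR")])

      # VF2 induced-subgraph matching: purely structural, no gaps or mismatches.
      matcher = isomorphism.DiGraphMatcher(network, template)
      print("template embedded in network:", matcher.subgraph_is_isomorphic())
      for mapping in matcher.subgraph_isomorphisms_iter():
          print("one node mapping:", mapping)
          break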

  18. Promoting a Culture of Tailoring for Systems Engineering Policy Expectations

    NASA Technical Reports Server (NTRS)

    Blankenship, Van A.

    2016-01-01

    NASA's Marshall Space Flight Center (MSFC) has developed an integrated systems engineering approach to promote a culture of tailoring for program and project policy requirements. MSFC's culture encourages and supports tailoring, with an emphasis on risk-based decision making, for enhanced affordability and efficiency. MSFC's policy structure integrates the various Agency requirements into a single, streamlined implementation approach which serves as a "one-stop-shop" for our programs and projects to follow. The engineers gain an enhanced understanding of policy and technical expectations, as well as lessons learned from MSFC's history of spaceflight and science missions, to enable them to make appropriate, risk-based tailoring recommendations. The tailoring approach utilizes a standard methodology to classify projects into predefined levels using selected mission and programmatic scaling factors related to risk tolerance. Policy requirements are then selectively applied and tailored, with appropriate rationale, and approved by the governing authorities, to support risk-informed decisions to achieve the desired cost and schedule efficiencies. The policy is further augmented by implementation tools and lifecycle planning aids which help promote and support the cultural shift toward more tailoring. The MSFC Customization Tool is an integrated spreadsheet that ties together everything that projects need to understand, navigate, and tailor the policy. It helps them classify their project, understand the intent of the requirements, determine their tailoring approach, and document the necessary governance approvals. It also helps them plan for and conduct technical reviews throughout the lifecycle. Policy tailoring is thus established as a normal part of project execution, with the tools provided to facilitate and enable the tailoring process. MSFC's approach to changing the culture emphasizes risk-based tailoring of policy to achieve increased flexibility, efficiency, and effectiveness in project execution, while maintaining appropriate rigor to ensure mission success.

  19. Marshall Space Flight Center's Virtual Reality Applications Program 1993

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1993-01-01

    A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process where perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.

  20. Precision, accuracy, and efficiency of four tools for measuring soil bulk density or strength.

    Treesearch

    Richard E. Miller; John Hazard; Steven Howes

    2001-01-01

    Monitoring soil compaction is time consuming. A desire for speed and lower costs, however, must be balanced with the appropriate precision and accuracy required of the monitoring task. We compared three core samplers and a cone penetrometer for measuring soil compaction after clearcut harvest on a stone-free and a stony soil. Precision (i.e., consistency) of each tool...

  1. Oceanographic satellite remote sensing: Registration, rectification, and data integration requirements

    NASA Technical Reports Server (NTRS)

    Nichols, D. A.

    1982-01-01

    The problem of data integration in oceanography is discussed. Recommendations are made for technique development and evaluation, understanding requirements, and packaging techniques for speed, efficiency and ease of use. The primary satellite sensors of interest to oceanography are summarized. It is concluded that imaging type sensors make image processing an important tool for oceanographic studies.

  2. Effective gene delivery to Trypanosoma cruzi epimastigotes through nucleofection.

    PubMed

    Pacheco-Lugo, Lisandro; Díaz-Olmos, Yirys; Sáenz-García, José; Probst, Christian Macagnan; DaRocha, Wanderson Duarte

    2017-06-01

    New opportunities to study gene function in Trypanosoma cruzi have arisen since its genome was sequenced in 2005. Functional genomic approaches in Trypanosoma cruzi are challenging due to the reduced set of tools available for genetic manipulation, as well as the low efficiency of transient transfection conducted through conventional methods. The Amaxa nucleofector device was systematically tested in the present study in order to improve the electroporation conditions for the epimastigote forms of T. cruzi. The transfection efficiency was quantified using the green fluorescent protein (GFP) as a reporter gene, followed by cell survival assessment. The nucleofection parameters used here increased the survival rates (>90%) and the transfection efficiency by approximately 35%. The small amount of epimastigotes and DNA required for nucleofection can turn the method adopted here into an attractive tool for high throughput screening (HTS) applications, and for gene editing in parasites where genetic manipulation tools remain relatively scarce. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Exploration Medical System Trade Study Tools Overview

    NASA Technical Reports Server (NTRS)

    Mindock, J.; Myers, J.; Latorella, K.; Cerro, J.; Hanson, A.; Hailey, M.; Middour, C.

    2018-01-01

    ExMC is creating an ecosystem of tools to enable well-informed medical system trade studies. The suite of tools addresses important system implementation aspects of the space medical capabilities trade space and is being built using knowledge from the medical community regarding the unique aspects of space flight. Two integrating models, a systems engineering model and a medical risk analysis model, tie the tools together to produce an integrated assessment of the medical system and its ability to achieve medical system target requirements. This presentation will provide an overview of the various tools that are part of the tool ecosystem. The presentation will first address the tools that supply the foundational information to the ecosystem. Specifically, the talk will describe how information on how medicine will be practiced is captured and categorized for efficient utilization in the tool suite. For example, this includes capturing what conditions will be planned for in-mission treatment, planned medical activities (e.g., a periodic physical exam), required medical capabilities (e.g., provide imaging), and options to implement the capabilities (e.g., an ultrasound device). Database storage and configuration management will also be discussed. The presentation will include an overview of how these information tools will be tied to parameters in a Systems Modeling Language (SysML) model, allowing traceability to system behavioral, structural, and requirements content. The discussion will also describe an HRP-led enhanced risk assessment model developed to provide quantitative insight into each capability's contribution to mission success. Key outputs from these various tools, to be shared with the space medical and exploration mission development communities, will be assessments of how well medical system implementation options satisfy requirements and of per-capability contributions toward achieving those requirements.

  4. Managing Information On Technical Requirements

    NASA Technical Reports Server (NTRS)

    Mauldin, Lemuel E., III; Hammond, Dana P.

    1993-01-01

    The Technical Requirements Analysis and Control Systems/Initial Operating Capability (TRACS/IOC) computer program provides supplemental software tools for the analysis, control, and interchange of project requirements so that qualified project members have access to pertinent project information, even if they are in different locations. It enables users to analyze and control requirements, serves as a focal point for project requirements, and forms an integrated system supporting efficient and consistent operations. TRACS/IOC is a HyperCard stack for use on Macintosh computers running HyperCard 1.2 or later and Oracle 1.2 or later.

  5. C 3, A Command-line Catalog Cross-match Tool for Large Astrophysical Catalogs

    NASA Astrophysics Data System (ADS)

    Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio

    2017-02-01

    Modern astrophysics is based on multi-wavelength data organized into large and heterogeneous catalogs. Hence, the need for efficient, reliable and scalable catalog cross-matching methods plays a crucial role in the era of the petabyte scale. Furthermore, multi-band data often have very different angular resolutions, requiring the highest generality of cross-matching features, mainly in terms of region shape and resolution. In this work we present C 3 (Command-line Catalog Cross-match), a multi-platform application designed to efficiently cross-match massive catalogs. It is based on a multi-core parallel processing paradigm and is conceived to be executed as a stand-alone command-line process or integrated within any generic data reduction/analysis pipeline, providing maximum flexibility to the end user in terms of portability, parameter configuration, catalog formats, angular resolution, region shapes, coordinate units and cross-matching types. Using real data extracted from public surveys, we discuss the cross-matching capabilities and computing time efficiency, also through a direct comparison with some publicly available tools, chosen among the most used within the community and representative of different interface paradigms. We verified that the C 3 tool has excellent capabilities for performing an efficient and reliable cross-match between large data sets. Although the elliptical cross-match and the parametric handling of angular orientation and offset are known concepts in the astrophysical context, their availability in the presented command-line tool makes C 3 competitive in the context of public astronomical tools.
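
    As a minimal illustration of positional cross-matching, the Python sketch below does a brute-force nearest-neighbour match within a fixed angular radius using the haversine separation; C 3 itself adds elliptical regions, unit handling and multi-core partitioning. The toy coordinates are invented.

      import numpy as np

      def angular_sep_deg(ra1, dec1, ra2, dec2):
          """Great-circle separation in degrees (haversine formula)."""
          ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
          a = (np.sin((dec2 - dec1) / 2) ** 2
               + np.cos(dec1) * np.cos(dec2) * np.sin((ra2 - ra1) / 2) ** 2)
          return np.degrees(2 * np.arcsin(np.sqrt(a)))

      def crossmatch(cat1, cat2, radius_arcsec=1.0):
          """For each (ra, dec) in cat1, return the index of the nearest cat2
          source within the matching radius, or None if there is no counterpart."""
          matches = []
          for ra, dec in cat1:
              sep = angular_sep_deg(ra, dec, cat2[:, 0], cat2[:, 1])
              k = int(np.argmin(sep))
              matches.append(k if sep[k] * 3600.0 <= radius_arcsec else None)
          return matches

      cat1 = np.array([[150.1000, 2.2000], [150.2000, 2.3000]])
      cat2 = np.array([[150.10010, 2.20001], [151.0000, 2.5000]])
      print(crossmatch(cat1, cat2, radius_arcsec=2.0))   # -> [0, None]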

  6. Adapting Pharmacoeconomics to Shape Efficient Health Systems en Route to UHC - Lessons from Two Continents.

    PubMed

    Miot, Jacqui; Thiede, Michael

    2017-01-01

    Background: Pharmacoeconomics is receiving increasing attention globally as a set of tools ensuring efficient use of resources in health systems, albeit with different applications depending on the contextual, cultural and development stages of each country. The factors guiding design, implementation and optimisation of pharmacoeconomics as a steering tool under the universal health coverage paradigm are explored using case studies of Germany and South Africa. Findings: German social health insurance is subject to the efficiency precept. Pharmaco-regulatory tools reflect the respective framework conditions under which they developed at particular points in time. The institutionalization and integration of pharmacoeconomics into the remit of the Institute for Quality and Efficiency in Health Care occurred only rather recently. The road has not been smooth, requiring political discourse and complex processes of negotiation. Although enshrined in the National Drug Policy, South Africa has had a more fragmented approach to medicine selection and pricing with different policies in private and public sectors. The regulatory reform for use of pharmacoeconomic tools is ongoing and will be further shaped by the introduction of National Health Insurance. Conclusion: A clear vision or framework is essential as the regulatory introduction of pharmacoeconomics is not a single event but rather a growing momentum. The path will always be subject to influences of politics, economics and market forces beyond the healthcare system so delays and modifications to pharmacoeconomic tools are to be expected. Health systems are dynamic and pharmacoeconomic reforms need to be sufficiently flexible to evolve alongside.

  7. XML-based data model and architecture for a knowledge-based grid-enabled problem-solving environment for high-throughput biological imaging.

    PubMed

    Ahmed, Wamiq M; Lenz, Dominik; Liu, Jia; Paul Robinson, J; Ghafoor, Arif

    2008-03-01

    High-throughput biological imaging uses automated imaging devices to collect a large number of microscopic images for analysis of biological systems and validation of scientific hypotheses. Efficient manipulation of these datasets for knowledge discovery requires high-performance computational resources, efficient storage, and automated tools for extracting and sharing such knowledge among different research sites. Newly emerging grid technologies provide powerful means for exploiting the full potential of these imaging techniques. Efficient utilization of grid resources requires the development of knowledge-based tools and services that combine domain knowledge with analysis algorithms. In this paper, we first investigate how grid infrastructure can facilitate high-throughput biological imaging research, and present an architecture for providing knowledge-based grid services for this field. We identify two levels of knowledge-based services. The first level provides tools for extracting spatiotemporal knowledge from image sets and the second level provides high-level knowledge management and reasoning services. We then present cellular imaging markup language, an extensible markup language-based language for modeling of biological images and representation of spatiotemporal knowledge. This scheme can be used for spatiotemporal event composition, matching, and automated knowledge extraction and representation for large biological imaging datasets. We demonstrate the expressive power of this formalism by means of different examples and extensive experimental results.

  8. Design and Analysis of Turbines for Space Applications

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.

  9. High-Speed TCP Testing

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Gassman, Holly; Beering, Dave R.; Welch, Arun; Hoder, Douglas J.; Ivancic, William D.

    1999-01-01

    Transmission Control Protocol (TCP) is the underlying protocol used within the Internet for reliable information transfer. As such, there is great interest in having all implementations of TCP interoperate efficiently. This is particularly important for links exhibiting large bandwidth-delay products. The tools exist to perform TCP analysis at low rates and low delays. However, for extremely high-rate and long-delay links such as 622 Mbps over geosynchronous satellites, new tools and testing techniques are required. This paper describes the tools and techniques used to analyze and debug various TCP implementations over high-speed, long-delay links.
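
    For context on why a 622 Mbps geosynchronous link is demanding for TCP, the bandwidth-delay product sets how much unacknowledged data must be kept in flight. The sketch below computes it under an assumed round-trip time of roughly 0.55 s, a typical figure for a geosynchronous hop rather than a value from the paper.

        # Back-of-the-envelope bandwidth-delay product (BDP) for a GEO satellite link.
        # The 0.55 s round-trip time is an assumed typical value, not from the paper.
        link_rate_bps = 622e6        # OC-12 line rate, 622 Mbps
        rtt_s = 0.55                 # assumed GEO round-trip time

        bdp_bytes = link_rate_bps * rtt_s / 8
        print(f"BDP ~ {bdp_bytes / 1e6:.1f} MB of data in flight")

        # Classic TCP advertises at most a 64 KB window without the window scaling
        # option (RFC 1323), which severely limits throughput on such a link:
        max_window = 64 * 1024
        throughput_bps = max_window * 8 / rtt_s
        print(f"Throughput with a 64 KB window ~ {throughput_bps / 1e6:.2f} Mbps")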

  10. ADAM: analysis of discrete models of biological systems using computer algebra.

    PubMed

    Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard

    2011-07-20

    Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics.
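
    As a toy illustration of what identifying attractors means for a discrete model, the sketch below finds the fixed points of a small synchronous Boolean network by brute-force enumeration. The three-node network is invented for illustration; ADAM itself avoids exhaustive enumeration by converting the model into a polynomial dynamical system over a finite field and solving the resulting equations with computer algebra.

        # Toy example: fixed points (steady-state attractors) of a 3-node Boolean
        # network, found by brute force. The network is made up for illustration;
        # ADAM instead solves an equivalent system of polynomial equations.
        from itertools import product

        def update(state):
            a, b, c = state
            return (a and not c,   # a' = a AND NOT c
                    a or c,        # b' = a OR c
                    not b)         # c' = NOT b

        fixed_points = [s for s in product([False, True], repeat=3) if update(s) == s]
        print(fixed_points)        # states that map to themselves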

  11. The development of tool use: Planning for end-state comfort

    PubMed Central

    Comalli, David M.; Keen, Rachel; Abraham, Evelyn S.; Foo, Victoria J.; Lee, Mei-Hua; Adolph, Karen E.

    2016-01-01

    Some grips on the handle of a tool can be planned based on information directly available in the scene. Other grips, however, must be planned based on the final position of the hand. “End-state comfort” grips require an awkward or uncomfortable initial grip so as to later implement the action comfortably and efficiently. From a cognitive perspective, planning for end-state comfort requires a consistent representation of the entire action sequence, including the latter part, which is not based on information directly available in the scene. Many investigators have found that young children fail to demonstrate planning for end-state comfort and that adult-like performance does not appear until about 12 years of age. In two experiments, we used a hammering task that engaged children in a goal-directed action with multiple steps. We assessed end-state-comfort planning in novel ways by measuring children’s hand choice, grip choice, and tool implementation over multiple trials. The hammering task also uniquely allowed us to assess the efficiency of implementation. We replicated the previous developmental trend in 4-, 8-, and 12-year-old children with our novel task. Most important, our data revealed that 4-year-olds are in a transitional stage with several competing strategies exhibited during a single session. Preschoolers changed their grip within trials and across trials, indicating awareness of errors and a willingness to sacrifice speed for more efficient implementation. The end-state-comfort grip initially competes as one grip type among many, but gradually displaces all others. Children’s sensitivity to and drive for efficiency may motivate this change. PMID:27786531

  12. Enhancing population pharmacokinetic modeling efficiency and quality using an integrated workflow.

    PubMed

    Schmidt, Henning; Radivojevic, Andrijana

    2014-08-01

    Population pharmacokinetic (popPK) analyses are at the core of Pharmacometrics and need to be performed regularly. Although these analyses are relatively standard, a large variability can be observed in both the time (efficiency) and the way they are performed (quality). The main reasons for this variability include the level of experience of a modeler, personal preferences and tools. This paper aims to examine how the process of popPK model building can be supported in order to increase its efficiency and quality. The presented approach to the conduct of popPK analyses is centered around three key components: (1) identification of the most common and important popPK model features, (2) required information content and formatting of the data for modeling, and (3) methodology, workflow and workflow-supporting tools. This approach has been used in several popPK modeling projects and a documented example is provided in the supplementary material. Efficiency of model building is improved by avoiding repetitive coding and other labor-intensive tasks and by putting the emphasis on a fit-for-purpose model. Quality is improved by ensuring that the workflow and tools are in alignment with a popPK modeling guidance which is established within an organization. The main conclusion of this paper is that workflow-based approaches to popPK modeling are feasible and have significant potential to ameliorate its various aspects. However, the implementation of such an approach in a pharmacometric organization requires openness towards innovation and change, the key ingredient for the evolution of integrative and quantitative drug development in the pharmaceutical industry.

  13. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    NASA Technical Reports Server (NTRS)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

    Increasing complexity in modern systems as well as cost and schedule constraints require a new paradigm of system engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters), and requirements flowdown. A recent trend toward Model Based System Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability and system engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example that includes a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements and, finally, a weighted decision analysis to optimize system objectives.
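
    The weighted decision analysis mentioned at the end of the abstract can be illustrated with a simple weighted-sum score over candidate architectures. The criteria, weights, and scores below are hypothetical placeholders, not values from the paper.

        # Minimal weighted-sum decision analysis over candidate architectures.
        # Criteria, weights, and scores are hypothetical placeholders.
        weights = {"performance": 0.4, "cost": 0.35, "risk": 0.25}

        candidates = {
            "Architecture A": {"performance": 8, "cost": 6, "risk": 7},
            "Architecture B": {"performance": 7, "cost": 9, "risk": 6},
            "Architecture C": {"performance": 9, "cost": 5, "risk": 8},
        }

        def weighted_score(scores):
            return sum(weights[c] * scores[c] for c in weights)

        ranked = sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True)
        for name, scores in ranked:
            print(f"{name}: {weighted_score(scores):.2f}")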

  14. Ship Maintenance Processes with Collaborative Product Lifecycle Management and 3D Terrestrial Laser Scanning Tools: Reducing Costs and Increasing Productivity

    DTIC Science & Technology

    2011-09-20

    optimal portfolio point on the efficient frontier, for example, Portfolio B on the chart in Figure A1. Then, by subsequently changing some of the ... optimized portfolio controlling for risk using the IRM methodology and tool suite. Results indicate that both rapid and incremental implementation...Results of the KVA and SD scenario analysis provided the financial information required to forecast an optimized

  15. Reflective Writing for a Better Understanding of Scientific Concepts in High School

    NASA Astrophysics Data System (ADS)

    El-Helou, Joseph; Kalman, Calvin S.

    2018-02-01

    Science teachers can always benefit from efficient tools that help students to engage with the subject and understand it better without significantly adding to the teacher's workload nor requiring too much of class time to manage. Reflective writing is such a low-impact, high-return tool. What follows is an introduction to reflective writing, and more on its usefulness for teachers is given in the last part of this article.

  16. Impact of ICT on Performance of Construction Companies in Slovakia

    NASA Astrophysics Data System (ADS)

    Mesároš, Peter; Mandičák, Tomáš

    2017-10-01

    Information and communication technologies have become part of the management tools of modern companies. The construction industry and its participants face a serious requirement to process the huge amount of information on construction projects, including design, construction, time and cost parameters, economic efficiency and sustainability. To fulfil this requirement, companies have to use appropriate ICT tools. The aim of the paper is to examine the impact of ICT exploitation on the performance of construction companies. The impact of BIM tools, ERP systems and controlling systems on cost and profit indicators will be measured on a sample of 85 companies from the construction industry in Slovakia. Enterprise size, enterprise ownership and role in the construction process will be set as independent variables for the statistical analysis. The results will be considered for different groups of companies.

  17. On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.

    PubMed

    Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar

    2015-01-01

    Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. The problem of elective surgery cancellations in a North Norwegian University Hospital is addressed. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives from different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers to define the requirements for a robust and useful system.

  18. Options to improve energy efficiency for educational building

    NASA Astrophysics Data System (ADS)

    Jahan, Mafruha

    The cost of energy is a major factor that must be considered for educational facility budget planning purposes. The analysis of energy-related issues and options can be complex and requires significant time and detailed effort. One way to facilitate the inclusion of energy option planning in facility planning efforts is to utilize a tool that allows for quick appraisal of the facility energy profile. Once such an appraisal is accomplished, it is then possible to rank energy improvement options consistently with other facility needs and requirements. After an energy efficiency option has been determined to have meaningful value in comparison with other facility planning options, it is then possible to utilize the initial appraisal as the basis for an expanded consideration of additional facility and energy use detail using the same analytic system used for the initial appraisal. This thesis has developed a methodology and an associated analytic model to assist in these tasks and thereby improve the energy efficiency of educational facilities. A detailed energy efficiency and analysis tool is described that utilizes specific university building characteristics such as size, architecture, envelope, lighting, occupancy, and thermal design, which allows the annual energy consumption to be reduced. Improving the energy efficiency of various aspects of an educational building's energy performance can be complex and can require significant time and experience to make decisions. The approach developed in this thesis initially assesses the energy design for a university building. This initial appraisal is intended to assist administrators in assessing the potential value of energy efficiency options for their particular facility. Subsequently, this scoping design can then be extended as another stage of the model by local facility or planning personnel to add more details and engineering aspects to the initial screening model. This approach can assist university planning efforts to identify the most cost-effective combinations of energy efficiency strategies. The model analyzes and compares the payback periods of all proposed Energy Performance Measures (EPMs) to determine which has the greatest potential value.
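
    As a minimal sketch of the payback comparison described above, the following ranks a few hypothetical Energy Performance Measures by simple payback period (installed cost divided by annual savings); the costs and savings are invented examples, not results from the thesis.

        # Simple payback ranking of hypothetical Energy Performance Measures (EPMs).
        # Costs and annual savings are invented example figures.
        epms = {
            "LED lighting retrofit":          {"cost": 40_000, "annual_savings": 12_000},
            "Occupancy-based HVAC setback":   {"cost": 15_000, "annual_savings": 6_000},
            "Window film / envelope upgrade": {"cost": 90_000, "annual_savings": 9_000},
        }

        for name, epm in sorted(epms.items(), key=lambda kv: kv[1]["cost"] / kv[1]["annual_savings"]):
            payback_years = epm["cost"] / epm["annual_savings"]
            print(f"{name}: simple payback {payback_years:.1f} years")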

  19. Development of microsatellite markers in Parthenium ssp.

    USDA-ARS?s Scientific Manuscript database

    Molecular markers provide the most efficient means to study genetic diversity within and among species of a particular genus. In addition, molecular markers can facilitate breeding efforts by providing tools necessary to reduce the time required to obtain recombinant genotypes with improved agricu...

  20. Fermilab computing at the Intensity Frontier

    DOE PAGES

    Group, Craig; Fuess, S.; Gutsche, O.; ...

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. In addition, the experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  1. ISS Mini AERCam Radio Frequency (RF) Coverage Analysis Using iCAT Development Tool

    NASA Technical Reports Server (NTRS)

    Bolen, Steve; Vazquez, Luis; Sham, Catherine; Fredrickson, Steven; Fink, Patrick; Cox, Jan; Phan, Chau; Panneton, Robert

    2003-01-01

    The long-term goals of the National Aeronautics and Space Administration's (NASA's) Human Exploration and Development of Space (HEDS) enterprise may require the development of autonomous free-flier (FF) robotic devices to operate within the vicinity of low-Earth orbiting spacecraft to supplement human extravehicular activities (EVAs) in space. Future missions could require external visual inspection of the spacecraft that would be difficult, or dangerous, for humans to perform. Under some circumstances, it may be necessary to employ an un-tethered communications link between the FF and the users. The interactive coverage analysis tool (ICAT) is a software tool that has been developed to perform critical analysis of the communications link performance for an FF operating in the vicinity of the International Space Station (ISS) external environment. The tool allows users to interactively change multiple parameters of the communications link to efficiently perform systems engineering trades on network performance. These trades can be directly translated into design and requirements specifications. This tool significantly reduces the development time in determining a communications network topology by allowing multiple parameters to be changed, and the results of link coverage to be statistically characterized and plotted interactively.

  2. FIST at 5: Looking Back, Looking Ahead

    DTIC Science & Technology

    2011-05-01

    Innovative Problem Solving (TRIZ) is a master’s class in design, with a strong emphasis on simplicity and speed. Altshuller’s TRIZ contradiction matrix...and 40 principles are powerful, elegant, and efficient. They should be required reading across the acquisition community (learn more at triz ...shortcuts. As with any tool, expertise comes from practice. Truly mastering Agile, Lean, TRIZ, or MOSA requires concentrated study, experimentation, and

  3. Helping System Engineers Bridge the Peaks

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Tkachuk, Oksana; Person, Suzette; Biatek, Jason; Whalen, Michael W.; Castle, Joseph; Gundy-Burlet, Karen

    2014-01-01

    In our experience at NASA, system engineers generally follow the Twin Peaks approach when developing safety-critical systems. However, iterations between the peaks require considerable manual, and in some cases duplicate, effort. A significant part of the manual effort stems from the fact that requirements are written in English natural language rather than a formal notation. In this work, we propose an approach that enables system engineers to leverage formal requirements and automated test generation to streamline iterations, effectively "bridging the peaks". The key to the approach is a formal language notation that a) system engineers are comfortable with, b) is supported by a family of automated V&V tools, and c) is semantically rich enough to describe the requirements of interest. We believe the combination of formalizing requirements and providing tool support to automate the iterations will lead to a more efficient Twin Peaks implementation at NASA.

  4. Aerospace Power Systems Design and Analysis (APSDA) Tool

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1998-01-01

    The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool and its user interface operate on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.

  5. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which require complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments. This makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools, including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool, to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves the efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.

  6. Quality Service Analysis and Improvement of Pharmacy Unit of XYZ Hospital Using Value Stream Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Jonny; Nasution, Januar

    2013-06-01

    Value stream mapping is a tool that lets the business leader of XYZ Hospital see what is actually happening in the business process that has caused longer lead times for self-produced medicines in its pharmacy unit. This problem has triggered many complaints filed by patients. After deploying this tool, the team found that, in processing the medicine, the pharmacy unit does not have any storage or capsule-packing tools, and this condition has caused much wasted time in the process. Therefore, the team proposed that the business leader procure the required tools in order to shorten the process. This research resulted in a lead time shortened from 45 minutes to 30 minutes, as required by the government through the Indonesian health ministry, with the percentage of value-added activity, or Process Cycle Efficiency (PCE), increased from 66% to 68% (considered lean because it is above the required 30%). This result shows that process effectiveness has been increased by the improvement.
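
    Process Cycle Efficiency is the ratio of value-added time to total lead time. The sketch below shows the calculation in general form; the activity breakdown is a made-up example, not the hospital's actual process data.

        # Process Cycle Efficiency (PCE) = value-added time / total lead time.
        # The activity list is a hypothetical example, not the hospital's data.
        activities = [
            ("receive prescription",        2, False),
            ("wait for pharmacist",         8, False),
            ("compound medicine",          15, True),
            ("pack capsules",               5, True),
            ("final check and hand-over",   3, True),
        ]

        lead_time = sum(minutes for _, minutes, _ in activities)
        value_added = sum(minutes for _, minutes, is_va in activities if is_va)
        pce = value_added / lead_time
        print(f"Lead time: {lead_time} min, PCE: {pce:.0%}")  # lean threshold often cited as 30%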

  7. The impact of a novel resident leadership training curriculum.

    PubMed

    Awad, Samir S; Hayley, Barbara; Fagan, Shawn P; Berger, David H; Brunicardi, F Charles

    2004-11-01

    Today's complex health care environment coupled with the 80-hour workweek mandate has required that surgical resident team interactions evolve from a military command-and-control style to a collaborative leadership style. A novel educational curriculum was implemented with the objectives of training the residents to have the capacity to create and manage powerful teams through alignment, communication, and integrity, tools integral to practicing a collaborative leadership style while working 80 hours per week. Specific strategies were as follows: (1) to focus on quality of patient care and service while receiving a high education-to-service ratio, and (2) to maximize efficiency through time management. This article shows that leadership training as part of a resident curriculum can significantly increase a resident's view of leadership in the areas of alignment, communication, and integrity, tools previously shown in business models to be vital for effective and efficient teams. This curriculum, over the course of the surgical residency, can provide residents with the necessary tools to deliver efficient, quality care while working within the 80-hour workweek mandate in a more collaborative environment.

  8. Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs

    NASA Astrophysics Data System (ADS)

    Pianese, C.; Sorrentino, M.

    2009-08-01

    Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications, due to their high energy conversion efficiencies, modularity, high fuel flexibility, low emissions and noise. Moreover, the high working temperatures enable their use for efficient cogeneration applications. SOFCs are entering a pre-industrial era, and interest in design tools has grown strongly in recent years. Optimal system configuration, component sizing, control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computational time, limited experimental effort and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both single cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC-APUs destined for a wide application area, ranging from automotive to marine and airplane APUs.

  9. Evaluation of the sustainability of contrasted pig farming systems: economy.

    PubMed

    Ilari-Antoine, E; Bonneau, M; Klauke, T N; Gonzàlez, J; Dourmad, J Y; De Greef, K; Houwers, H W J; Fabrega, E; Zimmer, C; Hviid, M; Van der Oever, B; Edwards, S A

    2014-12-01

    The aim of this paper is to present an efficient tool for evaluating the economic component of the sustainability of pig farming systems. The selected tool, IDEA, was tested on a sample of farms from 15 contrasted systems in Europe. A statistical analysis was carried out to check the capacity of the indicators to illustrate the variability of the population and to analyze which of these indicators contributed the most towards it. The scores obtained for the farms were consistent with the reality of pig production; the distribution of the variables showed considerable variability across the sample. The principal component analysis and cluster analysis separated the sample into five subgroups, in which the six main indicators significantly differed, which underlines the robustness of the tool. The IDEA method was proven to be easily comprehensible, requiring few initial variables and offering an efficient benchmarking system; all six indicators contributed to fully describe a varied and contrasted population.

  10. Parallelization of ARC3D with Computer-Aided Tools

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; Hribar, Michelle; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    A series of efforts has been devoted to investigating methods of porting and parallelizing applications quickly and efficiently for new architectures, such as the SGI Origin 2000 and Cray T3E. This report presents the parallelization of a CFD application, ARC3D, using the computer-aided parallelization tool CAPTools. Steps of parallelizing this code and requirements for achieving better performance are discussed. The generated parallel version has achieved reasonably good performance, for example, a speedup of 30 on 36 Cray T3E processors. However, this performance could not be obtained without modification of the original serial code. It is suggested that in many cases improving the serial code and performing necessary code transformations are important parts of the automated parallelization process, although user intervention is still necessary in many of these steps. Nevertheless, development and improvement of useful software tools, such as CAPTools, can help trim down many tedious parallelization details and improve processing efficiency.

  11. G-CNV: A GPU-Based Tool for Preparing Data to Detect CNVs with Read-Depth Methods.

    PubMed

    Manconi, Andrea; Manca, Emanuele; Moscatelli, Marco; Gnocchi, Matteo; Orro, Alessandro; Armano, Giuliano; Milanesi, Luciano

    2015-01-01

    Copy number variations (CNVs) are the most prevalent types of structural variations (SVs) in the human genome and are involved in a wide range of common human diseases. Different computational methods have been devised to detect this type of SVs and to study how they are implicated in human diseases. Recently, computational methods based on high-throughput sequencing (HTS) are increasingly used. The majority of these methods focus on mapping short-read sequences generated from a donor against a reference genome to detect signatures distinctive of CNVs. In particular, read-depth based methods detect CNVs by analyzing genomic regions with significantly different read-depth from the other ones. The pipeline analysis of these methods consists of four main stages: (i) data preparation, (ii) data normalization, (iii) CNV regions identification, and (iv) copy number estimation. However, available tools do not support most of the operations required at the first two stages of this pipeline. Typically, they start the analysis by building the read-depth signal from pre-processed alignments. Therefore, third-party tools must be used to perform most of the preliminary operations required to build the read-depth signal. These data-intensive operations can be efficiently parallelized on graphics processing units (GPUs). In this article, we present G-CNV, a GPU-based tool devised to perform the common operations required at the first two stages of the analysis pipeline. G-CNV is able to filter low-quality read sequences, to mask low-quality nucleotides, to remove adapter sequences, to remove duplicated read sequences, to map the short-reads, to resolve multiple mapping ambiguities, to build the read-depth signal, and to normalize it. G-CNV can be efficiently used as a third-party tool able to prepare data for the subsequent read-depth signal generation and analysis. Moreover, it can also be integrated in CNV detection tools to generate read-depth signals.
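
    To make the notion of a read-depth signal concrete, the sketch below bins mapped read start positions into fixed-size windows and applies a simple median normalization. This is a generic illustration of the first two pipeline stages described above, not G-CNV's GPU implementation; the window size and read positions are arbitrary.

        # Generic illustration of building and normalizing a read-depth signal:
        # count mapped read starts per fixed-size window, then scale by the median.
        # This is not the G-CNV implementation; numbers are arbitrary.
        from collections import Counter

        window_size = 1_000  # bp per window (arbitrary choice)
        read_starts = [120, 450, 980, 1_020, 1_500, 1_760, 2_400, 2_450, 2_480, 3_900]

        counts = Counter(pos // window_size for pos in read_starts)
        n_windows = max(counts) + 1
        raw_depth = [counts.get(w, 0) for w in range(n_windows)]

        sorted_depth = sorted(raw_depth)
        median = sorted_depth[len(sorted_depth) // 2] or 1
        normalized = [d / median for d in raw_depth]

        print("raw:       ", raw_depth)        # e.g. [3, 3, 3, 1]
        print("normalized:", normalized)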

  12. Energy efficient engine high-pressure turbine single crystal vane and blade fabrication technology report

    NASA Technical Reports Server (NTRS)

    Giamei, A. F.; Salkeld, R. W.; Hayes, C. W.

    1981-01-01

    The objective of the High-Pressure Turbine Fabrication Program was to demonstrate the application and feasibility of Pratt & Whitney Aircraft-developed two-piece, single crystal casting and bonding technology on the turbine blade and vane configurations required for the high-pressure turbine in the Energy Efficient Engine. During the first phase of the program, casting feasibility was demonstrated. Several blade and vane halves were made for the bonding trials, plus solid blades and vanes were successfully cast for materials evaluation tests. Specimens exhibited the required microstructure and chemical composition. Bonding feasibility was demonstrated in the second phase of the effort. Bonding yields of 75 percent for the vane and 30 percent for the blade were achieved, and methods for improving these yield percentages were identified. A bond process was established for PWA 1480 single crystal material which incorporated a transient liquid phase interlayer. Bond properties were substantiated and sensitivities determined. Tooling die materials were identified, and an advanced differential thermal expansion tooling concept was incorporated into the bond process.

  13. Experimental Performance Evaluation of a Supersonic Turbine for Rocket Engine Applications

    NASA Technical Reports Server (NTRS)

    Snellgrove, Lauren M.; Griffin, Lisa W.; Sieja, James P.; Huber, Frank W.

    2003-01-01

    In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis and testing of the turbomachinery is necessary. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. These tools were applied to optimize a supersonic turbine design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain increased efficiency. The goal of the demonstration was to increase the total-to-static efficiency of the turbine by eight points over the baseline design. A sub-scale, cold flow test article modeling the final optimized turbine was designed, manufactured, and tested in air at MSFC's Turbine Airflow Facility. Extensive on- and off-design point performance data, steady-state data, and unsteady blade loading data were collected during testing.

  14. The manipulator tool state classification based on inertia forces analysis

    NASA Astrophysics Data System (ADS)

    Gierlak, Piotr

    2018-07-01

    In this article, we discuss the detection of damage to the cutting tool used in robotised light mechanical processing. Continuous monitoring of the state of the tool mounted in the tool holder of the robot is required in order to save time. The tool is a brush with ceramic fibres used for surface grinding. A typical example of damage to the brush is the breaking of fibres, resulting in a tool imbalance and vibrations at high rotational speed, e.g. during grinding. This also results in a limited operating surface of the tool and a decrease in the efficiency of processing. While an imbalanced tool is spinning, inertial (fictitious) forces occur that carry information about the balance of the tool. These forces can be measured using a force sensor located in the end-effector of the robot, allowing the damage to the brush to be assessed in an automated way, without operator involvement.
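
    The imbalance information carried by the rotating inertial force can be quantified: an unbalance mass m at radius r spinning at angular speed ω produces a force of magnitude m·r·ω², which shows up in the measured force signal at the rotation frequency. The sketch below computes that magnitude and recovers the amplitude from a synthetic force signal with an FFT; all numbers are illustrative assumptions, not measurements from the robot in the article.

        # Centrifugal force from an unbalance mass, and a simple way to read the
        # imbalance from a measured force signal: take the FFT amplitude at the
        # rotation frequency. All numbers are illustrative assumptions.
        import numpy as np

        m, r = 0.002, 0.05                       # 2 g unbalance at 50 mm radius (assumed)
        rpm = 6_000
        omega = 2 * np.pi * rpm / 60
        amp = m * r * omega**2
        print(f"Expected centrifugal force: {amp:.1f} N")

        # Synthetic force signal: imbalance component at the rotation frequency + noise.
        fs, T = 10_000, 1.0                      # sample rate [Hz], duration [s]
        t = np.arange(0, T, 1 / fs)
        f_rot = rpm / 60                         # 100 Hz rotation frequency
        signal = amp * np.sin(2 * np.pi * f_rot * t) + 0.5 * np.random.randn(t.size)

        spectrum = np.abs(np.fft.rfft(signal)) * 2 / t.size
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        print(f"Amplitude at rotation frequency: {spectrum[np.argmin(np.abs(freqs - f_rot))]:.1f} N")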

  15. BEopt-CA (Ex) -- A Tool for Optimal Integration of EE/DR/ES+PV in Existing California Homes. Cooperative Research and Development Final Report, CRADA Number CRD-11-429

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Craig

    Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.

  16. Development and implementation of clinical trial protocol templates at the National Institute of Allergy and Infectious Diseases.

    PubMed

    Bridge, Heather; Smolskis, Mary; Bianchine, Peter; Dixon, Dennis O; Kelly, Grace; Herpin, Betsey; Tavel, Jorge

    2009-08-01

    A clinical research protocol document must reflect both sound scientific rationale as well as local, national and, when applicable, international regulatory and human subject protections requirements. These requirements originate from a variety of sources, undergo frequent revision and are subject to interpretation. Tools to assist clinical investigators in the production of clinical protocols could facilitate navigating these requirements and ultimately increase the efficiency of clinical research. The National Institute of Allergy and Infectious Diseases (NIAID) developed templates for investigators to serve as the foundation for protocol development. These protocol templates are designed as tools to support investigators in developing clinical protocols. NIAID established a series of working groups to determine how to improve its capacity to conduct clinical research more efficiently and effectively. The Protocol Template Working Group was convened to determine what protocol templates currently existed within NIAID and whether standard NIAID protocol templates should be produced. After review and assessment of existing protocol documents and requirements, the group reached consensus about required and optional content, determined the format and identified methods for distribution as well as education of investigators in the use of these templates. The templates were approved by the NIAID Executive Committee in 2006 and posted as part of the NIAID Clinical Research Toolkit [1] website for broad access. These documents require scheduled revisions to stay current with regulatory and policy changes. The structure of any clinical protocol template, whether comprehensive or specific to a particular study phase, setting or design, affects how it is used by investigators. Each structure presents its own set of advantages and disadvantages. While useful, protocol templates are not stand-alone tools for creating an optimal protocol document, but must be complemented by institutional resources and support. Education and guidance of investigators in the appropriate use of templates is necessary to ensure a complete yet concise protocol document. Due to changing regulatory requirements, clinical protocol templates cannot become static, but require frequent revisions.

  17. Synthetic biology: tools to design microbes for the production of chemicals and fuels.

    PubMed

    Seo, Sang Woo; Yang, Jina; Min, Byung Eun; Jang, Sungho; Lim, Jae Hyung; Lim, Hyun Gyu; Kim, Seong Cheol; Kim, Se Yeon; Jeong, Jun Hong; Jung, Gyoo Yeol

    2013-11-01

    The engineering of biological systems to achieve specific purposes requires design tools that function in a predictable and quantitative manner. Recent advances in the field of synthetic biology, particularly in the programmable control of gene expression at multiple levels of regulation, have increased our ability to efficiently design and optimize biological systems to perform designed tasks. Furthermore, implementation of these designs in biological systems highlights the potential of using these tools to build microbial cell factories for the production of chemicals and fuels. In this paper, we review current developments in the design of tools for controlling gene expression at transcriptional, post-transcriptional and post-translational levels, and consider potential applications of these tools. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. CscoreTool: fast Hi-C compartment analysis at high resolution.

    PubMed

    Zheng, Xiaobin; Zheng, Yixian

    2018-05-01

    Genome-wide chromosome conformation capture (Hi-C) has revealed that the eukaryotic genome can be partitioned into A and B compartments that have distinctive chromatin and transcription features. The current Principal Component Analysis (PCA)-based method for A/B compartment prediction from Hi-C data requires substantial CPU time and memory. We report the development of a method, CscoreTool, which enables fast and memory-efficient determination of A/B compartments at high resolution even in datasets with low sequencing depth. https://github.com/scoutzxb/CscoreTool. xzheng@carnegiescience.edu. Supplementary data are available at Bioinformatics online.
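
    For reference, the conventional PCA-based procedure that CscoreTool is positioned against typically computes a correlation matrix of the normalized intra-chromosomal contact map and assigns A/B compartments from the sign of the leading eigenvector. The sketch below shows that standard approach on a toy matrix; it is not CscoreTool's algorithm, and in practice the sign-to-compartment orientation must be fixed with an external track such as gene density.

        # Standard PCA-style A/B compartment call on a toy normalized Hi-C matrix:
        # correlation matrix -> leading eigenvector -> compartment from its sign.
        # This illustrates the conventional approach, not CscoreTool's algorithm.
        import numpy as np

        # Toy observed/expected contact matrix with two alternating "compartments".
        oe = np.array([
            [1.8, 0.6, 1.7, 0.5],
            [0.6, 1.9, 0.7, 1.8],
            [1.7, 0.7, 1.6, 0.6],
            [0.5, 1.8, 0.6, 1.7],
        ])

        corr = np.corrcoef(oe)                      # Pearson correlation between bins
        eigvals, eigvecs = np.linalg.eigh(corr)     # symmetric matrix -> eigh
        pc1 = eigvecs[:, np.argmax(eigvals)]        # leading eigenvector
        compartments = np.where(pc1 >= 0, "A", "B") # sign splits the two compartments
        print(compartments)                         # orientation needs e.g. gene density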

  19. Strategies in Interventional Radiology: Formation of an Interdisciplinary Center of Vascular Anomalies - Chances and Challenges for Effective and Efficient Patient Management.

    PubMed

    Sadick, Maliha; Dally, Franz Josef; Schönberg, Stefan O; Stroszczynski, Christian; Wohlgemuth, Walter A

    2017-10-01

    Background  Radiology is an interdisciplinary field dedicated to the diagnosis and treatment of numerous diseases and is involved in the development of multimodal treatment concepts. Method  Interdisciplinary case management, a broad spectrum of diagnostic imaging facilities and dedicated endovascular radiological treatment options are valuable tools that allow radiology to set up an interdisciplinary center for vascular anomalies. Results  Image-based diagnosis combined with endovascular treatment options is an essential tool for the treatment of patients with highly complex vascular diseases. These vascular anomalies can affect numerous parts of the body so that a multidisciplinary treatment approach is required for optimal patient care. Conclusion  This paper discusses the possibilities and challenges regarding effective and efficient patient management in connection with the formation of an interdisciplinary center for vascular anomalies with strengthening of the clinical role of radiologists. Key points   · Vascular anomalies, which include vascular tumors and malformations, are complex to diagnose and treat.. · There are far more patients with vascular anomalies requiring therapy than interdisciplinary centers for vascular anomalies - there is currently a shortage of dedicated interdisciplinary centers for vascular anomalies in Germany that can provide dedicated care for affected patients.. · Radiology includes a broad spectrum of diagnostic and minimally invasive therapeutic tools which allow the formation of an interdisciplinary center for vascular anomalies for effective, efficient and comprehensive patient management.. Citation Format · Sadick M, Dally FJ, Schönberg SO et al. Strategies in Interventional Radiology: Formation of an Interdisciplinary Center of Vascular Anomalies - Chances and Challenges for Effective and Efficient Patient Management. Fortschr Röntgenstr 2017; 189: 957 - 966. © Georg Thieme Verlag KG Stuttgart · New York.

  20. Successes and Challenges of Incompressible Flow Simulation

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin

    2003-01-01

    During the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of CFD discipline. Even though incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become indispensable in fluid engineering for incompressible and low speed flow. This paper is intended to review some of the successes made possible by advances in computational technologies during the same period, and discuss some of the current challenges.

  1. Electronic Systems for Spacecraft Vehicles: Required EDA Tools

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic

    1999-01-01

    The continuous increase in complexity of electronic systems is making the design and manufacturing of such systems more challenging than ever before. As a result, designers are finding it impossible to design efficient systems without the use of sophisticated Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and lead to a correct-by-design methodology. This report identifies the EDA tools that would be needed to design, analyze, simulate, and evaluate electronic systems for spacecraft vehicles. In addition, the report presents recommendations to enhance the current JSC electronic design capabilities. This includes cost information and a discussion as to the impact, both positive and negative, of implementing the recommendations.

  2. Strategic Planning Tools for Large-Scale Technology-Based Assessments

    ERIC Educational Resources Information Center

    Koomen, Marten; Zoanetti, Nathan

    2018-01-01

    Education systems are increasingly being called upon to implement new technology-based assessment systems that generate efficiencies, better meet changing stakeholder expectations, or fulfil new assessment purposes. These assessment systems require coordinated organisational effort to implement and can be expensive in time, skill and other…

  3. A study with ESI PAM-STAMP® on the influence of tool deformation on final part quality during a forming process

    NASA Astrophysics Data System (ADS)

    Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David

    2018-05-01

    As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result the tool deformations can have an impact on the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of the tool deformations on the final part quality is generally neglected and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped on the die structure and a static analysis is performed to check the deflections of the tool. But this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently so that tool deformations can be included in the drawing simulation in real time without a large increase in simulation time compared to simulations with rigid tools. In this paper a study will be presented which demonstrates the effect of tool deformations on the final part quality.

  4. Development and Implementation of Clinical Trial Protocol Templates at the National Institute of Allergy and Infectious Diseases

    PubMed Central

    Bridge, Heather; Smolskis, Mary; Bianchine, Peter; Dixon, Dennis O.; Kelly, Grace; Herpin, Betsey; Tavel, Jorge

    2009-01-01

    Background: A clinical research protocol document must reflect both sound scientific rationale as well as local, national and, when applicable, international regulatory and human subject protections requirements. These requirements originate from a variety of sources, undergo frequent revision and are subject to interpretation. Tools to assist clinical investigators in the production of clinical protocols could facilitate navigating these requirements and ultimately increase the efficiency of clinical research. Purpose: The National Institute of Allergy and Infectious Diseases (NIAID) developed templates for investigators to serve as the foundation for protocol development. These protocol templates are designed as tools to support investigators in developing clinical protocols. Methods: NIAID established a series of working groups to determine how to improve its capacity to conduct clinical research more efficiently and effectively. The Protocol Template Working Group was convened to determine what protocol templates currently existed within NIAID and whether standard NIAID protocol templates should be produced. After review and assessment of existing protocol documents and requirements, the group reached consensus about required and optional content, determined the format and identified methods for distribution as well as education of investigators in the use of these templates. Results: The templates were approved by the NIAID Executive Committee in 2006 and posted as part of the NIAID Clinical Research Toolkit[1]website for broad access. These documents require scheduled revisions to stay current with regulatory and policy changes. Limitations: The structure of any clinical protocol template, whether comprehensive or specific to a particular study phase, setting or design, affects how it is used by investigators. Each structure presents its own set of advantages and disadvantages. While useful, protocol templates are not stand-alone tools for creating an optimal protocol document but must be complemented by institutional resources and support. Education and guidance of investigators in the appropriate use of templates is necessary to ensure a complete yet concise protocol document. Due to changing regulatory requirements, clinical protocol templates cannot become static but require frequent revisions. Conclusions: Standard protocol templates that meet applicable regulations can be important tools to assist investigators in the effective conduct of clinical research, but they require dedicated resources and ongoing input from key stakeholders. PMID:19625326

  5. Modeling Off-Nominal Recovery in NextGen Terminal-Area Operations

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.

    2011-01-01

    Robust schedule-based arrival management requires efficient recovery from off-nominal situations. This paper presents research on modeling off-nominal situations and plans for recovering from them using TRAC, a route/airspace design, fast-time simulation, and analysis tool for studying NextGen trajectory-based operations. The paper provides an overview of a schedule-based arrival-management concept and supporting controller tools, then describes TRAC implementations of methods for constructing off-nominal scenarios, generating trajectory options to meet scheduling constraints, and automatically producing recovery plans.

  6. An Efficient Reachability Analysis Algorithm

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh; Fijany, Amir

    2008-01-01

    A document discusses a new algorithm for generating higher-order dependencies for diagnostic and sensor placement analysis when a system is described with a causal modeling framework. This innovation will be used in diagnostic and sensor optimization and analysis tools. Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in-situ platforms. This algorithm will serve as a powerful tool for technologies that satisfy a key requirement of autonomous spacecraft, including science instruments and in-situ missions.

  7. Nose-to-tail analysis of an airbreathing hypersonic vehicle using an in-house simplified tool

    NASA Astrophysics Data System (ADS)

    Piscitelli, Filomena; Cutrone, Luigi; Pezzella, Giuseppe; Roncioni, Pietro; Marini, Marco

    2017-07-01

    SPREAD (Scramjet PREliminary Aerothermodynamic Design) is a simplified, in-house method developed by CIRA (Italian Aerospace Research Centre), able to provide a preliminary estimation of the performance of engine/aeroshape for airbreathing configurations. It is especially useful for scramjet engines, for which the strong coupling between the aerothermodynamic (external) and propulsive (internal) flow fields requires real-time screening of several engine/aeroshape configurations and the identification of the most promising one(s) with respect to user-defined constraints and requirements. The outcome of this tool defines the baseline configuration for further design analyses with more accurate tools, e.g., CFD simulations and wind tunnel testing. The SPREAD tool has been used to perform the nose-to-tail analysis of the LAPCAT-II Mach 8 MR2.4 vehicle configuration. The numerical results demonstrate SPREAD's capability to quickly predict reliable values of aero-propulsive balance (i.e., net-thrust) and aerodynamic efficiency in a pre-design phase.

  8. A study of diverse clinical decision support rule authoring environments and requirements for integration

    PubMed Central

    2012-01-01

    Background Efficient rule authoring tools are critical to allow clinical Knowledge Engineers (KEs), Software Engineers (SEs), and Subject Matter Experts (SMEs) to convert medical knowledge into machine executable clinical decision support rules. The goal of this analysis was to identify the critical success factors and challenges of a fully functioning Rule Authoring Environment (RAE) in order to define requirements for a scalable, comprehensive tool to manage enterprise level rules. Methods The authors evaluated RAEs in active use across Partners Healthcare, including enterprise wide, ambulatory only, and system specific tools, with a focus on rule editors for reminder and medication rules. We conducted meetings with users of these RAEs to discuss their general experience and perceived advantages and limitations of these tools. Results While the overall rule authoring process is similar across the 10 separate RAEs, the system capabilities and architecture vary widely. Most current RAEs limit the ability of the clinical decision support (CDS) interventions to be standardized, sharable, interoperable, and extensible. No existing system meets all requirements defined by knowledge management users. Conclusions A successful, scalable, integrated rule authoring environment will need to support a number of key requirements and functions in the areas of knowledge representation, metadata, terminology, authoring collaboration, user interface, integration with electronic health record (EHR) systems, testing, and reporting. PMID:23145874

  9. The microcomputer scientific software series 4: testing prediction accuracy.

    Treesearch

    H. Michael Rauscher

    1986-01-01

    A computer program, ATEST, is described in this combined user's guide and programmer's manual. ATEST provides users with an efficient and convenient tool for testing the accuracy of predictors. As input, ATEST requires observed-predicted data pairs. The output reports the two components of accuracy: bias and precision.
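
    The record does not give ATEST's formulas, but the two accuracy components it reports can be illustrated with a short sketch. The statistics chosen below (mean error for bias, standard deviation of the errors for precision, plus RMSE) and the function name are assumptions for illustration, not ATEST's actual implementation.

```python
# Sketch of the two accuracy components reported from observed-predicted pairs.
# Mean error (bias), error standard deviation (precision) and RMSE are assumed
# here for illustration; they are not necessarily the statistics ATEST computes.
import math

def accuracy_components(pairs):
    """pairs: iterable of (observed, predicted) tuples."""
    errors = [pred - obs for obs, pred in pairs]
    n = len(errors)
    bias = sum(errors) / n                                                 # systematic offset
    precision = math.sqrt(sum((e - bias) ** 2 for e in errors) / (n - 1))  # spread about the bias
    rmse = math.sqrt(sum(e ** 2 for e in errors) / n)                      # overall accuracy
    return bias, precision, rmse

if __name__ == "__main__":
    data = [(10.0, 10.4), (12.5, 12.1), (9.8, 10.1), (11.2, 11.6)]
    b, p, r = accuracy_components(data)
    print(f"bias={b:.3f}  precision={p:.3f}  rmse={r:.3f}")
```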

  10. Intranet 2.0 from a Project Management Perspective

    ERIC Educational Resources Information Center

    Sharpe, Paul A.; Vacek, Rachel E.

    2010-01-01

    Library intranets require flexibility and efficiency and enhance the internal communication and collaborative nature of creating and organizing the institution's information. At the University of Houston Libraries, the focus was on public services, so little attention was given to the intranet--the tool every department relied on for quick access…

  11. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    The coming deluge of genome data poses significant challenges for storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and enabling efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  12. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    The coming deluge of genome data poses significant challenges for storing and processing large-scale genome data, providing easy access to biomedical analysis tools, and enabling efficient data sharing and retrieval. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. OWLing Clinical Data Repositories With the Ontology Web Language

    PubMed Central

    Pastor, Xavier; Lozano, Esther

    2014-01-01

    Background The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. Objective The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. Methods We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Results Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All the information required to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. Conclusions OntoCRF is a complete framework for building data repositories with a solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems and does not require highly skilled technical staff, thus facilitating the engineering of clinical software systems. PMID:25599697

  14. OWLing Clinical Data Repositories With the Ontology Web Language.

    PubMed

    Lozano-Rubí, Raimundo; Pastor, Xavier; Lozano, Esther

    2014-08-01

    The health sciences are based upon information. Clinical information is usually stored and managed by physicians with precarious tools, such as spreadsheets. The biomedical domain is more complex than other domains that have adopted information and communication technologies as pervasive business tools. Moreover, medicine continuously changes its corpus of knowledge because of new discoveries and the rearrangements in the relationships among concepts. This scenario makes it especially difficult to offer good tools to answer the professional needs of researchers and constitutes a barrier that needs innovation to discover useful solutions. The objective was to design and implement a framework for the development of clinical data repositories, capable of facing the continuous change in the biomedicine domain and minimizing the technical knowledge required from final users. We combined knowledge management tools and methodologies with relational technology. We present an ontology-based approach that is flexible and efficient for dealing with complexity and change, integrated with a solid relational storage and a Web graphical user interface. Onto Clinical Research Forms (OntoCRF) is a framework for the definition, modeling, and instantiation of data repositories. It does not need any database design or programming. All the information required to define a new project is explicitly stated in ontologies. Moreover, the user interface is built automatically on the fly as Web pages, whereas data are stored in a generic repository. This allows for immediate deployment and population of the database as well as instant online availability of any modification. OntoCRF is a complete framework for building data repositories with a solid relational storage. Driven by ontologies, OntoCRF is more flexible and efficient in dealing with complexity and change than traditional systems and does not require highly skilled technical staff, thus facilitating the engineering of clinical software systems.

  15. Coupling the Multizone Airflow and Contaminant Transport Software CONTAM with EnergyPlus Using Co-Simulation.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-08-01

    Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On their own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent and vice versa. This paper describes the initial phase of coupling of CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation that allows for sharing of data between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification that provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify the co-simulation is functioning as expected, and an investigation of a two-zone, natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
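
    The co-simulation described here is, at its core, a repeated exchange of zone temperatures and airflows between two independently executing tools. The sketch below shows only that coupling pattern with a generic fixed-step loop and two placeholder models; it is not CONTAM, EnergyPlus, or the actual FMI-based implementation, and all functions and coefficients are invented for illustration.

```python
# Minimal sketch of the co-simulation data exchange the paper describes: at
# each step the airflow model receives zone temperatures and returns
# interzone/infiltration airflows, while the energy model does the reverse.
# Both "models" are stand-ins, not CONTAM or EnergyPlus, and this loop is a
# generic fixed-step master, not the FMI for Co-simulation interface.

def airflow_model_step(zone_temps):
    # placeholder: buoyancy-like flow proportional to the temperature difference
    return {"zone1->zone2": 0.05 * (zone_temps["zone1"] - zone_temps["zone2"])}

def energy_model_step(airflows, t):
    # placeholder: zone temperatures drift with the exchanged airflow
    dT = 0.1 * airflows["zone1->zone2"]
    return {"zone1": 22.0 - dT + 0.01 * (t / 3600.0), "zone2": 20.0 + dT}

def cosimulate(t_end=3600.0, dt=60.0):
    temps = {"zone1": 22.0, "zone2": 20.0}    # initial conditions
    flows = {"zone1->zone2": 0.0}
    t = 0.0
    while t < t_end:
        flows = airflow_model_step(temps)     # airflow side: temperatures in, airflows out
        temps = energy_model_step(flows, t)   # energy side: airflows in, temperatures out
        t += dt
    return temps, flows

if __name__ == "__main__":
    print(cosimulate())
```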

  16. Advanced control design for hybrid turboelectric vehicle

    NASA Technical Reports Server (NTRS)

    Abban, Joseph; Norvell, Johnesta; Momoh, James A.

    1995-01-01

    New environmental standards are both a challenge and an opportunity for the industry and government agencies that manufacture and operate urban mass transit vehicles. A research investigation to provide a control scheme for efficient power management of the vehicle is in progress. Different design requirements using functional analysis and trade studies of alternate power sources and controls have been performed. The design issues include portability, weight, and the emission/fuel efficiency of the induction motor, permanent magnet, and battery. A strategic design scheme to manage power requirements using advanced control systems is presented. It exploits fuzzy logic technology and a rule-based decision support scheme. The benefits of our study will enhance the economic and technical feasibility of the technology needed to provide a low-emission, fuel-efficient urban mass transit bus. The design team includes undergraduate researchers in our department. Sample results using the NASA HTEV simulation tool are presented.

  17. ADAM: Analysis of Discrete Models of Biological Systems Using Computer Algebra

    PubMed Central

    2011-01-01

    Background Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. Results We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Conclusions Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics. PMID:21774817
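
    The record describes converting discrete models into polynomial dynamical systems whose attractors satisfy a system of polynomial equations. The toy sketch below only illustrates that idea: a three-node Boolean network (invented for this example) is written as polynomials over GF(2) and its steady states are found by enumeration. ADAM itself solves such systems with computer algebra rather than brute force.

```python
# Toy illustration of the idea behind ADAM: a Boolean network expressed as a
# polynomial dynamical system over GF(2), whose fixed points (steady-state
# attractors) satisfy f_i(x) = x_i for every node i. The network below is
# invented, and the state space is small enough to enumerate directly; ADAM
# uses computer-algebra methods instead of enumeration.
from itertools import product

# Boolean operations as GF(2) polynomials:
#   x AND y = x*y,   x OR y = x + y + x*y,   NOT x = 1 + x   (all mod 2)
def f(x):
    x1, x2, x3 = x
    return (
        (x2 * x3) % 2,             # x1' = x2 AND x3
        (x1 + x3 + x1 * x3) % 2,   # x2' = x1 OR x3
        x2,                        # x3' = x2
    )

fixed_points = [s for s in product((0, 1), repeat=3) if f(s) == s]
print("steady states:", fixed_points)   # expected: (0, 0, 0) and (1, 1, 1)
```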

  18. Passive Bottom Loss Estimation Using Compact Arrays and Autonomous Underwater Vehicles

    DTIC Science & Technology

    2015-09-30

    advances in the technology of autonomous underwater vehicles (AUV) make it now possible to envision an efficient, cost-effective survey tool for seabed...characterization composed of a short array mounted on an AUV. While AUV mounting would require arrays of length presumably below 2 m, the passive...frequency range indicated above, the poor angular resolution of the short arrays required in AUV deployment causes an underestimation of the loss

  19. From the Paper to the Tablet: On the Design of an AR-Based Tool for the Inspection of Pre-Fab Buildings. Preliminary Results of the SIRAE Project.

    PubMed

    Portalés, Cristina; Casas, Sergio; Gimeno, Jesús; Fernández, Marcos; Poza, Montse

    2018-04-19

    Energy-efficient Buildings (EeB) are demanded in today’s constructions, fulfilling the requirements for green cities. Pre-fab buildings, which are modularly fully-built in factories, are a good example of this. Although this kind of building is quite new, the in situ inspection is documented using traditional tools, mainly based on paper annotations. Thus, the inspection process is not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project that aims to provide an Augmented Reality (AR) tool that can seamlessly aid in the regular processes of pre-fab building inspections to detect and eliminate the possible existing quality and energy efficiency deviations. In this regard, we show a description of the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data-gathering, allowing the inspectors to quickly manage it without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes.

  20. From the Paper to the Tablet: On the Design of an AR-Based Tool for the Inspection of Pre-Fab Buildings. Preliminary Results of the SIRAE Project

    PubMed Central

    Fernández, Marcos; Poza, Montse

    2018-01-01

    Energy-efficient Buildings (EeB) are demanded in today’s constructions, fulfilling the requirements for green cities. Pre-fab buildings, which are modularly fully-built in factories, are a good example of this. Although this kind of building is quite new, the in situ inspection is documented using traditional tools, mainly based on paper annotations. Thus, the inspection process is not taking advantage of new technologies. In this paper, we present the preliminary results of the SIRAE project that aims to provide an Augmented Reality (AR) tool that can seamlessly aid in the regular processes of pre-fab building inspections to detect and eliminate the possible existing quality and energy efficiency deviations. In this regard, we show a description of the current inspection process and how an interactive tool can be designed and adapted to it. Our first results show the design and implementation of our tool, which is highly interactive and involves AR visualizations and 3D data-gathering, allowing the inspectors to quickly manage it without altering the way the inspection process is done. First trials in a real environment show that the tool is promising for massive inspection processes. PMID:29671799

  1. Concurrent access to a virtual microscope using a web service oriented architecture

    NASA Astrophysics Data System (ADS)

    Corredor, Germán.; Iregui, Marcela; Arias, Viviana; Romero, Eduardo

    2013-11-01

    Virtual microscopy (VM) facilitates the visualization and deployment of histopathological virtual slides (VS), a useful tool for education, research and diagnosis. In recent years it has become popular, yet its use is still limited, mainly because of the very large size of VS, typically of the order of gigabytes. Such data volumes require effective and efficient strategies to access the VS content. In an educational or research scenario, several users may need to access and interact with VS at the same time, so, due to the large data size, a very expensive and powerful infrastructure is usually required. This article introduces a novel JPEG2000-based service oriented architecture for streaming and visualizing very large images under scalable strategies, which in addition does not require very specialized infrastructure. Results suggest that the proposed architecture enables transmission and simultaneous visualization of large images, while using resources efficiently and offering users adequate response times.
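
    The scalability the article relies on comes from serving only the requested region at the requested resolution rather than the whole gigapixel slide. The sketch below shows that concept with a plain NumPy resolution pyramid and a crop function; it is a stand-in for illustration only and does not reproduce the paper's JPEG2000-based, service-oriented architecture.

```python
# Concept sketch of scalable region access for a virtual slide: the server
# returns only the requested region at the requested resolution level rather
# than the full image. A NumPy pyramid stands in for the JPEG2000 codestream
# used in the actual architecture.
import numpy as np

def build_pyramid(slide, levels=4):
    """Downsample by 2 at each level (level 0 = full resolution)."""
    pyramid = [slide]
    for _ in range(1, levels):
        pyramid.append(pyramid[-1][::2, ::2])
    return pyramid

def get_region(pyramid, level, x, y, width, height):
    """Return a (height, width) crop from the chosen resolution level."""
    img = pyramid[level]
    return img[y:y + height, x:x + width]

if __name__ == "__main__":
    slide = np.random.randint(0, 255, size=(4096, 4096), dtype=np.uint8)
    pyr = build_pyramid(slide)
    tile = get_region(pyr, level=2, x=100, y=200, width=256, height=256)
    print(tile.shape)  # (256, 256)
```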

  2. Perspectives in astrophysical databases

    NASA Astrophysics Data System (ADS)

    Frailis, Marco; de Angelis, Alessandro; Roberto, Vito

    2004-07-01

    Astrophysics has become a domain extremely rich of scientific data. Data mining tools are needed for information extraction from such large data sets. This asks for an approach to data management emphasizing the efficiency and simplicity of data access; efficiency is obtained using multidimensional access methods and simplicity is achieved by properly handling metadata. Moreover, clustering and classification techniques on large data sets pose additional requirements in terms of computation and memory scalability and interpretability of results. In this study we review some possible solutions.

  3. Improving health, safety and energy efficiency in New Zealand through measuring and applying basic housing standards.

    PubMed

    Gillespie-Bennett, Julie; Keall, Michael; Howden-Chapman, Philippa; Baker, Michael G

    2013-08-02

    Substandard housing is a problem in New Zealand. Historically there has been little recognition of the important aspects of housing quality that affect people's health and safety. In this viewpoint article we outline the importance of assessing these factors as an essential step to improving the health and safety of New Zealanders and household energy efficiency. A practical risk assessment tool adapted to New Zealand conditions, the Healthy Housing Index (HHI), measures the physical characteristics of houses that affect the health and safety of the occupants. This instrument is also the only tool that has been validated against health and safety outcomes and reported in the international peer-reviewed literature. The HHI provides a framework on which a housing warrant of fitness (WOF) can be based. The HHI inspection takes about one hour to conduct and is performed by a trained building inspector. To maximise the effectiveness of this housing quality assessment we envisage the output having two parts. The first would be a pass/fail WOF assessment showing whether or not the house meets basic health, safety and energy efficiency standards. The second component would rate each main assessment area (health, safety and energy efficiency), potentially on a five-point scale. This WOF system would establish a good minimum standard for rental accommodation as well as encouraging improved housing performance over time. In this article we argue that the HHI is an important, validated, housing assessment tool that will improve housing quality, leading to better health of the occupants, reduced home injuries, and greater energy efficiency. If required, this tool could be extended to also cover resilience to natural hazards, broader aspects of sustainability, and the suitability of the dwelling for occupants with particular needs.

  4. PHYSICO2: an UNIX based standalone procedure for computation of physicochemical, window-dependent and substitution based evolutionary properties of protein sequences along with automated block preparation tool, version 2.

    PubMed

    Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K

    2015-01-01

    Automated genome sequencing is enriching sequence databases very quickly. To achieve a balance between the entry of sequences into the database and their analyses, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public-domain tools, is the most efficient in that it i] extracts physicochemical, window-dependent and homologous-position-based-substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users to prepare the BLOCK-FASTA file by use of the program's Automated Block Preparation Tool, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of the methods. Overall, the program acts as an efficient PWS analyzer and finds application in sequence bioinformatics. PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users.
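
    A "window-dependent" sequence property of the kind PHYSICO2 extracts can be illustrated with a sliding-window average of a per-residue scale. The sketch below uses the Kyte-Doolittle hydropathy scale as the example property; it is an independent illustration of the concept, not PHYSICO2's code or its actual property set.

```python
# Illustration of a window-dependent sequence property: a sliding-window mean
# of a per-residue scale. The Kyte-Doolittle hydropathy scale is used as the
# example; PHYSICO2's own property definitions are not reproduced here.
KYTE_DOOLITTLE = {
    "A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5, "E": -3.5,
    "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8,
    "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2,
}

def window_profile(seq, window=9, scale=KYTE_DOOLITTLE):
    """Mean scale value over each full window along the sequence."""
    values = [scale[aa] for aa in seq.upper()]
    return [
        sum(values[i:i + window]) / window
        for i in range(len(values) - window + 1)
    ]

if __name__ == "__main__":
    profile = window_profile("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", window=9)
    print([round(v, 2) for v in profile[:5]])
```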

  5. PHYSICO2: an UNIX based standalone procedure for computation of physicochemical, window-dependent and substitution based evolutionary properties of protein sequences along with automated block preparation tool, version 2

    PubMed Central

    Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K

    2015-01-01

    Automated genome sequencing is enriching sequence databases very quickly. To achieve a balance between the entry of sequences into the database and their analyses, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public-domain tools, is the most efficient in that it i] extracts physicochemical, window-dependent and homologous-position-based-substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users to prepare the BLOCK-FASTA file by use of the program's Automated Block Preparation Tool, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of the methods. Overall, the program acts as an efficient PWS analyzer and finds application in sequence bioinformatics. Availability: PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users. PMID:26339154

  6. Friction Stir Welding Development at NASA, Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Gentz, Steve (Technical Monitor)

    2001-01-01

    Friction stir welding (FSW) is a solid-state process that can be used to join materials without melting. The process was invented by The Welding Institute (TWI), Cambridge, England. Friction stir welding exhibits several advantages over fusion welding in that it produces welds with fewer defects and higher joint efficiency and is capable of joining alloys that are generally considered non-weldable with a fusion weld process. In 1994, NASA-Marshall began collaborating with TWI to transform FSW from a laboratory curiosity to a viable metal joining process suitable for manufacturing hardware. While teamed with TWI, NASA-Marshall began its own FSW research and development effort to investigate possible aerospace applications for the FSW process. The work involved nearly all aspects of FSW development, including process modeling, scale-up issues, applications to advanced materials and development of tooling to use FSW on components of the Space Shuttle, with particular emphasis on aluminum tanks. The friction stir welding process involves spinning a pin-tool at an appropriate speed, plunging it into the base metal pieces to be joined, and then translating it along the joint of the work pieces. In aluminum alloys the rotating speed typically ranges from 200 to 400 revolutions per minute and the translation speed is approximately two to five inches per minute. The pin-tool is inserted at a small lead angle from the axis normal to the work piece and requires significant loading along the axis of the tool. An anvil or reaction structure is required behind the welded material to react the load along the axis of the pin tool. The process requires no external heat input, filler material, protective shielding gas or inert atmosphere typical of fusion weld processes. The FSW solid-state weld process has resulted in aluminum welds with significantly higher strengths, higher joint efficiencies and fewer defects than fusion welds used to join similar alloys.

  7. Stackfile Database

    NASA Technical Reports Server (NTRS)

    deVarvalho, Robert; Desai, Shailen D.; Haines, Bruce J.; Kruizinga, Gerhard L.; Gilmer, Christopher

    2013-01-01

    This software provides storage, retrieval, and analysis functionality for managing satellite altimetry data. It improves the efficiency and analysis capabilities of existing database software with improved flexibility and documentation. It offers flexibility in the type of data that can be stored. There is efficient retrieval either across the spatial domain or the time domain. Built-in analysis tools are provided for frequently performed altimetry tasks. This software package is used for storing and manipulating satellite measurement data. It was developed with a focus on handling the requirements of repeat-track altimetry missions such as Topex and Jason. It was, however, designed to work with a wide variety of satellite measurement data (e.g., Gravity Recovery And Climate Experiment -- GRACE). The software consists of several command-line tools for importing, retrieving, and analyzing satellite measurement data.

  8. Enterprise tools to promote interoperability: MonitoringResources.org supports design and documentation of large-scale, long-term monitoring programs

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Scully, R. A.; Bayer, J.

    2016-12-01

    Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements - e.g. survey design, standardization of protocols and methods, information management and delivery - that could be met by enterprise tools to promote sustainability, efficiency and interoperability of information across geopolitical boundaries or organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an on-line suite of enterprise tools focused on aquatic systems in the Pacific Northwest Region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.

  9. 3D Slicer as a tool for interactive brain tumor segmentation.

    PubMed

    Kikinis, Ron; Pieper, Steve

    2011-01-01

    User interaction is required for reliable segmentation of brain tumors in clinical practice and in clinical research. By incorporating current research tools, 3D Slicer provides a set of interactive, easy to use tools that can be efficiently used for this purpose. One of the modules of 3D Slicer is an interactive editor tool, which contains a variety of interactive segmentation effects. Use of these effects for fast and reproducible segmentation of a single glioblastoma from magnetic resonance imaging data is demonstrated. The innovation in this work lies not in the algorithm, but in the accessibility of the algorithm because of its integration into a software platform that is practical for research in a clinical setting.

  10. Chemical annotation of small and peptide-like molecules at the Protein Data Bank

    PubMed Central

    Young, Jasmine Y.; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M.

    2013-01-01

    Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org PMID:24291661

  11. Efficient utilization of graphics technology for space animation

    NASA Technical Reports Server (NTRS)

    Panos, Gregory Peter

    1989-01-01

    Efficient utilization of computer graphics technology has become a major investment in the work of aerospace engineers and mission designers. These new tools are having a significant impact in the development and analysis of complex tasks and procedures which must be prepared prior to actual space flight. Design and implementation of useful methods in applying these tools has evolved into a complex interaction of hardware, software, network, video and various user interfaces. Because few people can understand every aspect of this broad mix of technology, many specialists are required to build, train, maintain and adapt these tools to changing user needs. Researchers have set out to create systems where an engineering designer can easily work to achieve goals with a minimum of technological distraction. This was accomplished with high-performance flight simulation visual systems and supercomputer computational horsepower. Control throughout the creative process is judiciously applied while maintaining generality and ease of use to accommodate a wide variety of engineering needs.

  12. Chemical annotation of small and peptide-like molecules at the Protein Data Bank.

    PubMed

    Young, Jasmine Y; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M

    2013-01-01

    Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org.

  13. MoManI: a tool to facilitate research, analysis, and teaching of computer models

    NASA Astrophysics Data System (ADS)

    Howells, Mark; Pelakauskas, Martynas; Almulla, Youssef; Tkaczyk, Alan H.; Zepeda, Eduardo

    2017-04-01

    Allocating limited resources efficiently is a task to which efficient planning and policy design aspire. This may be a non-trivial task. For example, the seventh Sustainable Development Goal (SDG) of Agenda 2030 is to provide access to affordable sustainable energy for all. On the one hand, energy is required to realise almost all other SDGs. (A clinic requires electricity for fridges to store vaccines for maternal health, irrigated agriculture requires energy to pump water to crops in dry periods, etc.) On the other hand, the energy system is non-trivial. It requires the mapping of resources, their conversion into usable energy and then into the machines that we use to meet our needs. That requires new tools that draw from standard techniques and best-in-class models and allow the analyst to develop new models. Thus we present the Model Management Infrastructure (MoManI). MoManI is used to develop, manage and run linear programming models, and to store their input and results data. MoManI is a browser-based open source interface for systems modelling. It is available to various user audiences, from policy makers and planners through to academics. For example, we implement the Open Source energy Modelling System (OSeMOSYS) in MoManI. OSeMOSYS is a specialized energy model generator. A typical OSeMOSYS model would represent the current energy system of a country, region or city; in it, equations and constraints are specified and calibrated to a base year. From that, future technologies and policy options are represented, and from those, scenarios are designed and run. The efficient allocation of energy resources and expenditure on technology is calculated. Finally, results are visualized. At present this is done in relatively rigid interfaces or via (for some) cumbersome text files. Implementing and operating OSeMOSYS in MoManI shortens the learning curve and reduces the phobia associated with the complexity of computer modelling, thereby supporting effective capacity building activities. The novel structure of MoManI allows different teams to collaborate simultaneously from around the globe. Each user can easily edit and update any part of the modelling process: from the underlying mathematical equations of OSeMOSYS through to the visualization of results. Going forward, this tool's flexible structure will make it a potential interface for a larger selection of modelling tools, thus extending its use from OSeMOSYS for energy to other systems modelling, moving beyond SDG7 to others.
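
    The models MoManI manages are linear programs that allocate energy resources at minimum cost. The deliberately tiny LP below, solved with scipy.optimize.linprog, conveys only that idea; the technologies, costs, demand and limits are invented for illustration and bear no relation to an actual OSeMOSYS model.

```python
# Deliberately tiny linear program in the spirit of the OSeMOSYS models that
# MoManI manages: choose generation from two hypothetical technologies to meet
# demand at minimum cost, subject to a resource limit on one of them. All
# numbers and technology names are invented.
from scipy.optimize import linprog

# decision variables: x = [solar_GWh, gas_GWh]
cost = [40.0, 55.0]                 # objective coefficients (cost per GWh)

A_eq = [[1.0, 1.0]]                 # solar + gas == demand
b_eq = [100.0]                      # demand of 100 GWh

bounds = [(0, 60.0), (0, None)]     # solar resource capped at 60 GWh

res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("optimal mix (GWh):", res.x, " total cost:", res.fun)
```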

  14. Unified Lambert Tool for Massively Parallel Applications in Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Woollands, Robyn M.; Read, Julie; Hernandez, Kevin; Probe, Austin; Junkins, John L.

    2018-03-01

    This paper introduces a parallel-compiled tool that combines several of our recently developed methods for solving the perturbed Lambert problem using modified Chebyshev-Picard iteration. This tool (unified Lambert tool) consists of four individual algorithms, each of which is unique and better suited for solving a particular type of orbit transfer. The first is a Keplerian Lambert solver, which is used to provide a good initial guess (warm start) for solving the perturbed problem. It is also used to determine the appropriate algorithm to call for solving the perturbed problem. The arc length or true anomaly angle spanned by the transfer trajectory is the parameter that governs the automated selection of the appropriate perturbed algorithm, and is based on the respective algorithm convergence characteristics. The second algorithm solves the perturbed Lambert problem using the modified Chebyshev-Picard iteration two-point boundary value solver. This algorithm does not require a Newton-like shooting method and is the most efficient of the perturbed solvers presented herein, however the domain of convergence is limited to about a third of an orbit and is dependent on eccentricity. The third algorithm extends the domain of convergence of the modified Chebyshev-Picard iteration two-point boundary value solver to about 90% of an orbit, through regularization with the Kustaanheimo-Stiefel transformation. This is the second most efficient of the perturbed set of algorithms. The fourth algorithm uses the method of particular solutions and the modified Chebyshev-Picard iteration initial value solver for solving multiple revolution perturbed transfers. This method does require "shooting" but differs from Newton-like shooting methods in that it does not require propagation of a state transition matrix. The unified Lambert tool makes use of the General Mission Analysis Tool and we use it to compute thousands of perturbed Lambert trajectories in parallel on the Space Situational Awareness computer cluster at the LASR Lab, Texas A&M University. We demonstrate the power of our tool by solving a highly parallel example problem, that is the generation of extremal field maps for optimal spacecraft rendezvous (and eventual orbit debris removal). In addition we demonstrate the need for including perturbative effects in simulations for satellite tracking or data association. The unified Lambert tool is ideal for but not limited to space situational awareness applications.
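
    All four algorithms rest on Picard iteration, in which an entire trajectory is refined at once via x_{k+1}(t) = x_0 + \int_{t_0}^{t} f(x_k(s)) ds. The sketch below shows only that basic fixed-point idea on a scalar test problem with trapezoidal quadrature; the Chebyshev-polynomial representation, perturbed dynamics and boundary-value handling of the actual tool are deliberately omitted.

```python
# Basic Picard iteration on the scalar test problem x' = -x, x(0) = 1: each
# sweep refines the whole trajectory at once. This only illustrates the
# fixed-point idea underlying modified Chebyshev-Picard iteration; it is not
# the unified Lambert tool's algorithm.
import numpy as np

def picard(f, x0, t_grid, sweeps=20):
    x = np.full_like(t_grid, x0, dtype=float)      # initial guess: constant x0
    for _ in range(sweeps):
        integrand = f(x)
        # cumulative trapezoidal integral of f(x_k) from t0 to each grid point
        integral = np.concatenate(([0.0], np.cumsum(
            0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t_grid))))
        x = x0 + integral
    return x

t = np.linspace(0.0, 2.0, 201)
x = picard(lambda x: -x, 1.0, t)
print("max error vs exp(-t):", np.abs(x - np.exp(-t)).max())
```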

  15. Computer-aided resource planning and scheduling for radiological services

    NASA Astrophysics Data System (ADS)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. A multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.

  16. Web Applications That Promote Learning Communities in Today's Online Classrooms

    ERIC Educational Resources Information Center

    Reigle, Rosemary R.

    2015-01-01

    The changing online learning environment requires that instructors depend less on the standard tools built into most educational learning platforms and turn their focus to use of Open Educational Resources (OERs) and free or low-cost commercial applications. These applications permit new and more efficient ways to build online learning communities…

  17. Supporting Blended-Learning: Tool Requirements and Solutions with OWLish

    ERIC Educational Resources Information Center

    Álvarez, Ainhoa; Martín, Maite; Fernández-Castro, Isabel; Urretavizcaya, Maite

    2016-01-01

    Currently, most of the educational approaches applied to higher education combine face-to-face (F2F) and computer-mediated instruction in a Blended-Learning (B-Learning) approach. One of the main challenges of these approaches is fully integrating the traditional brick-and-mortar classes with online learning environments in an efficient and…

  18. Bridging Archival Standards: Building Software to Translate Metadata Between PDS3 and PDS4

    NASA Astrophysics Data System (ADS)

    De Cesare, C. M.; Padams, J. H.

    2018-04-01

    Transitioning datasets from PDS3 to PDS4 requires manual and detail-oriented work. To increase efficiency and reduce human error, we've built the Label Mapping Tool, which compares a PDS3 label to a PDS4 label template and outputs mappings between the two.
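
    The kind of translation the Label Mapping Tool automates can be pictured as matching PDS3 keyword = value pairs against a template and emitting the corresponding PDS4 XML elements. The sketch below is a toy version of that idea; the keyword-to-element table, element names and sample label are invented and do not follow the real PDS4 schema or the tool's actual output.

```python
# Toy sketch of mapping a PDS3 keyword=value label to PDS4-style XML elements
# via a lookup table. The mapping table and element names are invented for
# illustration; they are not the real PDS4 schema or the Label Mapping Tool.
import xml.etree.ElementTree as ET

PDS3_LABEL = """
PRODUCT_ID      = "EXAMPLE_0001"
INSTRUMENT_NAME = "EXAMPLE CAMERA"
START_TIME      = 2001-01-01T00:00:00
"""

# hypothetical mapping from PDS3 keywords to PDS4 element names
MAPPING = {
    "PRODUCT_ID": "logical_identifier",
    "INSTRUMENT_NAME": "instrument_name",
    "START_TIME": "start_date_time",
}

def pds3_to_pds4(label_text):
    root = ET.Element("Product_Observational")
    for line in label_text.strip().splitlines():
        keyword, _, value = (part.strip() for part in line.partition("="))
        if keyword in MAPPING:
            ET.SubElement(root, MAPPING[keyword]).text = value.strip('"')
    return ET.tostring(root, encoding="unicode")

print(pds3_to_pds4(PDS3_LABEL))
```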

  19. Production Control for a C and C Company

    DTIC Science & Technology

    1975-06-06

    efficiently as they might be. This system does use TAMMS forms and, therefore, does not burden itself with unnecessary administrative requirements...on the way to the final objective. APPENDIX D: U.S. ARMY PRODUCTION CONTROL TOOLS. The detailed explanation of

  20. Real-Time Aerodynamic Flow and Data Visualization in an Interactive Virtual Environment

    NASA Technical Reports Server (NTRS)

    Schwartz, Richard J.; Fleming, Gary A.

    2005-01-01

    Significant advances have been made to non-intrusive flow field diagnostics in the past decade. Camera based techniques are now capable of determining physical qualities such as surface deformation, surface pressure and temperature, flow velocities, and molecular species concentration. In each case, extracting the pertinent information from the large volume of acquired data requires powerful and efficient data visualization tools. The additional requirement for real time visualization is fueled by an increased emphasis on minimizing test time in expensive facilities. This paper will address a capability titled LiveView3D, which is the first step in the development phase of an in depth, real time data visualization and analysis tool for use in aerospace testing facilities.

  1. Parallel Computation of Unsteady Flows on a Network of Workstations

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Parallel computation of unsteady flows requires significant computational resources. The utilization of a network of workstations seems an efficient solution to this problem, since large problems can be treated at a reasonable cost. This approach requires the solution of several problems: 1) the partitioning and distribution of the problem over a network of workstations, 2) efficient communication tools, 3) managing the system efficiently for a given problem. Of course, there is also the question of the efficiency of any given numerical algorithm on such a computing system. The NPARC code was chosen as a sample application. For the explicit version of the NPARC code, both two- and three-dimensional problems were studied, and both steady and unsteady problems were investigated. The issues studied as part of the research program were: 1) how to distribute the data between the workstations, 2) how to compute and communicate efficiently at each node, 3) how to balance the load distribution. In the following, a summary of these activities is presented. Details of the work have been presented and published as referenced.
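
    The report does not include code, but issues 1) and 2) above, data distribution and per-node communication, can be sketched with a generic 1-D domain decomposition and halo exchange. The example below uses mpi4py as a stand-in for the workstation-network communication tools of the era; it is not the NPARC code, and the smoothing update and file name are invented for illustration.

```python
# Generic 1-D domain decomposition with halo exchange: each rank (workstation)
# owns a slab of the grid and trades one-cell ghost regions with its
# neighbours before every update. Illustrative only; not the NPARC solver.
# Run with, e.g.:  mpiexec -n 4 python halo_demo.py   (file name hypothetical)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100                                   # interior cells owned by this rank
u = np.zeros(n_local + 2)                       # +2 ghost cells
u[1:-1] = rank                                  # dummy initial data

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(10):
    # exchange halos: send edge cells, receive neighbours' edges into ghosts
    comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[:1], source=left)
    # simple explicit smoothing update on interior cells
    u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])

print(f"rank {rank}: mean interior value {u[1:-1].mean():.3f}")
```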

  2. Sustainable cooling method for machining titanium alloy

    NASA Astrophysics Data System (ADS)

    Boswell, B.; Islam, M. N.

    2016-02-01

    Hard-to-machine materials such as Titanium Alloy Ti-6Al-4V Grade 5 are notorious for generating high temperatures and adverse reactions between the workpiece and the tool tip materials. These conditions all contribute to an increase in the wear mechanisms, reducing tool life. Titanium alloy, for example, always requires coolant to be used during machining. However, traditional flood cooling needs to be replaced due to environmental issues, and an alternative cooling method found that has minimal impact on the environment. For truly sustainable cooling of the tool it is necessary to account for all energy used in the cooling process, including the energy involved in producing the coolant. Previous research has established that efficient cooling of the tool interface improves the tool life and cutting action. The objective of this research is to determine the most appropriate sustainable cooling method that can also reduce the rate of wear at the tool interface.

  3. Recipe for Success: Digital Viewables

    NASA Technical Reports Server (NTRS)

    LaPha, Steven; Gaydos, Frank

    2014-01-01

    The Engineering Services Contract (ESC) and Information Management Communication Support contract (IMCS) at Kennedy Space Center (KSC) provide services to NASA with respect to flight and ground systems design and development. These groups provide the necessary tools, aid, and best-practice methodologies required for efficient, optimized design and process development. The team is responsible for configuring and implementing systems and software, along with training, documentation, and administering standards. The team supports over 200 engineers and design specialists with the use of Windchill, Creo Parametric, NX, AutoCAD, and a variety of other design and analysis tools.

  4. Twelve tips for blueprinting.

    PubMed

    Coderre, Sylvain; Woloschuk, Wayne; McLaughlin, Kevin

    2009-04-01

    Content validity is a requirement of every evaluation and is achieved when the evaluation content is congruent with the learning objectives and the learning experiences. Congruence between these three pillars of education can be facilitated by blueprinting. Here we describe an efficient process for creating a blueprint and explain how to use this tool to guide all aspects of course creation and evaluation. A well constructed blueprint is a valuable tool for medical educators. In addition to validating evaluation content, a blueprint can also be used to guide selection of curricular content and learning experiences.

  5. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
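
    The benefit the abstract claims, stable and efficient gradient-based optimization when analytic derivatives are available, can be illustrated with generic SciPy usage: the same problem is solved once with an analytic gradient and once with internally finite-differenced gradients. This is not PyCycle or OpenMDAO code, and the Rosenbrock test function below stands in for an engine cycle model purely for illustration.

```python
# Analytic gradient vs finite-difference approximation in gradient-based
# optimization, using the Rosenbrock function as a stand-in objective.
# Generic SciPy usage, not PyCycle/OpenMDAO or an engine cycle model.
import numpy as np
from scipy.optimize import minimize

def f(x):
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

def grad_f(x):
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

x0 = np.array([-1.2, 1.0])
analytic = minimize(f, x0, jac=grad_f, method="BFGS")
finite_diff = minimize(f, x0, method="BFGS")   # gradient approximated internally

print("analytic gradient:   ", analytic.nfev, "function evaluations")
print("finite differences:  ", finite_diff.nfev, "function evaluations")
```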

  6. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    NASA Technical Reports Server (NTRS)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbo-fan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  7. Light-weight Parallel Python Tools for Earth System Modeling Workflows

    NASA Astrophysics Data System (ADS)

    Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.

    2015-12-01

    With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x to an expected 25PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the required download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) perform the transformation of the data from time-slice to time-series format and to (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these two challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second of these challenges, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight, be easy to install, have very few dependencies, and can be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
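
    The two operations the tools perform can be sketched conceptually with xarray: pulling one variable out of many time-slice history files into a single time-series file, then computing a monthly climatology from it. The file patterns and the variable name "TS" are assumptions, and this serial sketch does not reproduce the parallel NCAR tools themselves.

```python
# Conceptual sketch of (1) time-slice -> time-series conversion for a single
# variable and (2) a monthly climatology, written with xarray. File names and
# the variable "TS" are assumed; the actual tools are parallel Python packages
# and far more capable than this serial illustration.
import xarray as xr

# (1) gather one variable from all time-slice history files into one file
ds = xr.open_mfdataset("case.cam.h0.*.nc", combine="by_coords")
ds["TS"].to_netcdf("TS.timeseries.nc")

# (2) climatology: long-term mean for each calendar month
ts = xr.open_dataset("TS.timeseries.nc")["TS"]
climatology = ts.groupby("time.month").mean("time")
climatology.to_netcdf("TS.monthly_climatology.nc")
```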

  8. It is time to improve the quality of medical information distributed to students across social media.

    PubMed

    Zucker, Benjamin E; Kontovounisios, Christos

    2018-01-01

    The ubiquitous nature of social media has meant that its effects on fields outside of social communication have begun to be felt. The generation undergoing medical education are of the generation referred to as "digital natives", and as such routinely incorporate social media into their education. Social media's incorporation into medical education includes its use as a platform to distribute information to the public ("distributive education") and as a platform to provide information to a specific audience ("push education"). These functions have proved beneficial in many regards, such as enabling constant access to the subject matter, other learners, and educators. However, the usefulness of using social media as part of medical education is limited by the vast quantities of poor quality information and the time required to find information of sufficient quality and relevance, a problem confounded by many student's preoccupation with "efficient" learning. In this Perspective, the authors discuss whether social media has proved useful as a tool for medical education. The current growth in the use of social media as a tool for medical education seems to be principally supported by students' desire for efficient learning rather than by the efficacy of social media as a resource for medical education. Therefore, improvements in the quality of information required to maximize the impact of social media as a tool for medical education are required. Suggested improvements include an increase in the amount of educational content distributed on social media produced by academic institutions, such as universities and journals.

  9. Spectral analysis for GNSS coordinate time series using chirp Fourier transform

    NASA Astrophysics Data System (ADS)

    Feng, Shengtao; Bo, Wanju; Ma, Qingzun; Wang, Zifan

    2017-12-01

    Spectral analysis of global navigation satellite system (GNSS) coordinate time series provides a principal tool for understanding the intrinsic mechanisms that affect tectonic movements. Spectral analysis methods such as the fast Fourier transform, the Lomb-Scargle spectrum, the evolutionary power spectrum, the wavelet power spectrum, etc. are used to find periodic characteristics in time series. Among these methods, the chirp Fourier transform (CFT), which has less stringent requirements, is tested with synthetic and actual GNSS coordinate time series, and the results demonstrate the accuracy and efficiency of the method. With the series length limited only to even numbers, CFT provides a convenient tool for windowed spectral analysis. The results for ideal synthetic data prove CFT accurate and efficient, while the results for actual data show that CFT can be used to derive periodic information from GNSS coordinate time series.
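
    The abstract does not give the CFT formulas, so the sketch below only conveys the spirit of band-focused spectral analysis: a direct "zoomed" DFT evaluated on a fine frequency grid around the band of interest, applied to a synthetic daily GNSS-like series with an annual signal. It is not the authors' chirp Fourier transform implementation; a Bluestein/chirp-z formulation would evaluate the same sums far more efficiently than this O(N·K) brute force.

```python
# Direct "zoomed" DFT over a fine frequency grid, as a stand-in for band-
# focused spectral analysis of a GNSS coordinate series. Brute force O(N*K);
# the paper's CFT (and chirp-z algorithms) compute such sums efficiently.
import numpy as np

def zoom_spectrum(x, dt, f_min, f_max, n_freq=500):
    """Amplitude spectrum of samples x (spacing dt, days) on [f_min, f_max] (cycles/day)."""
    n = np.arange(len(x))
    freqs = np.linspace(f_min, f_max, n_freq)
    # complex exponential for every (frequency, sample) pair
    kernel = np.exp(-2j * np.pi * np.outer(freqs, n) * dt)
    amplitude = np.abs(kernel @ x) * 2.0 / len(x)
    return freqs, amplitude

if __name__ == "__main__":
    dt = 1.0                                   # daily position solutions
    t = np.arange(0, 3650) * dt                # ten years of positions (mm)
    series = 3.0 * np.sin(2 * np.pi * t / 365.25) + np.random.normal(0, 1, t.size)
    freqs, amp = zoom_spectrum(series, dt, 1 / 500.0, 1 / 300.0)
    print("peak near period (days):", 1.0 / freqs[np.argmax(amp)])
```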

  10. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    PubMed

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  11. Toolpath Strategy and Optimum Combination of Machining Parameter during Pocket Mill Process of Plastic Mold Steels Material

    NASA Astrophysics Data System (ADS)

    Wibowo, Y. T.; Baskoro, S. Y.; Manurung, V. A. T.

    2018-02-01

    Plastic-based products have spread all over the world and into many aspects of life. Their ability to substitute for other materials is growing stronger and wider, and the use of plastic materials is increasing and has become unavoidable. Plastic-based mass production requires an injection process and hence a mold. The milling of plastic mold steel was done using an HSS end mill cutting tool, which is widely used in small and medium enterprises because it can be resharpened and is relatively inexpensive. Studies of tool geometry state that it has an important effect on quality improvement. Cutting speed, feed rate, depth of cut and radii are input parameters, in addition to the tool path strategy. This paper aims to investigate the input parameters and cutting tool behavior under different tool path strategies. For the sake of experimental efficiency, the Taguchi method and ANOVA were used. The responses studied are surface roughness and cutting behavior. By achieving the expected quality, no additional process is required. Finally, the optimal combination of machining parameters delivers the expected roughness and, of course, a reduced cutting time. In practice, however, SMEs do not make optimal use of such data for cost reduction.
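
    A Taguchi analysis of surface roughness commonly uses the "smaller-is-better" signal-to-noise ratio, S/N = -10·log10(mean(y²)). The sketch below computes that ratio for a few invented runs; the readings and parameter settings are hypothetical and are not the experiment reported in the paper.

```python
# Taguchi "smaller-is-better" signal-to-noise ratio, often used when surface
# roughness is the response: S/N = -10 * log10(mean(y^2)). The runs and Ra
# readings below are invented for illustration.
import math

def sn_smaller_is_better(readings):
    return -10.0 * math.log10(sum(y ** 2 for y in readings) / len(readings))

# hypothetical roughness Ra readings (micrometres) for three parameter settings
runs = {
    "run1 (low speed, low feed)":   [0.82, 0.85, 0.80],
    "run2 (high speed, low feed)":  [0.61, 0.64, 0.66],
    "run3 (high speed, high feed)": [1.10, 1.05, 1.12],
}

for name, readings in runs.items():
    print(f"{name}: S/N = {sn_smaller_is_better(readings):.2f} dB")
```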

  12. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    PubMed

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.

  13. BamTools: a C++ API and toolkit for analyzing and managing BAM files.

    PubMed

    Barnett, Derek W; Garrison, Erik K; Quinlan, Aaron R; Strömberg, Michael P; Marth, Gabor T

    2011-06-15

    Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools.
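    To keep the examples in this document in a single language, the sketch below illustrates the same kind of filtering workflow using the Python library pysam rather than the BamTools C++ API or command-line toolkit described in this record. The file names and MAPQ threshold are placeholders.

    ```python
    import pysam

    # Filter a BAM file by mapping quality, analogous to the filtering/merging
    # workflows a BAM toolkit supports; file names are placeholders.
    with pysam.AlignmentFile("input.bam", "rb") as bam_in, \
         pysam.AlignmentFile("filtered.bam", "wb", template=bam_in) as bam_out:
        kept = 0
        for read in bam_in.fetch(until_eof=True):
            if not read.is_unmapped and read.mapping_quality >= 30:
                bam_out.write(read)
                kept += 1
    print(f"kept {kept} reads with MAPQ >= 30")
    ```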

  14. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many of these tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
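    A minimal sketch of the pattern described above, turning one analysis step into an interactive notebook APP with ipywidgets, is shown below. The CSV file and column handling are hypothetical placeholders standing in for a real analytics pipeline.

    ```python
    # Minimal sketch of an interactive notebook APP built with ipywidgets;
    # the data file and columns are placeholders.
    import pandas as pd
    import matplotlib.pyplot as plt
    from ipywidgets import interact

    df = pd.read_csv("cohort.csv")   # hypothetical tabular healthcare dataset

    def plot_histogram(column, bins=20):
        """Redraw a histogram whenever the widget values change."""
        df[column].dropna().plot.hist(bins=bins)
        plt.xlabel(column)
        plt.show()

    # interact() generates a dropdown and a slider for the function arguments.
    interact(plot_histogram,
             column=list(df.select_dtypes("number").columns),
             bins=(5, 50, 5))
    ```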

  15. Shuttle's 160 hour ground turnaround - A design driver

    NASA Technical Reports Server (NTRS)

    Widick, F.

    1977-01-01

    Turnaround analysis added a new dimension to the Space Program with the advent of the Space Shuttle. The requirement to turn the flight hardware around in 160 working hours from landing to launch was a significant design driver and a useful tool in forcing the integration of flight and ground systems design to permit an efficient ground operation. Although there was concern that time constraints might increase program costs, the result of the analysis was to minimize facility requirements and simplify operations with resultant cost savings.

  16. Near-Infrared Neuroimaging with NinPy

    PubMed Central

    Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas

    2009-01-01

    There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449
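    As an example of the kind of computation such software tools must support, the sketch below applies the modified Beer-Lambert law, a standard step in near-infrared neuroimaging processing, to convert optical-density changes at two wavelengths into hemoglobin concentration changes. This is not NinPy code; the extinction coefficients, source-detector separation, and pathlength factors are placeholder values.

    ```python
    import numpy as np

    # Modified Beer-Lambert law sketch: solve a 2x2 system for oxy-/deoxy-
    # hemoglobin concentration changes from optical-density changes at two
    # wavelengths. All numerical values below are illustrative placeholders.
    ext = np.array([            # rows: wavelengths; cols: [HbO, HbR]
        [0.60, 1.50],           # ~760 nm (placeholder coefficients)
        [1.20, 0.80],           # ~850 nm (placeholder coefficients)
    ])
    distance = 3.0              # source-detector separation (cm)
    dpf = np.array([6.0, 6.0])  # differential pathlength factors (placeholders)

    delta_od = np.array([0.012, 0.018])      # measured OD changes (example)
    effective_path = distance * dpf          # per-wavelength effective path
    delta_conc = np.linalg.solve(ext * effective_path[:, None], delta_od)
    print(f"delta HbO = {delta_conc[0]:.4g}, delta HbR = {delta_conc[1]:.4g}")
    ```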

  17. OPPL-Galaxy, a Galaxy tool for enhancing ontology exploitation as part of bioinformatics workflows

    PubMed Central

    2013-01-01

    Background Biomedical ontologies are key elements for building up the Life Sciences Semantic Web. Reusing and building biomedical ontologies requires flexible and versatile tools to manipulate them efficiently, in particular for enriching their axiomatic content. The Ontology Pre Processor Language (OPPL) is an OWL-based language for automating the changes to be performed in an ontology. OPPL augments the ontologists’ toolbox by providing a more efficient, and less error-prone, mechanism for enriching a biomedical ontology than that obtained by a manual treatment. Results We present OPPL-Galaxy, a wrapper for using OPPL within Galaxy. The functionality delivered by OPPL (i.e. automated ontology manipulation) can be combined with the tools and workflows devised within the Galaxy framework, resulting in an enhancement of OPPL. Use cases are provided in order to demonstrate OPPL-Galaxy’s capability for enriching, modifying and querying biomedical ontologies. Conclusions Coupling OPPL-Galaxy with other bioinformatics tools of the Galaxy framework results in a system that is more than the sum of its parts. OPPL-Galaxy opens a new dimension of analyses and exploitation of biomedical ontologies, including automated reasoning, paving the way towards advanced biological data analyses. PMID:23286517

  18. Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.

    PubMed

    Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the predicted structural information can uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes typically runs to many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage their utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that uses computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data- and task-level parallelism and manages large variations in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present a runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
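    The pilot-based approach described above manages many independent, compute-intensive threading tasks. The sketch below illustrates that task-farming pattern with Python's standard concurrent.futures module rather than SAGA-Pilot itself; the job function and sequence identifiers are placeholders.

    ```python
    from concurrent.futures import ProcessPoolExecutor, as_completed

    def run_threading_job(sequence_id):
        """Placeholder for one compute-intensive threading task (e.g. running a
        threading tool on a single protein sequence)."""
        return sequence_id, f"model-for-{sequence_id}"

    def main():
        sequences = [f"seq{i:04d}" for i in range(100)]   # hypothetical inputs
        results = {}
        # Farm the independent tasks out to a pool of workers and collect the
        # results as they complete -- the task-level parallelism a pilot provides.
        with ProcessPoolExecutor(max_workers=8) as pool:
            futures = [pool.submit(run_threading_job, s) for s in sequences]
            for future in as_completed(futures):
                seq_id, model = future.result()
                results[seq_id] = model
        print(f"completed {len(results)} threading tasks")

    if __name__ == "__main__":
        main()
    ```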

  19. Developing eThread Pipeline Using SAGA-Pilot Abstraction for Large-Scale Structural Bioinformatics

    PubMed Central

    Ragothaman, Anjani; Feinstein, Wei; Jha, Shantenu; Kim, Joohyun

    2014-01-01

    While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the predicted structural information can uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes typically runs to many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage their utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that uses computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data- and task-level parallelism and manages large variations in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present a runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly for small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure. PMID:24995285

  20. Tablet PCs in a Paperless Classroom: Student and Teacher Perceptions on Screen Size

    ERIC Educational Resources Information Center

    Runnels, Judith; Rutson-Griffiths, Arthur

    2013-01-01

    A paperless classroom, in which all materials required to complete a class are available in electronic form, has been shown to have positive impacts on student and teacher motivation, engagement, productivity, and efficiency. Recent trends suggest that of all of the technological tools available, tablet PCs can support many aspects of a paperless…

  1. Development of Guidance for States Transitioning to New Safety Analysis Tools

    ERIC Educational Resources Information Center

    Alluri, Priyanka

    2010-01-01

    With about 125 people dying on US roads each day, the US Department of Transportation heightened the awareness of critical safety issues with the passage of SAFETEA-LU (Safe Accountable Flexible Efficient Transportation Equity Act--A Legacy for Users) legislation in 2005. The legislation required each of the states to develop a Strategic Highway…

  2. A data envelope analysis to assess factors affecting technical and economic efficiency of individual broiler breeder hens.

    PubMed

    Romero, L F; Zuidhof, M J; Jeffrey, S R; Naeima, A; Renema, R A; Robinson, F E

    2010-08-01

    This study evaluated the effect of feed allocation and energetic efficiency on technical and economic efficiency of broiler breeder hens using the data envelope analysis methodology and quantified the effect of variables affecting technical efficiency. A total of 288 Ross 708 pullets were placed in individual cages at 16 wk of age and assigned to 1 of 4 feed allocation groups. Three of them had feed allocated on a group basis with divergent BW targets: standard, high (standard x 1.1), and low (standard x 0.9). The fourth group had feed allocated on an individual bird basis following the standard BW target. Birds were classified in 3 energetic efficiency categories: low, average, and high, based on estimated maintenance requirements. Technical efficiency considered saleable chicks as output and cumulative ME intake and time as inputs. Economic efficiency of feed allocation treatments was analyzed under different cost scenarios. Birds with low feed allocation exhibited a lower technical efficiency (69.4%) than standard (72.1%), which reflected a reduced egg production rate. Feed allocation of the high treatment could have been reduced by 10% with the same chick production as the standard treatment. The low treatment exhibited reduced economic efficiency at greater capital costs, whereas high had reduced economic efficiency at greater feed costs. The average energetic efficiency hens had a lower technical efficiency in the low compared with the standard feed allocation. A 1% increment in estimated maintenance requirement changed technical efficiency by -0.23%, whereas a 1% increment in ME intake had a -0.47% effect. The negative relationship between technical efficiency and ME intake was counterbalanced by a positive correlation of ME intake and egg production. The negative relationship of technical efficiency and maintenance requirements was synergized by a negative correlation of hen maintenance and egg production. Economic efficiency methodologies are effective tools to assess the economic effect of selection and flock management programs because biological, allocative, and economic factors can be independently analyzed.
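    For readers unfamiliar with the method, the sketch below sets up the standard input-oriented CCR envelopment model, one common data envelopment analysis formulation, with scipy.optimize.linprog. The input/output data are invented placeholders, and this is a textbook formulation rather than the exact model used in the study.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy data: 4 decision-making units (hens), inputs = [ME intake, time],
    # output = [saleable chicks]. All numbers are illustrative placeholders.
    X = np.array([[120.0, 40.0], [100.0, 42.0], [130.0, 38.0], [110.0, 45.0]])
    Y = np.array([[55.0], [60.0], [50.0], [58.0]])
    n, m = X.shape          # units, inputs
    s = Y.shape[1]          # outputs

    def ccr_input_efficiency(o):
        """Input-oriented CCR efficiency of unit o (variables: theta, lambdas)."""
        c = np.r_[1.0, np.zeros(n)]                   # minimise theta
        # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.c_[-X[o, :].reshape(m, 1), X.T]
        # Outputs: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.c_[np.zeros((s, 1)), -Y.T]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o, :]],
                      bounds=[(0, None)] * (n + 1))
        return res.x[0]

    for o in range(n):
        print(f"unit {o + 1}: technical efficiency = {ccr_input_efficiency(o):.3f}")
    ```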

  3. Effect of multiple forming tools on geometrical and mechanical properties in incremental sheet forming

    NASA Astrophysics Data System (ADS)

    Wernicke, S.; Dang, T.; Gies, S.; Tekkaya, A. E.

    2018-05-01

    The trend toward a greater variety of products requires economical manufacturing processes suitable for the production of prototypes and small batches. For complex hollow-shaped parts, single point incremental forming (SPIF) is a highly flexible process, but this flexibility comes at the cost of a very long process time. To decrease the process time, a new incremental forming approach with multiple forming tools is investigated. The influence of two incremental forming tools on the resulting mechanical and geometrical component properties, compared to SPIF, is presented. Sheets made of EN AW-1050A were formed into frustums of a pyramid using different tool-path strategies, and several variations of the tool-path strategy are analyzed. A time saving of between 40% and 60% was observed, depending on the tool path and the radii of the forming tools, while the mechanical properties remained unchanged. This knowledge can increase the cost efficiency of incremental forming processes.

  4. Induction Consolidation of Thermoplastic Composites Using Smart Susceptors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsen, Marc R

    2012-06-14

    This project has focused on the area of energy efficient consolidation and molding of fiber reinforced thermoplastic composite components as an energy efficient alternative to the conventional processing methods such as autoclave processing. The expanding application of composite materials in wind energy, automotive, and aerospace provides an attractive energy efficiency target for process development. The intent is to have this efficient processing along with the recyclable thermoplastic materials ready for large scale application before these high production volume levels are reached. Therefore, the process can be implemented in a timely manner to realize the maximum economic, energy, and environmental efficiencies. Under this project an increased understanding of the use of induction heating with smart susceptors applied to consolidation of thermoplastic has been achieved. This was done by the establishment of processing equipment and tooling and the subsequent demonstration of this fabrication technology by consolidating/molding of entry level components for each of the participating industrial segments, wind energy, aerospace, and automotive. This understanding adds to the nation's capability to affordably manufacture high quality lightweight high performance components from advanced recyclable composite materials in a lean and energy efficient manner. The use of induction heating with smart susceptors is a precisely controlled low energy method for the consolidation and molding of thermoplastic composites. The smart susceptor provides intrinsic thermal control based on the interaction with the magnetic field from the induction coil thereby producing highly repeatable processing. The low energy usage is enabled by the fact that only the smart susceptor surface of the tool is heated, not the entire tool. Therefore much less mass is heated resulting in significantly less required energy to consolidate/mold the desired composite components. This energy efficiency results in potential energy savings of approximately 75% as compared to autoclave processing in aerospace, approximately 63% as compared to compression molding in automotive, and approximately 42% energy savings as compared to convectively heated tools in wind energy. The ability to make parts in a rapid and controlled manner provides significant economic advantages for each of the industrial segments. These attributes were demonstrated during the processing of the demonstration components on this project.

  5. Force feedback requirements for efficient laparoscopic grasp control.

    PubMed

    Westebring-van der Putten, Eleonora P; van den Dobbelsteen, John J; Goossens, Richard H M; Jakimowicz, Jack J; Dankelman, Jenny

    2009-09-01

    During laparoscopic grasping, tissue damage may occur due to use of excessive grasp forces and tissue slippage, whereas in barehanded grasping, humans control their grasp to prevent slippage and use of excessive force (safe grasp). This study investigates the differences in grasp control during barehanded and laparoscopic lifts. Ten novices performed lifts in order to compare pinch forces under four conditions: barehanded; using tweezers; a low-efficient grasper; and a high-efficient grasper. Results showed that participants increased their pinch force significantly later during a barehanded lift (at a pull-force level of 2.63 N) than when lifting laparoscopically (from pull-force levels of 0.77 to 1.08 N). In barehanded lifts all participants could accomplish a safe grasp, whereas in laparoscopic lifts excessive force (up to 7.9 N) and slippage (up to 38% of the trials) occurred frequently. For novices, it can be concluded that force feedback (additional to the hand-tool interface), as in skin-tissue contact, is a prerequisite to maintain a safe grasp. Much is known about grasp control during barehanded object manipulation, especially the control of pinch forces to changing loading, whereas little is known about force perception and grasp control during tool usage. This knowledge is a prerequisite for the ergonomic design of tools that are used to manipulate objects.

  6. Automated documentation generator for advanced protein crystal growth

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    To achieve an environment less dependent on the flow of paper, automated techniques of data storage and retrieval must be utilized. This software system, 'Automated Payload Experiment Tool,' seeks to provide a knowledge-based, hypertext environment for the development of NASA documentation. Once developed, the final system should be able to guide a Principal Investigator through the documentation process in a more timely and efficient manner, while supplying more accurate information to the NASA payload developer. The current system is designed for the development of the Science Requirements Document (SRD), the Experiment Requirements Document (ERD), the Project Plan, and the Safety Requirements Document.

  7. Development status of EUV sources for use in beta-tools and high-volume chip manufacturing tools

    NASA Astrophysics Data System (ADS)

    Stamm, U.; Kleinschmidt, J.; Bolshukhin, D.; Brudermann, J.; Hergenhan, G.; Korobotchko, V.; Nikolaus, B.; Schürmann, M. C.; Schriever, G.; Ziener, C.; Borisov, V. M.

    2006-03-01

    In this paper we give an update on the development status of gas discharge produced plasma (GDPP) EUV sources at XTREME technologies. The first commercial prototypes of xenon GDPP sources of the type XTS 13-35, based on the Z-pinch and delivering 35 W in 2π sr, were delivered in 2003 and integrated into micro-exposure tools from Exitech, UK. Micro-exposure tools with these sources were installed in industry in 2004; the first tool has run for more than 100 million pulses without visible degradation of the source collector optics. For the next generation of full-field exposure tools (which we call beta-tools) we are developing GDPP sources with >10 W of power at intermediate focus. These sources also use xenon as fuel, which has the advantage of not introducing additional contamination. Here we describe the basic performance of these sources as well as aspects of collector integration, debris mitigation, and optics lifetime. To achieve the source performance required for high-volume chip manufacturing we consider tin as the source fuel because of its higher conversion efficiency compared to xenon. While we had earlier reported an output power of 400 W in 2π sr from a tin source, we have meanwhile reached 800 W in 2π sr in burst operation. Provided a high-power collector with a realistic collector module efficiency of between 9% and 15% becomes available, these data would support 70-120 W of power at intermediate focus. However, we do not expect that the required duty cycle and electrode lifetimes can be met with this Z-pinch approach based on stationary electrodes. To overcome the lifetime and duty cycle limitations we have investigated GDPP sources with tin fuel and rotating disk electrodes; currently these sources generate more than 200 W in 2π sr at a 4 kHz repetition rate. To achieve 180 W at intermediate focus, which is the current requirement of some exposure tool manufacturers, this type of source would need to operate at a 21-28 kHz repetition rate, which may not be possible for various reasons. In order to make operation at reasonable repetition rates with sufficient power possible, we have investigated various new excitation concepts for the rotating disk electrode configuration. With one of these concepts, pulse energies above 170 mJ in 2π sr have been demonstrated. This approach promises to support 180 W of intermediate focus power at repetition rates between 7 and 10 kHz. It will be developed to the next power level in the following phase of XTREME technologies' high-volume manufacturing source development program.
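    The intermediate-focus power figures quoted above follow from a simple scaling, reproduced in the short sketch below under the record's stated assumptions (800 W in 2π sr and a 9-15% collector module efficiency).

    ```python
    # Rough scaling from the record: intermediate-focus (IF) power is the in-band
    # power radiated into 2*pi sr times the assumed collector module efficiency.
    source_power_2pi = 800.0                     # W in 2*pi sr (tin, burst mode)
    for eff in (0.09, 0.15):                     # assumed collector efficiencies
        print(f"collector efficiency {eff:.0%}: IF power = {source_power_2pi * eff:.0f} W")
    # -> 72 W and 120 W, matching the 70-120 W range quoted in the record.
    ```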

  8. Efficiency of Different Sampling Tools for Aquatic Macroinvertebrate Collections in Malaysian Streams

    PubMed Central

    Ghani, Wan Mohd Hafezul Wan Abdul; Rawi, Che Salmah Md; Hamid, Suhaila Abd; Al-Shami, Salman Abdo

    2016-01-01

    This study analyses the sampling performance of three benthic sampling tools commonly used to collect freshwater macroinvertebrates. The efficiency of qualitative D-frame and square aquatic nets was compared to that of a quantitative Surber sampler in tropical Malaysian streams. The abundance and diversity of macroinvertebrates collected using each tool were evaluated along with their relative variations (RVs). Each tool was used to sample macroinvertebrates from three streams draining different areas: a vegetable farm, a tea plantation and a forest reserve. High macroinvertebrate diversities were recorded using the square net and Surber sampler at the forested stream site; however, very low species abundance was recorded by the Surber sampler. Relatively large variations in the Surber sampler collections (RVs of 36% and 28%) were observed for the vegetable farm and tea plantation streams, respectively. Of the three sampling methods, the square net was the most efficient, collecting a greater diversity of macroinvertebrate taxa and a greater number of specimens (i.e., abundance) overall, particularly from the vegetable farm and the tea plantation streams (RV<25%). Fewer square net sample passes (<8 samples) were sufficient to perform a biological assessment of water quality, but each sample required a slightly longer processing time (±20 min) compared with those gathered via the other samplers. In conclusion, all three apparatuses were suitable for macroinvertebrate collection in Malaysian streams and gathered assemblages that resulted in the determination of similar biological water quality classes using the Family Biotic Index (FBI) and the Biological Monitoring Working Party (BMWP). However, despite a slightly longer processing time, the square net was more efficient (lowest RV) at collecting samples and more suitable for the collection of macroinvertebrates from deep, fast flowing, wadeable streams with coarse substrates. PMID:27019685

  9. Image based method for aberration measurement of lithographic tools

    NASA Astrophysics Data System (ADS)

    Xu, Shuang; Tao, Bo; Guo, Yongxing; Li, Gongfa

    2018-01-01

    Information on the lens aberration of lithographic tools is important, as it directly affects the intensity distribution in the image plane. Zernike polynomials are commonly used for a mathematical description of lens aberrations. Because of their lower cost and easier implementation, image-based measurement techniques have been widely used. Lithographic tools are typically partially coherent systems that can be described by a bilinear model, which entails time-consuming calculations and does not yield a simple and intuitive relationship between lens aberrations and the resulting images. Previous methods for retrieving lens aberrations in such partially coherent systems involve through-focus image measurements and time-consuming iterative algorithms. In this work, we propose a method for aberration measurement in lithographic tools that only requires measuring two intensity distribution images. Two linear formulations are derived in matrix form that directly relate the measured images to the unknown Zernike coefficients. Consequently, an efficient non-iterative solution is obtained.
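    Once the measurements are expressed as a linear function of the Zernike coefficients, retrieving them reduces to a linear least-squares solve. The sketch below illustrates that generic inversion step on a synthetic low-order Zernike basis; it is not the specific matrix formulation derived in this record, and the coefficients are placeholders.

    ```python
    import numpy as np

    # Generic linear-inversion step behind image-based aberration retrieval:
    # once measured data are a linear function of Zernike coefficients
    # (b = A @ c), the coefficients follow from a least-squares solve.
    coords = np.linspace(-1.0, 1.0, 64)
    x, y = np.meshgrid(coords, coords)
    r, theta = np.hypot(x, y), np.arctan2(y, x)
    pupil = r <= 1.0

    zernike = np.stack([                # a few low-order terms (unnormalised)
        np.ones_like(r),                # piston
        r * np.cos(theta),              # tilt x
        r * np.sin(theta),              # tilt y
        2 * r**2 - 1,                   # defocus
        r**2 * np.cos(2 * theta),       # astigmatism
    ], axis=-1)
    A = zernike[pupil]                  # design matrix, one row per pupil pixel

    true_coeffs = np.array([0.0, 0.05, -0.02, 0.10, 0.03])       # synthetic truth
    b = A @ true_coeffs + 1e-3 * np.random.randn(A.shape[0])     # noisy "image" data

    estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.round(estimate, 3))        # recovers the coefficients above
    ```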

  10. TeraStitcher - A tool for fast automatic 3D-stitching of teravoxel-sized microscopy images

    PubMed Central

    2012-01-01

    Background Further advances in modern microscopy are leading to teravoxel-sized tiled 3D images at high resolution, thus increasing the dimension of the stitching problem by at least two orders of magnitude. The existing software solutions do not seem adequate to address the additional requirements arising from these datasets, such as the minimization of memory usage and the need to process just a small portion of data. Results We propose a free and fully automated 3D stitching tool designed to match the special requirements arising from teravoxel-sized tiled microscopy images that is able to stitch them in a reasonable time even on workstations with limited resources. The tool was tested on teravoxel-sized whole mouse brain images with micrometer resolution and it was also compared with the state-of-the-art stitching tools on megavoxel-sized publicly available datasets. This comparison confirmed that the solutions we adopted are suited for stitching very large images and also perform well on datasets with different characteristics. Indeed, some of the algorithms embedded in other stitching tools could be easily integrated in our framework if they turned out to be more effective on other classes of images. To this purpose, we designed a software architecture which separates the strategies that use memory resources efficiently from the algorithms which may depend on the characteristics of the acquired images. Conclusions TeraStitcher is a free tool that enables the stitching of teravoxel-sized tiled microscopy images even on workstations with relatively limited resources of memory (<8 GB) and processing power. It exploits the knowledge of approximate tile positions and uses ad-hoc strategies and algorithms designed for such very large datasets. The produced images can be saved into a multiresolution representation to be efficiently retrieved and processed. We provide TeraStitcher both as a standalone application and as a plugin of the free software Vaa3D. PMID:23181553

  11. Evaluation of the Terminal Precision Scheduling and Spacing System for Near-Term NAS Application

    NASA Technical Reports Server (NTRS)

    Thipphavong, Jane; Martin, Lynne Hazel; Swenson, Harry N.; Lin, Paul; Nguyen, Jimmy

    2012-01-01

    NASA has developed a capability for terminal area precision scheduling and spacing (TAPSS) to provide higher capacity and more efficiently manage arrivals during peak demand periods. This advanced technology is NASA's vision for the NextGen terminal metering capability. A set of human-in-the-loop experiments was conducted to evaluate the performance of the TAPSS system for near-term implementation. The experiments evaluated the TAPSS system under the current terminal routing infrastructure to validate operational feasibility. A second goal of the study was to measure the benefit of the Center and TRACON advisory tools to help prioritize the requirements for controller radar display enhancements. Simulation results indicate that using the TAPSS system provides benefits under current operations, supporting a 10% increase in airport throughput. Enhancements to Center decision support tools had limited impact on improving the efficiency of terminal operations, but did provide more fuel-efficient advisories to achieve scheduling conformance within 20 seconds. The TRACON controller decision support tools were found to provide the most benefit, by improving the precision in schedule conformance to within 20 seconds, reducing the number of arrivals having lateral path deviations by 50% and lowering subjective controller workload. Overall, the TAPSS system was found to successfully develop an achievable terminal arrival metering plan that was sustainable under heavy traffic demand levels and reduce the complexity of terminal operations when coupled with the use of the terminal controller advisory tools.

  12. The Java Image Science Toolkit (JIST) for rapid prototyping and publishing of neuroimaging software.

    PubMed

    Lucas, Blake C; Bogovic, John A; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L; Pham, Dzung L; Landman, Bennett A

    2010-03-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC).

  13. The Java Image Science Toolkit (JIST) for Rapid Prototyping and Publishing of Neuroimaging Software

    PubMed Central

    Lucas, Blake C.; Bogovic, John A.; Carass, Aaron; Bazin, Pierre-Louis; Prince, Jerry L.; Pham, Dzung

    2010-01-01

    Non-invasive neuroimaging techniques enable extraordinarily sensitive and specific in vivo study of the structure, functional response and connectivity of biological mechanisms. With these advanced methods comes a heavy reliance on computer-based processing, analysis and interpretation. While the neuroimaging community has produced many excellent academic and commercial tool packages, new tools are often required to interpret new modalities and paradigms. Developing custom tools and ensuring interoperability with existing tools is a significant hurdle. To address these limitations, we present a new framework for algorithm development that implicitly ensures tool interoperability, generates graphical user interfaces, provides advanced batch processing tools, and, most importantly, requires minimal additional programming or computational overhead. Java-based rapid prototyping with this system is an efficient and practical approach to evaluate new algorithms since the proposed system ensures that rapidly constructed prototypes are actually fully functional processing modules with support for multiple GUIs, a broad range of file formats, and distributed computation. Herein, we demonstrate MRI image processing with the proposed system for cortical surface extraction in large cross-sectional cohorts, provide a system for fully automated diffusion tensor image analysis, and illustrate how the system can be used as a simulation framework for the development of a new image analysis method. The system is released as open source under the Lesser GNU Public License (LGPL) through the Neuroimaging Informatics Tools and Resources Clearinghouse (NITRC). PMID:20077162

  14. Streamlining Collaborative Planning in Spacecraft Mission Architectures

    NASA Technical Reports Server (NTRS)

    Misra, Dhariti; Bopf, Michel; Fishman, Mark; Jones, Jeremy; Kerbel, Uri; Pell, Vince

    2000-01-01

    During the past two decades, the planning and scheduling community has substantially increased the capability and efficiency of individual planning and scheduling systems. Relatively recently, research work to streamline collaboration between planning systems is gaining attention. Spacecraft missions stand to benefit substantially from this work as they require the coordination of multiple planning organizations and planning systems. Up to the present time this coordination has demanded a great deal of human intervention and/or extensive custom software development efforts. This problem will become acute with increased requirements for cross-mission plan coordination and multi-spacecraft mission planning. The Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center is taking innovative steps to define collaborative planning architectures, and to identify coordinated planning tools for Cross-Mission Campaigns. Prototypes are being developed to validate these architectures and assess the usefulness of the coordination tools by the planning community. This presentation will focus on one such planning coordination tool, named Visual Observation Layout Tool (VOLT), which is currently being developed to streamline the coordination between astronomical missions.

  15. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    NASA Astrophysics Data System (ADS)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. In particular, each new user session request requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
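    As a toy illustration of the kind of cluster-level allocation strategy such analysis tools are meant to compare, the sketch below assigns session requests to clusters with a simple first-fit rule and reports the resulting occupancy. The cluster sizes, request stream, and policy are invented placeholders, not the algorithms evaluated in the paper.

    ```python
    # Toy first-fit allocator: each new SDR session requests some processing
    # units and is placed in the first cluster with enough free capacity.
    cluster_capacity = [32, 32, 32, 32]          # processing units per cluster
    free = cluster_capacity[:]                   # remaining units per cluster
    rejected = 0

    session_requests = [4, 6, 3, 8, 5, 7, 2, 6, 9, 4, 5, 3]   # units per session

    for demand in session_requests:
        for i, units in enumerate(free):
            if units >= demand:
                free[i] -= demand
                break
        else:
            rejected += 1                        # no cluster can host the session

    occupancy = 1 - sum(free) / sum(cluster_capacity)
    print(f"occupancy: {occupancy:.0%}, rejected sessions: {rejected}")
    ```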

  16. Predicting the Operational Acceptability of Route Advisories

    NASA Technical Reports Server (NTRS)

    Evans, Antony; Lee, Paul

    2017-01-01

    NASA envisions a future Air Traffic Management system that allows safe, efficient growth in global operations, enabled by increasing levels of automation and autonomy. In a safety-critical system, the introduction of increasing automation and autonomy has to be done in stages, making human-system integrated concepts critical in the foreseeable future. One example where this is relevant is for tools that generate more efficient flight routings or reroute advisories. If these routes are not operationally acceptable, they will be rejected by human operators, and the associated benefits will not be realized. Operational acceptance is therefore required to enable the increased efficiency and reduced workload benefits associated with these tools. In this paper, the authors develop a predictor of operational acceptability for reroute advisories. Such a capability has applications in tools that identify more efficient routings around weather and congestion and that better meet airline preferences. The capability is based on applying data mining techniques to flight plan amendment data reported by the Federal Aviation Administration and to data on requested reroutes collected from a field trial of the NASA-developed Dynamic Weather Routes tool, which advised efficient route changes to American Airlines dispatchers in 2014. Ten-fold cross-validation was used for feature, model and parameter selection, while nested cross-validation was used to validate the model. The model performed well in predicting controller acceptance or rejection of a route change, as indicated by the chosen performance metrics. Features identified as relevant to controller acceptance included the historical usage of the advised route, the location of the maneuver start point relative to the boundaries of the airspace sector containing the maneuver start (the maneuver start sector), the reroute deviation from the original flight plan, and the demand level in the maneuver start sector. A random forest with forty trees was the best-performing of the five models evaluated in this paper.
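    A minimal sketch of the modeling setup described above, a 40-tree random forest evaluated with 10-fold cross-validation using scikit-learn, is shown below. The feature matrix here is synthetic placeholder data (so the score will hover near chance), whereas the study used FAA flight plan amendment data and field-trial reroute data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Placeholder feature matrix standing in for features such as historical
    # route usage, distance from the maneuver start point to the sector
    # boundary, reroute deviation, and sector demand.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))
    y = rng.integers(0, 2, size=500)         # 1 = route accepted, 0 = rejected

    # Forty trees, evaluated with 10-fold cross-validation as in the record.
    model = RandomForestClassifier(n_estimators=40, random_state=0)
    scores = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(f"mean AUC over 10 folds: {scores.mean():.3f}")
    ```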

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lherbier, Louis, W.; Novotnak, David, J.; Herling, Darrell, R.

    Hot forming processes such as forging, die casting and glass forming require tooling that is subjected to high temperatures during the manufacturing of components. Current tooling is adversely affected by prolonged exposure at high temperatures. Initial studies were conducted to determine the root cause of tool failures in a number of applications. Results show that tool failures vary and depend on the operating environment under which they are used. Major root cause failures include (1) thermal softening, (2) fatigue and (3) tool erosion, all of which are affected by process boundary conditions such as lubrication, cooling, process speed, etc. While thermal management is a key to addressing tooling failures, it was clear that new tooling materials with superior high temperature strength could provide improved manufacturing efficiencies. These efficiencies are based on the use of functionally graded materials (FGM), a new subset of hybrid tools with customizable properties that can be fabricated using advanced powder metallurgy manufacturing technologies. Modeling studies of the various hot forming processes helped identify the effect of key variables such as stress, temperature and cooling rate, and aided in the selection of tooling materials for specific applications. To address the problem of high temperature strength, several advanced powder metallurgy nickel and cobalt based alloys were selected for evaluation. These materials were manufactured into tooling using two relatively new consolidation processes. One process involved laser powder deposition (LPD) and the second involved a solid state dynamic powder consolidation (SSDPC) process. These processes made possible functionally graded materials (FGM) that resulted in shaped tooling that was monolithic, bi-metallic or substrate coated. Manufacturing of tooling with these processes was determined to be robust and consistent for a variety of materials. Prototype and production testing of FGM tooling showed the benefits of the nickel and cobalt based powder metallurgy alloys in a number of applications evaluated. Improvements in tool life ranged from three (3) to twenty (20) times or more compared with currently used tooling. Improvements were most dramatic where tool softening and deformation were the major cause of tool failures in hot/warm forging applications. Significant improvement was also noted in erosion of aluminum die casting tooling. Cost and energy savings can be realized as a result of increased tooling life, increased productivity and a reduction in scrap because of improved dimensional controls. Although LPD and SSDPC tooling usually have higher acquisition costs, net tooling costs per component produced drop dramatically with superior tool performance. Less energy is used to manufacture the tooling because fewer tools are required and less recycling of used tools is needed for the hot forming process. Energy is saved during the component manufacturing cycle because more parts can be produced in shorter periods of time. Energy is also saved by minimizing heating furnace idling time because of less downtime for tooling changes.

  18. Intelligent Processing Equipment Within the Environmental Protection Agency

    NASA Technical Reports Server (NTRS)

    Greathouse, Daniel G.; Nalesnik, Richard P.

    1992-01-01

    Protection of the environment and environmental remediation requires the cooperation, at all levels, of government and industry. Intelligent processing equipment, in addition to other artificial intelligence based tools, was used by the Environmental Protection Agency to provide personnel safety and improve the efficiency of those responsible for protection and remediation of the environment. These exploratory efforts demonstrate the feasibility and utility of expanding development and widespread use of these tools. A survey of current intelligent processing equipment applications in the Agency is presented and is followed by a brief discussion of possible uses in the future.

  19. Precision injection molding of freeform optics

    NASA Astrophysics Data System (ADS)

    Fang, Fengzhou; Zhang, Nan; Zhang, Xiaodong

    2016-08-01

    Precision injection molding is the most efficient mass production technology for manufacturing plastic optics. Applications of plastic optics in field of imaging, illumination, and concentration demonstrate a variety of complex surface forms, developing from conventional plano and spherical surfaces to aspheric and freeform surfaces. It requires high optical quality with high form accuracy and lower residual stresses, which challenges both optical tool inserts machining and precision injection molding process. The present paper reviews recent progress in mold tool machining and precision injection molding, with more emphasis on precision injection molding. The challenges and future development trend are also discussed.

  20. Understanding Patient Experience Using Internet-based Email Surveys: A Feasibility Study at Mount Sinai Hospital.

    PubMed

    Morgan, Matthew; Lau, Davina; Jivraj, Tanaz; Principi, Tania; Dietrich, Sandra; Bell, Chaim M

    2015-01-01

    Email is becoming a widely accepted communication tool in healthcare settings. This study sought to test the feasibility of Internet-based email surveys of patient experience in the ambulatory setting. We conducted a study of email Internet-based surveys sent to patients in selected ambulatory clinics at Mount Sinai Hospital in Toronto, Canada. Our findings suggest that email links to Internet surveys are a feasible, timely and efficient method to solicit patient feedback about their experience. Further research is required to optimally leverage Internet-based email surveys as a tool to better understand the patient experience.

  1. A microwave applicator for uniform irradiation by circularly polarized waves in an anechoic chamber

    NASA Astrophysics Data System (ADS)

    Chiang, W. Y.; Wu, M. H.; Wu, K. L.; Lin, M. H.; Teng, H. H.; Tsai, Y. F.; Ko, C. C.; Yang, E. C.; Jiang, J. A.; Barnett, L. R.; Chu, K. R.

    2014-08-01

    Microwave applicators are widely employed for materials heating in scientific research and industrial applications, such as food processing, wood drying, ceramic sintering, chemical synthesis, waste treatment, and insect control. For the majority of microwave applicators, materials are heated in the standing waves of a resonant cavity, which can be highly efficient in energy consumption, but often lacks the field uniformity and controllability required for a scientific study. Here, we report a microwave applicator for rapid heating of small samples by highly uniform irradiation. It features an anechoic chamber, a 24-GHz microwave source, and a linear-to-circular polarization converter. With a rather low energy efficiency, such an applicator functions mainly as a research tool. This paper discusses the significance of its special features and describes the structure, in situ diagnostic tools, calculated and measured field patterns, and a preliminary heating test of the overall system.

  2. Monitoring of European corn borer with pheromone-baited traps: review of trapping system basics and remaining problems.

    PubMed

    Laurent, Pélozuelo; Frérot, Brigitte

    2007-12-01

    Since the identification of female European corn borer, Ostrinia nubilalis (Hübner) pheromone, pheromone-baited traps have been regarded as a promising tool to monitor populations of this pest. This article reviews the literature produced on this topic since the 1970s. Its aim is to provide extension entomologists and other researchers with all the necessary information to establish an efficient trapping procedure for this moth. The different pheromone races of the European corn borer are described, and research results relating to the optimization of pheromone blend, pheromone bait, trap design, and trap placement are summarized followed by a state-of-the-art summary of data comparing blacklight trap and pheromone-baited trap techniques to monitor European corn borer flight. Finally, we identify the information required to definitively validate/invalidate the pheromone-baited traps as an efficient decision support tool in European corn borer control.

  3. A microwave applicator for uniform irradiation by circularly polarized waves in an anechoic chamber.

    PubMed

    Chiang, W Y; Wu, M H; Wu, K L; Lin, M H; Teng, H H; Tsai, Y F; Ko, C C; Yang, E C; Jiang, J A; Barnett, L R; Chu, K R

    2014-08-01

    Microwave applicators are widely employed for materials heating in scientific research and industrial applications, such as food processing, wood drying, ceramic sintering, chemical synthesis, waste treatment, and insect control. For the majority of microwave applicators, materials are heated in the standing waves of a resonant cavity, which can be highly efficient in energy consumption, but often lacks the field uniformity and controllability required for a scientific study. Here, we report a microwave applicator for rapid heating of small samples by highly uniform irradiation. It features an anechoic chamber, a 24-GHz microwave source, and a linear-to-circular polarization converter. With a rather low energy efficiency, such an applicator functions mainly as a research tool. This paper discusses the significance of its special features and describes the structure, in situ diagnostic tools, calculated and measured field patterns, and a preliminary heating test of the overall system.

  4. Input-output identification of controlled discrete manufacturing systems

    NASA Astrophysics Data System (ADS)

    Estrada-Vargas, Ana Paula; López-Mellado, Ernesto; Lesage, Jean-Jacques

    2014-03-01

    The automated construction of discrete event models from observations of external system's behaviour is addressed. This problem, often referred to as system identification, allows obtaining models of ill-known (or even unknown) systems. In this article, an identification method for discrete event systems (DESs) controlled by a programmable logic controller is presented. The method allows processing a large quantity of observed long sequences of input/output signals generated by the controller and yields an interpreted Petri net model describing the closed-loop behaviour of the automated DESs. The proposed technique allows the identification of actual complex systems because it is sufficiently efficient and well adapted to cope with both the technological characteristics of industrial controllers and data collection requirements. Based on polynomial-time algorithms, the method is implemented as an efficient software tool which constructs and draws the model automatically; an overview of this tool is given through a case study dealing with an automated manufacturing system.
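    The front end of such identification methods is typically the extraction of events from consecutive observed I/O vectors. The sketch below shows only that step, with invented signal names and samples; constructing the interpreted Petri net from the resulting event sequence is the separate, more involved part of the method.

    ```python
    # Derive "events" from an observed sequence of controller I/O vectors: an
    # event is the set of signals whose values change between two consecutive
    # samples. Signal names and sampled vectors below are placeholders.
    signals = ["start", "sensor_a", "valve", "motor"]
    observed = [
        (0, 0, 0, 0),
        (1, 0, 0, 0),
        (1, 1, 0, 0),
        (1, 1, 1, 1),
        (0, 1, 1, 1),
    ]

    events = []
    for before, after in zip(observed, observed[1:]):
        changes = tuple(f"{name}{'+' if b < a else '-'}"
                        for name, b, a in zip(signals, before, after) if b != a)
        events.append(changes)

    for step, event in enumerate(events, start=1):
        print(f"event {step}: {event}")
    ```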

  5. BamTools: a C++ API and toolkit for analyzing and managing BAM files

    PubMed Central

    Barnett, Derek W.; Garrison, Erik K.; Quinlan, Aaron R.; Strömberg, Michael P.; Marth, Gabor T.

    2011-01-01

    Motivation: Analysis of genomic sequencing data requires efficient, easy-to-use access to alignment results and flexible data management tools (e.g. filtering, merging, sorting, etc.). However, the enormous amount of data produced by current sequencing technologies is typically stored in compressed, binary formats that are not easily handled by the text-based parsers commonly used in bioinformatics research. Results: We introduce a software suite for programmers and end users that facilitates research analysis and data management using BAM files. BamTools provides both the first C++ API publicly available for BAM file support as well as a command-line toolkit. Availability: BamTools was written in C++, and is supported on Linux, Mac OSX and MS Windows. Source code and documentation are freely available at http://github.org/pezmaster31/bamtools. Contact: barnetde@bc.edu PMID:21493652

  6. A model for flexible tools used in minimally invasive medical virtual environments.

    PubMed

    Soler, Francisco; Luzon, M Victoria; Pop, Serban R; Hughes, Chris J; John, Nigel W; Torres, Juan Carlos

    2011-01-01

    Within the limits of current technology, many applications of a virtual environment will trade off accuracy for speed. This is not an acceptable compromise in a medical training application where both are essential. Efficient algorithms must therefore be developed. The purpose of this project is the development and validation of a novel physics-based real-time tool manipulation model, which is easy to integrate into any medical virtual environment that requires support for the insertion of long flexible tools into complex geometries. This encompasses medical specialities such as vascular interventional radiology, endoscopy, and laparoscopy, where training, prototyping of new instruments/tools and mission rehearsal can all be facilitated by using an immersive medical virtual environment. Our model accurately recognises and uses patient-specific data and adapts to the geometrical complexity of the vessel in real time.

  7. Pointo - a Low Cost Solution to Point Cloud Processing

    NASA Astrophysics Data System (ADS)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, is becoming an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as large packages containing a variety of methods and tools. This results in software that is very expensive to acquire and also very difficult to use. The difficulty of use is caused by the complicated user interfaces required to accommodate a large list of features. These complex packages aim to provide a powerful tool for a specific group of specialists, but most of their features are not required by the majority of up-and-coming average users of point clouds. In addition to their complexity and high cost, these packages generally rely on expensive, modern hardware and are compatible with only one specific operating system. Many point cloud customers are not point cloud processing experts and are not willing to pay the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce the cost and complexity of the software, our approach focuses on one functionality at a time, in contrast with most available software and tools, which aim to solve as many problems as possible at the same time. Our simple, user-oriented design improves the user experience and allows us to optimize our methods to create efficient software. In this paper we introduce the Pointo family as a series of connected software tools that provide easy-to-use functionality with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to the point clouds.

  8. A New Wnt1-CRE TomatoRosa Embryonic Stem Cell Line: A Tool for Studying Neural Crest Cell Integration Capacity.

    PubMed

    Acuna-Mendoza, Soledad; Martin, Sabrina; Kuchler-Bopp, Sabine; Ribes, Sandy; Thalgott, Jérémy; Chaussain, Catherine; Creuzet, Sophie; Lesot, Hervé; Lebrin, Franck; Poliard, Anne

    2017-12-01

    Neural crest (NC) cells are a migratory, multipotent population giving rise to numerous lineages in the embryo. Their plasticity renders attractive their use in tissue engineering-based therapies, but further knowledge on their in vivo behavior is required before clinical transfer may be envisioned. We here describe the isolation and characterization of a new mouse embryonic stem (ES) line derived from Wnt1-CRE-R26 Rosa TomatoTdv blastocyst and show that it displays the characteristics of typical ES cells. Further, these cells can be efficiently directed toward an NC stem cell-like phenotype as attested by concomitant expression of NC marker genes and Tomato fluorescence. As native NC progenitors, they are capable of differentiating toward typical derivative phenotypes and interacting with embryonic tissues to participate in the formation of neo-structures. Their specific fluorescence allows purification and tracking in vivo. This cellular tool should facilitate a better understanding of the mechanisms driving NC fate specification and help identify the key interactions developed within a tissue after in vivo implantation. Altogether, this novel model may provide important knowledge to optimize NC stem cell graft conditions, which are required for efficient tissue repair.

  9. RTU Comparison Calculator Enhancement Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, James D.; Wang, Weimin; Katipamula, Srinivas

    Over the past two years, the Department of Energy’s Building Technologies Office (BTO) has been investigating ways to increase the operating efficiency of packaged rooftop units (RTUs) in the field. First, by issuing a challenge to RTU manufacturers to increase the integrated energy efficiency ratio (IEER) by 60% over the existing ASHRAE 90.1-2010 standard. Second, by evaluating the performance of an advanced RTU controller that reduces energy consumption by over 40%. BTO has previously also funded development of an RTU comparison calculator (RTUCC). RTUCC is a web-based tool that provides the user a way to compare energy and cost savings for two units with different efficiencies. However, the RTUCC currently cannot compare savings associated with either the RTU Challenge unit or the advanced RTU controls retrofit. Therefore, BTO has asked PNNL to enhance the tool so building owners can compare energy and cost savings associated with this new class of products. This document provides the details of the enhancements that are required to support estimating energy savings from use of RTU Challenge units or advanced controls on existing RTUs.
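
    The core comparison RTUCC performs can be illustrated with a simplified, load-based calculation. The sketch below is an assumption-laden approximation, not the tool's actual algorithm: it converts an annual cooling load in ton-hours to electricity use via the IEER rating and compares a baseline unit with a higher-efficiency one.

```python
# Illustrative sketch of an RTU energy/cost comparison (simplified assumption,
# not RTUCC's actual method). IEER is treated as Btu of cooling per W-h of input.
def annual_cooling_energy_kwh(cooling_load_ton_hours, ieer):
    """Electricity use in kWh given an annual load (ton-hours) and an IEER rating."""
    return cooling_load_ton_hours * 12_000.0 / ieer / 1000.0  # 1 ton-hour = 12,000 Btu

def compare_rtus(load_ton_hours, ieer_base, ieer_new, electricity_price=0.10):
    e_base = annual_cooling_energy_kwh(load_ton_hours, ieer_base)
    e_new = annual_cooling_energy_kwh(load_ton_hours, ieer_new)
    savings_kwh = e_base - e_new
    return savings_kwh, savings_kwh * electricity_price

if __name__ == "__main__":
    # Hypothetical building load; the second unit has ~60% higher IEER.
    kwh, dollars = compare_rtus(20_000, ieer_base=11.0, ieer_new=17.6)
    print(f"Savings: {kwh:.0f} kWh/year, ${dollars:.0f}/year")
```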

  10. Study on electroplating technology of diamond tools for machining hard and brittle materials

    NASA Astrophysics Data System (ADS)

    Cui, Ying; Chen, Jian Hua; Sun, Li Peng; Wang, Yue

    2016-10-01

    With the development of high-speed cutting, ultra-precision machining and ultrasonic vibration techniques for processing hard and brittle materials, the requirements on cutting tools are becoming ever higher. As electroplated diamond tools have distinct advantages, such as high adaptability, high durability, long service life and good dimensional stability, these tools are effectively and extensively used in grinding hard and brittle materials. In this paper, the coating structure of an electroplated diamond tool is described. The electroplating process flow is presented, and the influence of pretreatment on machining quality is analyzed. Through experimental research, a reasonable electrolyte formula, the electroplating process parameters and a suitable sanding method were determined. Meanwhile, a drilling experiment on glass-ceramic shows that the electroplating process can effectively improve the cutting performance of diamond tools. This work lays a good foundation for further improving the quality and efficiency of machining hard and brittle materials.

  11. Development of the Next Generation of Biogeochemistry Simulations Using EMSL's NWChem Molecular Modeling Software

    NASA Astrophysics Data System (ADS)

    Bylaska, E. J.; Kowalski, K.; Apra, E.; Govind, N.; Valiev, M.

    2017-12-01

    Methods of directly simulating the behavior of complex strongly interacting atomic systems (molecular dynamics, Monte Carlo) have provided important insight into the behavior of nanoparticles, biogeochemical systems, mineral/fluid systems, actinide systems and geofluids. The limitation of these methods to even wider applications is the difficulty of developing accurate potential interactions in these systems at the molecular level that capture their complex chemistry. The well-developed tools of quantum chemistry and physics have been shown to approach the accuracy required. However, despite the continuous effort being put into improving their accuracy and efficiency, these tools will be of little value to condensed matter problems without continued improvements in techniques to traverse and sample the high-dimensional phase space needed to span the ~10^12 time scale differences between molecular simulation and chemical events. In recent years, we have made considerable progress in developing electronic structure and AIMD methods tailored to treat biochemical and geochemical problems, including very efficient implementations of many-body methods, fast exact exchange methods, electron-transfer methods, excited state methods, QM/MM, and new parallel algorithms that scale to over 100,000 cores. The poster will focus on the fundamentals of these methods and the realities in terms of system size, computational requirements and simulation times that are required for their application to complex biogeochemical systems.

  12. VIPER: Visualization Pipeline for RNA-seq, a Snakemake workflow for efficient and complete RNA-seq analysis.

    PubMed

    Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W

    2018-04-12

    RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake we have developed a user friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand the capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.

  13. Energy Losses Estimation During Pulsed-Laser Seam Welding

    NASA Astrophysics Data System (ADS)

    Sebestova, Hana; Havelkova, Martina; Chmelickova, Hana

    2014-06-01

    The finite-element tool SYSWELD (ESI Group, Paris, France) was adapted to simulate pulsed-laser seam welding. Besides the temperature field distribution, one of the possible outputs of the welding simulation is the amount of absorbed power necessary to melt the required material volume, including energy losses. By comparing the absorbed or melting energy with the applied laser energy, welding efficiencies can be calculated. This article presents the results of welding efficiency estimation based on the assimilation of both experimental and simulation output data for pulsed Nd:YAG laser bead-on-plate welding of 0.6-mm-thick AISI 304 stainless steel sheets using different beam powers.
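
    The efficiency estimate described above amounts to comparing the energy theoretically needed to melt the weld volume with the laser energy actually applied. The back-of-the-envelope sketch below uses approximate property values for AISI 304 and illustrative pulse parameters; it is not the SYSWELD-based calculation from the article.

```python
# Back-of-the-envelope melting-efficiency estimate: energy needed to melt the
# weld volume divided by the laser energy applied. Material constants for
# AISI 304 are approximate; pulse parameters are illustrative assumptions.
RHO = 7.9e3        # density, kg/m^3
CP = 500.0         # specific heat, J/(kg K)
LATENT = 2.7e5     # latent heat of fusion, J/kg
T_MELT, T_AMB = 1450.0, 20.0   # degrees C

def melting_energy(weld_volume_m3):
    """Energy needed to heat the weld volume to the melting point and melt it (J)."""
    return RHO * weld_volume_m3 * (CP * (T_MELT - T_AMB) + LATENT)

def melting_efficiency(weld_volume_m3, pulse_energy_j, n_pulses):
    applied = pulse_energy_j * n_pulses
    return melting_energy(weld_volume_m3) / applied

# Example: 0.6 mm sheet, seam 20 mm long and ~1 mm wide, fully penetrated.
volume = 0.020 * 0.001 * 0.0006   # m^3
print(f"melting efficiency ~ {melting_efficiency(volume, 5.0, 40):.2%}")
```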

  14. An aspect-oriented approach for designing safety-critical systems

    NASA Astrophysics Data System (ADS)

    Petrov, Z.; Zaykov, P. G.; Cardoso, J. P.; Coutinho, J. G. F.; Diniz, P. C.; Luk, W.

    The development of avionics systems is typically a tedious and cumbersome process. In addition to the required functions, developers must consider various and often conflicting non-functional requirements such as safety, performance, and energy efficiency. Certainly, an integrated approach with a seamless design flow that is capable of requirements modelling and supports refinement down to an actual implementation in a traceable way may lead to a significant acceleration of development cycles. This paper presents an aspect-oriented approach supported by a tool chain that deals with functional and non-functional requirements in an integrated manner. It also discusses how the approach can be applied to the development of safety-critical systems and provides experimental results.

  15. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.
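
    The optimization loop described above can be pictured as a gradient-based, constrained minimization in which analytical sensitivities of the objective and of the stability constraints are supplied to the optimizer. The sketch below uses a toy three-variable model with made-up sensitivities (not the paper's aeroservoelastic formulation) to show the structure of such a problem with SciPy's SLSQP.

```python
# Schematic of a gradient-based, constrained structural optimization: minimize
# weight subject to a stability-margin constraint, with analytical sensitivities.
# The model, numbers, and constraint are placeholders, not the cited formulation.
import numpy as np
from scipy.optimize import minimize

def weight(x):               # structural weight to minimize
    return np.sum(x)

def weight_grad(x):          # analytical sensitivity of the objective
    return np.ones_like(x)

def flutter_margin(x):       # toy surrogate: margin grows with member sizing
    return np.dot(np.array([3.0, 2.0, 1.5]), x) - 10.0   # must stay >= 0

def flutter_margin_grad(x):
    return np.array([3.0, 2.0, 1.5])

x0 = np.array([2.0, 2.0, 2.0])   # initial design variables
res = minimize(weight, x0, jac=weight_grad, method="SLSQP",
               constraints=[{"type": "ineq", "fun": flutter_margin,
                             "jac": flutter_margin_grad}],
               bounds=[(0.5, 5.0)] * 3)
print(res.x, res.fun)            # sized design and its weight
```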

  16. Multidisciplinary optimization of aeroservoelastic systems using reduced-size models

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1992-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  17. The Hematopoietic Expression Viewer: expanding mobile apps as a scientific tool.

    PubMed

    James, Regis A; Rao, Mitchell M; Chen, Edward S; Goodell, Margaret A; Shaw, Chad A

    2012-07-15

    Many important data in current biological science comprise hundreds, thousands or more individual results. These massive data require computational tools to navigate results and interact effectively with the content. Mobile device apps are an increasingly important tool in the everyday lives of scientists and non-scientists alike. Such software presents individuals with compact and efficient tools to interact with complex data at meetings or other locations remote from their main computing environment. We believe that apps will be important tools for biologists, geneticists and physicians to review content while participating in biomedical research or practicing medicine. We have developed a prototype app for displaying gene expression data using the iOS platform. To present the software engineering requirements, we review the model-view-controller schema for Apple's iOS. We apply this schema to a simple app for querying locally developed microarray gene expression data. The challenge of this application is to balance storing content locally within the app against obtaining it dynamically via a network connection. The Hematopoietic Expression Viewer is available at http://www.shawlab.org/he_viewer. The source code for this project and any future information on how to obtain the app can be accessed at http://www.shawlab.org/he_viewer.

  18. Automated payload experiment tool feasibility study

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Clark, James; Delugach, Harry; Hammons, Charles; Logan, Julie; Provancha, Anna

    1991-01-01

    To achieve an environment less dependent on the flow of paper, automated techniques of data storage and retrieval must be utilized. The prototype under development seeks to demonstrate the ability of a knowledge-based, hypertext computer system. This prototype is concerned with the logical links between two primary NASA support documents, the Science Requirements Document (SRD) and the Engineering Requirements Document (ERD). Once developed, the final system should have the ability to guide a principal investigator through the documentation process in a more timely and efficient manner, while supplying more accurate information to the NASA payload developer.

  19. The Design and Development of a Management Information System for the Monterey Navy Flying Club.

    DTIC Science & Technology

    1986-03-27

    Management Information System for the Monterey Navy Flying Club. It supplies the tools necessary to enable the club manager to maintain all club records and generate required administrative and financial reports. The Monterey Navy Flying Club has one of the largest memberships of the Navy-sponsored flying clubs. As a result of this large membership and the amount of manual paperwork required to properly maintain club records, the Manager’s ability to provide necessary services and reports is severely hampered. The implementation of an efficient

  20. An Interactive, Web-Based Approach to Metadata Authoring

    NASA Technical Reports Server (NTRS)

    Pollack, Janine; Wharton, Stephen W. (Technical Monitor)

    2001-01-01

    NASA's Global Change Master Directory (GCMD) serves a growing number of users by assisting the scientific community in the discovery of and linkage to Earth science data sets and related services. The GCMD holds over 8000 data set descriptions in Directory Interchange Format (DIF) and 200 data service descriptions in Service Entry Resource Format (SERF), encompassing the disciplines of geology, hydrology, oceanography, meteorology, and ecology. Data descriptions also contain geographic coverage information, thus allowing researchers to discover data pertaining to a particular geographic location, as well as subject of interest. The GCMD strives to be the preeminent data locator for world-wide directory level metadata. In this vein, scientists and data providers must have access to intuitive and efficient metadata authoring tools. Existing GCMD tools are not currently attracting widespread usage. With usage being the prime indicator of utility, it has become apparent that current tools must be improved. As a result, the GCMD has released a new suite of web-based authoring tools that enable a user to create new data and service entries, as well as modify existing data entries. With these tools, a more interactive approach to metadata authoring is taken, as they feature a visual "checklist" of data/service fields that automatically updates when a field is completed. In this way, the user can quickly gauge which of the required and optional fields have not been populated. With the release of these tools, the Earth science community will be further assisted in efficiently creating quality data and services metadata. Keywords: metadata, Earth science, metadata authoring tools

  1. Computational Challenges of Viscous Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Kim, Chang Sung

    2004-01-01

    Over the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the computational fluid dynamics (CFD) discipline. Although incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools become increasingly important in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges faced in computing incompressible flows.

  2. Monitoring Java Programs with Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

    We present recent work on the development of Java PathExplorer (JPAX), a tool for monitoring the execution of Java programs. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety-critical systems. The tool facilitates automated instrumentation of a program's byte code, which will then emit events to an observer during its execution. The observer checks the events against user-provided high-level requirement specifications, for example temporal logic formulae, and against lower-level error detection procedures, for example concurrency-related algorithms such as deadlock and data race detection. High-level requirement specifications together with their underlying logics are defined in the Maude rewriting logic, and can then either be checked directly using the Maude rewriting engine, or be first translated to efficient data structures and then checked in Java.
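
    The observer idea can be illustrated with a small, self-contained monitor that consumes an event stream and checks a simple temporal property. This is plain Python, not JPAX or its Maude-based specifications; the events and the property are hypothetical.

```python
# Conceptual sketch of event-based runtime monitoring: an instrumented program
# emits events and an observer checks a simple property such as
# "every acquired lock is eventually released". Not the JPAX implementation.
class LockDisciplineMonitor:
    def __init__(self):
        self.held = set()
        self.violations = []

    def on_event(self, kind, lock):
        if kind == "acquire":
            if lock in self.held:
                self.violations.append(f"re-acquire of {lock}")
            self.held.add(lock)
        elif kind == "release":
            if lock not in self.held:
                self.violations.append(f"release of unheld {lock}")
            self.held.discard(lock)

    def on_end(self):
        # Any lock still held at program end violates the property.
        self.violations.extend(f"{lock} never released" for lock in self.held)
        return self.violations

monitor = LockDisciplineMonitor()
trace = [("acquire", "A"), ("acquire", "B"), ("release", "B")]  # hypothetical trace
for kind, lock in trace:
    monitor.on_event(kind, lock)
print(monitor.on_end())   # -> ['A never released']
```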

  3. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
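
    The abstract does not publish the ISWE database layout, but the kind of relational structure such a tool needs can be sketched as follows: requirements with traceability links, plus verification records that carry the method, success criteria, and compliance status. The schema and sample rows below are purely illustrative.

```python
# Illustrative sketch of an RVC-style schema (hypothetical, not the actual
# ISWE database design): requirements, traceability, verification, compliance.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE requirement (
    req_id      TEXT PRIMARY KEY,
    description TEXT NOT NULL,
    parent_id   TEXT REFERENCES requirement(req_id)    -- traceability link
);
CREATE TABLE verification (
    ver_id           TEXT PRIMARY KEY,
    req_id           TEXT NOT NULL REFERENCES requirement(req_id),
    method           TEXT CHECK (method IN ('Test','Analysis','Inspection','Demonstration')),
    success_criteria TEXT,
    status           TEXT DEFAULT 'Open'               -- compliance status
);
""")
conn.execute("INSERT INTO requirement VALUES ('R-001', 'Example requirement text', NULL)")
conn.execute("INSERT INTO verification VALUES ('V-001', 'R-001', 'Test', 'Example criteria', 'Open')")

# Example tailored report: requirements whose verifications are still open.
for row in conn.execute("""SELECT r.req_id, v.ver_id, v.status
                           FROM requirement r JOIN verification v USING (req_id)
                           WHERE v.status != 'Closed'"""):
    print(row)
```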

  4. System analysis tools for an ELT at ESO

    NASA Astrophysics Data System (ADS)

    Mueller, Michael; Koch, Franz

    2006-06-01

    Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a certain component of the telescope, like the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper shows several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses: Using modal results of a finite element analysis, the SMI-toolbox allows an easy generation of structural models with different sizes and levels of accuracy for the control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition to this, a sparse state space model object was developed for Matlab to gain computational efficiency and reduce memory requirements due to the sparsity pattern of both the structural models and the control architecture. As a result, these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS; it performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.

  5. Aeroelastic Optimization Study Based on the X-56A Model

    NASA Technical Reports Server (NTRS)

    Li, Wesley W.; Pak, Chan-Gi

    2014-01-01

    One way to increase the aircraft fuel efficiency is to reduce structural weight while maintaining adequate structural airworthiness, both statically and aeroelastically. A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. This paper presents two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements including strength, buckling, and flutter. Such an approach exploits the anisotropic capabilities of the fiber composite materials chosen for this analytical exercise with ply stacking sequence. A hybrid and discretization optimization approach improves accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass balancing optimization study for the fabricated flexible wing of the X-56A model since a desired flutter speed band is required for the active flutter suppression demonstration during flight testing. The results of the second study provide guidance to modify the wing design and move the design flutter speeds back into the flight envelope so that the original objective of X-56A flight test can be accomplished successfully. The second case also demonstrates that the object-oriented MDAO tool can handle multiple analytical configurations in a single optimization run.

  6. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
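
    A minimal example of what automated feature extraction can mean in practice is thresholding a derived field such as vorticity to flag candidate vortex cores. The sketch below uses a synthetic 2D velocity field and is a generic illustration, not the detection method of this work.

```python
# Simple illustration of automated feature extraction on a 2D velocity field:
# flag candidate vortex regions where the vorticity magnitude is large.
# Generic sketch only; not the cited feature-detection approach.
import numpy as np

def vorticity_2d(u, v, dx, dy):
    """z-component of vorticity, dv/dx - du/dy, on a uniform grid."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    return dvdx - dudy

# Synthetic localized swirl centred in a unit square.
n = 64
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
g = np.exp(-((x - 0.5) ** 2 + (y - 0.5) ** 2) / 0.02)   # Gaussian envelope
u, v = -(y - 0.5) * g, (x - 0.5) * g

vort = vorticity_2d(u, v, dx=1.0 / (n - 1), dy=1.0 / (n - 1))
mask = np.abs(vort) > 0.5 * np.abs(vort).max()          # simple threshold criterion
print(f"{mask.sum()} of {mask.size} cells flagged as vortex core candidates")
```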

  7. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2004-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, recirculation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.

  8. Improving smoothing efficiency of rigid conformal polishing tool using time-dependent smoothing evaluation model

    NASA Astrophysics Data System (ADS)

    Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng

    2017-06-01

    A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency is poorer than expected, and existing smoothing models cannot explicitly specify the methods to improve this efficiency. We presented an explicit time-dependent smoothing evaluation model that contained specific smoothing parameters directly derived from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we proposed a strategy to improve the RC-lap smoothing efficiency, which incorporated the theoretical model, tool optimization, and efficiency limit determination. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 RPM were obtained.
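
    The Preston equation states that the local material removal rate is proportional to the local pressure times the relative tool speed. For a rigid lap, pressure deviations scale roughly with the local mid-spatial-frequency error, which leads to an exponential decay of the error amplitude with polishing time. The sketch below illustrates that behavior with made-up constants; it is not the paper's parametric smoothing model.

```python
# Toy illustration of Preston-equation-based smoothing: the MSF error amplitude
# decays with polishing time. All constants are illustrative assumptions.
import numpy as np

K_PRESTON = 1.0e-12   # removal coefficient (illustrative value)
STIFFNESS = 1.0e9     # effective contact stiffness mapping error to pressure, Pa/m

def smooth_amplitude(a0, tool_speed_mps, t_seconds):
    """MSF error amplitude after polishing for t seconds (exponential decay)."""
    rate = K_PRESTON * STIFFNESS * tool_speed_mps     # decay rate, 1/s
    return a0 * np.exp(-rate * np.asarray(t_seconds))

times = np.array([0.0, 60.0, 300.0, 600.0])
amps = smooth_amplitude(50e-9, tool_speed_mps=1.0, t_seconds=times)
for t, a in zip(times, amps):
    print(f"t = {t:5.0f} s  amplitude = {a * 1e9:5.1f} nm")
```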

  9. Two-dimensional gap analysis: a tool for efficient conservation planning and biodiversity policy implementation.

    PubMed

    Angelstam, Per; Mikusiński, Grzegorz; Rönnbäck, Britt-Inger; Ostman, Anders; Lazdinis, Marius; Roberge, Jean-Michel; Arnberg, Wolter; Olsson, Jan

    2003-12-01

    The maintenance of biodiversity by securing representative and well-connected habitat networks in managed landscapes requires a wise combination of protection, management, and restoration of habitats at several scales. We suggest that the integration of natural and social sciences in the form of "Two-dimensional gap analysis" is an efficient tool for the implementation of biodiversity policies. The tool links biologically relevant "horizontal" ecological issues with "vertical" issues related to institutions and other societal issues. Using forest biodiversity as an example, we illustrate how one can combine ecological and institutional aspects of biodiversity conservation, thus facilitating environmentally sustainable regional development. In particular, we use regional gap analysis for the identification of focal forest types and habitat modelling for ascertaining the functional connectivity of "green infrastructures" as tools for the horizontal gap analysis. For the vertical dimension, we suggest how the social sciences can be used for assessing the success of the implementation of biodiversity policies in real landscapes by identifying institutional obstacles while implementing policies. We argue that this interdisciplinary approach could be applied in a whole range of other environments, including other terrestrial biota and aquatic ecosystems, where functional habitat connectivity, nonlinear response to habitat loss and a multitude of economic and social interests co-occur in the same landscape.

  10. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    PubMed

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  11. Development and Implementation of Team-Based Panel Management Tools: Filling the Gap between Patient and Population Information Systems.

    PubMed

    Watts, Brook; Lawrence, Renée H; Drawz, Paul; Carter, Cameron; Shumaker, Amy Hirsch; Kern, Elizabeth F

    2016-08-01

    Effective team-based models of care, such as the Patient-Centered Medical Home, require electronic tools to support proactive population management strategies that emphasize care coordination and quality improvement. Despite the spread of electronic health records (EHRs) and vendors marketing population health tools, clinical practices still may lack the ability to have: (1) local control over types of data collected/reports generated, (2) timely data (eg, up-to-date data, not several months old), and accordingly (3) the ability to efficiently monitor and improve patient outcomes. This article describes a quality improvement project at the hospital system level to develop and implement a flexible panel management (PM) tool to improve care of subpopulations of patients (eg, panels of patients with diabetes) by clinical teams. An in-depth case analysis approach is used to explore barriers and facilitators in building a PM registry tool for team-based management needs using standard data elements (eg, laboratory values, pharmacy records) found in EHRs. Also described are factors that may contribute to sustainability; to date the tool has been adapted to 6 disease-focused subpopulations encompassing more than 200,000 patients. Two key lessons emerged from this initiative: (1) though challenging, team-based clinical end users and information technology needed to work together consistently to refine the product, and (2) locally developed population management tools can provide efficient data tracking for frontline clinical teams and leadership. The preliminary work identified critical gaps that were successfully addressed by building local PM registry tools from EHR-derived data and offers lessons learned for others engaged in similar work. (Population Health Management 2016;19:232-239).

  12. Laboratory automation of high-quality and efficient ligand-binding assays for biotherapeutic drug development.

    PubMed

    Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean

    2013-07-01

    Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation of total walk-away and flexible modular modes. We shared the sustaining experience of vendor collaboration and team work to educate, promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CRO in a regulated bioanalytical laboratory environment.

  13. Expert System for Building TRU Waste Payloads - 13554

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruemmer, Heather; Slater, Bryant

    2013-07-01

    Grouping TRU waste drums into payloads for shipment to the Waste Isolation Pilot Plant (WIPP) for disposal is a very complex process. Transportation and regulatory requirements must be met, along with striving for the goals of shipment efficiency: maximize the number of waste drums in a shipment and minimize the use of empty drums, which take up precious underground storage space. The restrictions on payloads range from weight restrictions, to limitations on flammable gas in the headspace, to minimum TRU alpha activity concentration requirements. The Overpack and Payload Assistant Tool (OPAT) has been developed as a mixed-initiative intelligent system within the WIPP Waste Data System (WDS) to guide the construction of multiple acceptable payloads. OPAT saves the user time while at the same time maximizing the efficiency of shipments for the given drum population. The tool provides the user with the flexibility to tune critical factors that guide OPAT's operation based on real-time feedback concerning the results of the execution. This feedback complements the user's external knowledge of the drum population (such as location of drums, known challenges, internal shipment goals). This work demonstrates how software can be utilized to complement the unique domain knowledge of the users. The mixed-initiative approach combines the insight and intuition of the human expert with the proficiency of automated computational algorithms. The result is the ability to thoroughly and efficiently explore the search space of possible solutions and derive the best waste management decision. (authors)
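
    OPAT itself is a mixed-initiative expert system, but the basic mechanics of constraint-checked payload assembly can be illustrated with a simple greedy packer. The limits, slot count, and first-fit strategy below are illustrative assumptions, not WIPP acceptance criteria or OPAT's algorithm.

```python
# Simplified sketch of constraint-checked payload assembly (illustrative only;
# the limits and greedy strategy are assumptions, not WIPP or OPAT rules).
from dataclasses import dataclass, field

@dataclass
class Drum:
    drum_id: str
    weight_kg: float
    flammable_gas_ppm: float
    tru_activity_ci: float

@dataclass
class Payload:
    max_weight_kg: float = 3000.0   # assumed payload weight limit
    max_gas_ppm: float = 500.0      # assumed per-drum flammable-gas limit
    drums: list = field(default_factory=list)

    def can_accept(self, d: Drum) -> bool:
        return (d.flammable_gas_ppm <= self.max_gas_ppm and
                sum(x.weight_kg for x in self.drums) + d.weight_kg <= self.max_weight_kg)

def build_payloads(drums, slots_per_payload=14):
    """Greedy first-fit packing: heaviest drums first, respecting all limits."""
    payloads, rejected = [], []
    for d in sorted(drums, key=lambda d: d.weight_kg, reverse=True):
        for p in payloads:
            if len(p.drums) < slots_per_payload and p.can_accept(d):
                p.drums.append(d)
                break
        else:
            p = Payload()
            if p.can_accept(d):
                p.drums.append(d)
                payloads.append(p)
            else:
                rejected.append(d)   # drum violates a per-drum limit on its own
    return payloads, rejected
```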

  14. Agents for Plan Monitoring and Repair

    DTIC Science & Technology

    2003-04-01

    events requires time and effort. In this paper, we describe how Heracles and Theseus, two information gathering and monitoring tools that we built...on an information agent platform, called Theseus, that provides the technology for efficiently executing agents for information gathering and...we can easily define a system for interactively planning a trip. The second is the Theseus information agent platform [Barish et al., 2000], which

  15. Skills Needed to Survive and Thrive as a Scholar in the 21st Century: Information, Knowledge, and Publication Management

    ERIC Educational Resources Information Center

    Conceição, Simone C. O.

    2013-01-01

    The changes in the way our work is created, published, and disseminated have implications for our own professional development and require us to be aware of the necessary skills. In this article, I identify three important skills scholars need to have and tools to be effective, efficient, and productive scholars in the 21st century: information…

  16. Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis

    PubMed Central

    2015-01-01

    Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276

  17. Operations management: a tool to increase profitability.

    PubMed

    Mulvehill, M J

    2001-03-01

    Operations management enables the efficient utilization of the production systems in a business. This paper will address several key elements in the business competency of operations management. Specifically, this discussion will review the components of a material requirement planning system and a "just-in-time" system for inventory control and time management to enable the dentist to monitor a portion of the practice's overhead costs.

  18. Open | SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades a large number of performance tools has been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open | SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open | SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open | SpeedShop are accessible through multiple fully equivalent interfaces including an easy-to-use GUI as well as an interactive command line interface reducing the usage threshold for those tools.

  19. FDT 2.0: Improving scalability of the fuzzy decision tree induction tool - integrating database storage.

    PubMed

    Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W

    2014-12-01

    Effective machine-learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.

  20. Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning

    NASA Astrophysics Data System (ADS)

    Cui, J.; Dong, B.; Li, J.; Li, L.

    2017-09-01

    As a fundamental work of urban planning, the intensity analysis of construction land involves many repetitive data processing tasks that are prone to errors or loss of data precision, and current urban planning lacks efficient methods and tools for visualizing the analysis results. In this research, a portable tool is developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for this work. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data is imported, the analysis results and related maps and graphs, including the intensity values and zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be distributed quickly between planning teams.

  1. EasyModeller: A graphical interface to MODELLER

    PubMed Central

    2010-01-01

    Background MODELLER is a program for automated protein homology modeling. It is one of the most widely used tools for homology or comparative modeling of protein three-dimensional structures, but most users find it somewhat difficult to start with MODELLER, as it is command-line based and requires knowledge of basic Python scripting to use efficiently. Findings The study was designed with the aim of developing the "EasyModeller" tool as a frontend graphical interface to MODELLER using Perl/Tk, which can be used as a standalone tool on the Windows platform with MODELLER and Python preinstalled. It helps inexperienced users to perform modeling, assessment, visualization, and optimization of protein models in a simple and straightforward way. Conclusion EasyModeller provides a straightforward graphical interface and functions as a stand-alone tool that can be used on a standard personal computer with Microsoft Windows as the operating system. PMID:20712861

  2. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    PubMed

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  3. Laser processes and system technology for the production of high-efficient crystalline solar cells

    NASA Astrophysics Data System (ADS)

    Mayerhofer, R.; Hendel, R.; Zhu, Wenjie; Geiger, S.

    2012-10-01

    The laser as an industrial tool is an essential part of today's solar cell production. Due to the on-going efforts in the solar industry, to increase the cell efficiency, more and more laser-based processes, which have been discussed and tested at lab-scale for many years, are now being implemented in mass production lines. In order to cope with throughput requirements, standard laser concepts have to be improved continuously with respect to available average power levels, repetition rates or beam profile. Some of the laser concepts, that showed high potential in the past couple of years, will be substituted by other, more economic laser types. Furthermore, requirements for processing with less-heat affected zones fuel the development of industry-ready ultra short pulsed lasers with pulse widths even below the picosecond range. In 2011, the German Ministry of Education and Research (BMBF) had launched the program "PV-Innovation Alliance", with the aim to support the rapid transfer of high-efficiency processes out of development departments and research institutes into solar cell production lines. Here, lasers play an important role as production tools, allowing the fast implementation of high-performance solar cell concepts. We will report on the results achieved within the joint project FUTUREFAB, where efficiency optimization, throughput enhancement and cost reduction are the main goals. Here, the presentation will focus on laser processes like selective emitter doping and ablation of dielectric layers. An indispensable part of the efforts towards cost reduction in solar cell production is the improvement of wafer handling and throughput capabilities of the laser processing system. Therefore, the presentation will also elaborate on new developments in the design of complete production machines.

  4. Progress Toward an Efficient and General CFD Tool for Propulsion Design/Analysis

    NASA Technical Reports Server (NTRS)

    Cox, C. F.; Cinnella, P.; Westmoreland, S.

    1996-01-01

    The simulation of propulsive flows inherently involves chemical activity. Recent years have seen substantial strides made in the development of numerical schemes for reacting flowfields, in particular those involving finite-rate chemistry. However, finite-rate calculations are computationally intensive and require knowledge of the actual kinetics, which are not always known with sufficient accuracy. Alternatively, flow simulations based on the assumption of local chemical equilibrium are capable of obtaining physically reasonable results at far less computational cost. The present study summarizes the development of efficient numerical techniques for the simulation of flows in local chemical equilibrium, whereby a 'Black Box' chemical equilibrium solver is coupled to the usual gasdynamic equations. The generalization of the methods enables the modelling of any arbitrary mixture of thermally perfect gases, including air, combustion mixtures and plasmas. As demonstration of the potential of the methodologies, several solutions, involving reacting and perfect gas flows, will be presented. Included is a preliminary simulation of the SSME startup transient. Future enhancements to the proposed techniques will be discussed, including more efficient finite-rate and hybrid (partial equilibrium) schemes. The algorithms that have been developed and are being optimized provide for an efficient and general tool for the design and analysis of propulsion systems.

  5. Efficient Use of Distributed Systems for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Valerie; Chen, Jian; Canfield, Thomas; Richard, Jacques

    2000-01-01

    Distributed computing has been regarded as the future of high performance computing. Nationwide high speed networks such as vBNS are becoming widely available to interconnect high-speed computers, virtual environments, scientific instruments and large data sets. One of the major issues to be addressed with distributed systems is the development of computational tools that facilitate the efficient execution of parallel applications on such systems. These tools must exploit the heterogeneous resources (networks and compute nodes) in distributed systems. This paper presents a tool, called PART, which addresses this issue for mesh partitioning. PART takes advantage of the following heterogeneous system features: (1) processor speed; (2) number of processors; (3) local network performance; and (4) wide area network performance. Further, different finite element applications under consideration may have different computational complexities, different communication patterns, and different element types, which also must be taken into consideration when partitioning. PART uses parallel simulated annealing to partition the domain, taking into consideration network and processor heterogeneity. The results of using PART for an explicit finite element application executing on two IBM SPs (located at Argonne National Laboratory and the San Diego Supercomputer Center) indicate an increase in efficiency by up to 36% as compared to METIS, a widely used mesh partitioning tool. The input to METIS was modified to take into consideration heterogeneous processor performance; METIS does not take into consideration heterogeneous networks. The execution times for these applications were reduced by up to 30% as compared to METIS. These results are given in Figure 1 for four irregular meshes, with element counts ranging from 11,451 for the Barth4 mesh to 30,269 for the Barth5 mesh. Future work with PART entails using the tool with an integrated application requiring distributed systems. In particular, this application, illustrated in the document, entails an integration of finite element and fluid dynamics simulations to address the cooling of the turbine blades of a gas turbine engine design. It is not uncommon to encounter high-temperature, film-cooled turbine airfoils with 1,000,000s of degrees of freedom. This results from the complexity of the various components of the airfoils, which require fine-grain meshing for accuracy. Additional information is contained in the original.
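
    The heterogeneous-aware partitioning idea can be sketched as simulated annealing over element-to-processor assignments, where the cost combines cut edges with a load-imbalance term scaled by processor speed. The cost function and cooling schedule below are simplified placeholders, not PART's actual formulation.

```python
# Conceptual sketch of simulated-annealing partitioning with heterogeneous
# processor speeds (simplified placeholder cost and schedule; not PART itself).
import math
import random

def partition_cost(assign, edges, proc_speed, alpha=1.0):
    """Cut edges plus an imbalance penalty weighted by processor speed."""
    cut = sum(1 for a, b in edges if assign[a] != assign[b])
    load = [0.0] * len(proc_speed)
    for p in assign:
        load[p] += 1.0
    scaled = [l / s for l, s in zip(load, proc_speed)]   # slower CPUs should hold fewer elements
    return cut + alpha * (max(scaled) - min(scaled))

def anneal(n_elems, edges, proc_speed, steps=20000, t0=5.0):
    assign = [random.randrange(len(proc_speed)) for _ in range(n_elems)]
    cost = partition_cost(assign, edges, proc_speed)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-6             # linear cooling schedule
        elem, new_p = random.randrange(n_elems), random.randrange(len(proc_speed))
        old_p = assign[elem]
        assign[elem] = new_p
        new_cost = partition_cost(assign, edges, proc_speed)
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost                              # accept the move
        else:
            assign[elem] = old_p                         # reject and revert
    return assign, cost
```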

  6. Evaluation of plant biomass resources available for replacement of fossil oil

    PubMed Central

    Henry, Robert J

    2010-01-01

    The potential of plants to replace fossil oil was evaluated by considering the scale of production required, the area of land needed and the types of plants available. High yielding crops (50 tonnes/ha) that have a high conversion efficiency (75%) would require a global land footprint of around 100 million ha to replace current (2008) oil consumption. Lower yielding or less convertible plants would require a larger land footprint. Domestication of new species as dedicated energy crops may be necessary. A systematic analysis of higher plants and their current and potential uses is presented. Plant biotechnology provides tools to improve the prospects of replacing oil with plant-derived biomass by increasing the amount of biomass produced per unit area of land and improving the composition of the biomass to increase the efficiency of conversion to biofuel and biomaterials. Options for the production of high value coproducts and the expression of processing aids such as enzymes in the plant may add further value to plants as bioenergy resources. PMID:20070873
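
    The land-footprint figure quoted above can be reproduced with simple arithmetic. In the sketch below, the yield and conversion efficiency come from the abstract, the conversion efficiency is read as tonnes of oil replaced per tonne of biomass, and the roughly 4 billion tonne figure for 2008 world oil consumption is an approximate assumption added here.

```python
# Worked version of the land-footprint estimate quoted in the abstract.
YIELD_T_PER_HA = 50.0     # biomass yield, tonnes per hectare per year (from the abstract)
CONVERSION = 0.75         # conversion efficiency (from the abstract; read as t oil per t biomass)
OIL_DEMAND_T = 4.0e9      # approximate world oil use in 2008, tonnes/year (assumption)

oil_equivalent_per_ha = YIELD_T_PER_HA * CONVERSION      # ~37.5 t/ha
land_needed_ha = OIL_DEMAND_T / oil_equivalent_per_ha
print(f"~{land_needed_ha / 1e6:.0f} million hectares")   # roughly 100 million ha
```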

  7. AZTEC. Parallel Iterative method Software for Solving Linear Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hutchinson, S.; Shadid, J.; Tuminaro, R.

    1995-07-01

    AZTEC is an iterative-solver library that greatly simplifies the parallelization process when solving linear systems of equations Ax = b, where A is a user-supplied n x n sparse matrix, b is a user-supplied vector of length n, and x is a vector of length n to be computed. AZTEC is intended as a software tool for users who want to avoid cumbersome parallel programming details but who have large sparse linear systems which require an efficiently utilized parallel processing system. A collection of data transformation tools is provided that allows for easy creation of distributed sparse unstructured matrices for parallel solution.
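
    A distributed-memory solver such as AZTEC is beyond a short sketch, but the problem it targets, the iterative Krylov solution of a large sparse system Ax = b, can be shown in a few lines with SciPy (this is conceptual and does not use AZTEC's API or its distributed data structures).

```python
# Conceptual counterpart: iterative (conjugate gradient) solution of a large
# sparse system Ax = b with SciPy. Not AZTEC's API or parallel implementation.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

n = 1000
# 1D Poisson-like sparse matrix: 2 on the diagonal, -1 on the off-diagonals.
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x, info = cg(A, b)                       # info == 0 means the solver converged
print(info, np.linalg.norm(A @ x - b))   # residual norm as a sanity check
```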

  8. User Instructions for the Policy Analysis Modeling System (PAMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.

    PAMS uses country-specific and product-specific data to calculate estimates of impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet, and requires no links to external data, or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.

  9. Gemi: PCR Primers Prediction from Multiple Alignments

    PubMed Central

    Sobhy, Haitham; Colson, Philippe

    2012-01-01

    Designing primers and probes for polymerase chain reaction (PCR) is a preliminary and critical step that requires the identification of highly conserved regions in a given set of sequences. This task can be challenging if the targeted sequences display a high level of diversity, as frequently encountered in microbiologic studies. We developed Gemi, an automated, fast, and easy-to-use bioinformatics tool with a user-friendly interface to design primers and probes based on multiple aligned sequences. This tool can be used for the purpose of real-time and conventional PCR and can deal efficiently with large sets of sequences of a large size. PMID:23316117

  10. Two Simple and Efficient Algorithms to Compute the SP-Score Objective Function of a Multiple Sequence Alignment.

    PubMed

    Ranwez, Vincent

    2016-01-01

    Multiple sequence alignment (MSA) is a crucial step in many molecular analyses and many MSA tools have been developed. Most of them use a greedy approach to construct a first alignment that is then refined by optimizing the sum-of-pairs score (SP-score). The SP-score estimation is thus a bottleneck for most MSA tools since it is repeatedly required and is time consuming. Given an alignment of n sequences and L sites, I introduce here optimized solutions that reach O(nL) time complexity for affine gap costs, instead of O(n²L), and that are easy to implement.
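
    The sketch below conveys the core idea behind reaching O(nL): aggregate per-column symbol counts instead of looping over all O(n²) sequence pairs. It covers only a simple match/mismatch/linear-gap scheme, not the affine gap costs handled in the paper, and the scoring values are placeholders.

```python
# SP-score of an alignment in O(nL) via per-column counts (simple scoring only).
from collections import Counter

def sp_score(alignment, match=1, mismatch=-1, gap=-2):
    total = 0
    for j in range(len(alignment[0])):
        counts = Counter(seq[j] for seq in alignment)
        gaps = counts.pop("-", 0)
        residues = sum(counts.values())
        same = sum(c * (c - 1) // 2 for c in counts.values())   # identical residue pairs
        diff = residues * (residues - 1) // 2 - same            # mismatching residue pairs
        total += match * same + mismatch * diff + gap * residues * gaps
        # gap/gap pairs contribute 0 under this scheme
    return total

aln = ["ACG-T", "AC--T", "GCGAT"]
print(sp_score(aln))
```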

  11. Why Lean doesn't work for everyone.

    PubMed

    Kaplan, Gary S; Patterson, Sarah H; Ching, Joan M; Blackmore, C Craig

    2014-12-01

    Popularisation of Lean in healthcare has led to emphasis on Lean quality improvement tools in isolation, with inconsistent results. We argue that delivery of safer, more efficient, and higher-quality patient-focused care requires organisational transformation, of which the Lean toolkit is only one component. To successfully facilitate system transformation toward higher quality care at lower cost, Lean tools must be part of a comprehensive management system, within a supportive institutional culture, and with committed leadership. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  12. NEMOTAM: tangent and adjoint models for the ocean modelling platform NEMO

    NASA Astrophysics Data System (ADS)

    Vidard, A.; Bouttier, P.-A.; Vigilant, F.

    2015-04-01

    Tangent linear and adjoint models (TAMs) are efficient tools to analyse and to control dynamical systems such as NEMO. They can be involved in a large range of applications such as sensitivity analysis, parameter estimation or the computation of characteristic vectors. A TAM is also required by the 4D-Var algorithm, which is one of the major methods in data assimilation. This paper describes the development and the validation of the tangent linear and adjoint model for the NEMO ocean modelling platform (NEMOTAM). The diagnostic tools that are available alongside NEMOTAM are detailed and discussed, and several applications are also presented.

  13. NEMOTAM: tangent and adjoint models for the ocean modelling platform NEMO

    NASA Astrophysics Data System (ADS)

    Vidard, A.; Bouttier, P.-A.; Vigilant, F.

    2014-10-01

    Tangent linear and adjoint models (TAMs) are efficient tools to analyse and to control dynamical systems such as NEMO. They can be involved in a large range of applications such as sensitivity analysis, parameter estimation or the computation of characteristic vectors. A TAM is also required by the 4D-Var algorithm, which is one of the major methods in data assimilation. This paper describes the development and the validation of the tangent linear and adjoint model for the NEMO ocean modelling platform (NEMOTAM). The diagnostic tools that are available alongside NEMOTAM are detailed and discussed, and several applications are also presented.

  14. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  15. Methods to estimate irrigated reference crop evapotranspiration - a review.

    PubMed

    Kumar, R; Jat, M K; Shankar, V

    2012-01-01

    Efficient water management of crops requires accurate irrigation scheduling which, in turn, requires the accurate measurement of crop water requirement. Irrigation is applied to replenish depleted moisture for optimum plant growth. Reference evapotranspiration plays an important role in the determination of water requirements for crops and irrigation scheduling. Various models/approaches, varying from empirical to physically based distributed, are available for the estimation of reference evapotranspiration. Mathematical models are useful tools to estimate the evapotranspiration and water requirement of crops, which is essential information for designing or choosing the best water management practices. In this paper the most commonly used models/approaches, which are suitable for the estimation of daily water requirement for agricultural crops grown in different agro-climatic regions, are reviewed. Further, an effort has been made to compare the accuracy of various widely used methods under different climatic conditions.
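
    For orientation, one of the most widely used reference evapotranspiration formulations is the FAO-56 Penman-Monteith equation, reproduced below in its standard textbook form (not quoted from the review itself); the review compares this kind of combination method against simpler empirical approaches.

```latex
% FAO-56 Penman-Monteith reference evapotranspiration (mm day^-1), standard form:
% R_n net radiation, G soil heat flux (MJ m^-2 day^-1), T mean daily air temperature (degC),
% u_2 wind speed at 2 m (m s^-1), e_s - e_a vapour pressure deficit (kPa),
% \Delta slope of the saturation vapour pressure curve, \gamma psychrometric constant (kPa degC^-1).
\begin{equation}
ET_0 \;=\; \frac{0.408\,\Delta\,(R_n - G) \;+\; \gamma\,\dfrac{900}{T + 273}\,u_2\,(e_s - e_a)}
                {\Delta \;+\; \gamma\,\left(1 + 0.34\,u_2\right)}
\end{equation}
```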

  16. Highly efficient gene inactivation by adenoviral CRISPR/Cas9 in human primary cells

    PubMed Central

    Tielen, Frans; Elstak, Edo; Benschop, Julian; Grimbergen, Max; Stallen, Jan; Janssen, Richard; van Marle, Andre; Essrich, Christian

    2017-01-01

    Phenotypic assays using human primary cells are highly valuable tools for target discovery and validation in drug discovery. Expression knockdown (KD) of such targets in these assays allows the investigation of their role in models of disease processes. Therefore, efficient and fast modes of protein KD in phenotypic assays are required. The CRISPR/Cas9 system has been shown to be a versatile and efficient means of gene inactivation in immortalized cell lines. Here we describe the use of adenoviral (AdV) CRISPR/Cas9 vectors for efficient gene inactivation in two human primary cell types, normal human lung fibroblasts and human bronchial epithelial cells. The effects of gene inactivation were studied in the TGF-β-induced fibroblast to myofibroblast transition assay (FMT) and the epithelial to mesenchymal transition assay (EMT), which are SMAD3 dependent and reflect pathogenic mechanisms observed in fibrosis. Co-transduction (co-TD) of AdV Cas9 with SMAD3-targeting guide RNAs (gRNAs) resulted in fast and efficient genome editing judged by insertion/deletion (indel) formation, as well as significant reduction of SMAD3 protein expression and nuclear translocation. This led to phenotypic changes downstream of SMAD3 inhibition, including substantially decreased alpha smooth muscle actin and fibronectin 1 expression, which are markers for FMT and EMT, respectively. A direct comparison between co-TD of separate Cas9 and gRNA AdV, versus TD with a single “all-in-one” Cas9/gRNA AdV, revealed that both methods achieve similar levels of indel formation. These data demonstrate that AdV CRISPR/Cas9 is a useful and efficient tool for protein KD in human primary cell phenotypic assays. The use of AdV CRISPR/Cas9 may offer significant advantages over the current existing tools and should enhance target discovery and validation opportunities. PMID:28800587

  17. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single run data sets increasing into 10's, 100's and even 1000's of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather, reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file IO. The file IO operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel IO capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation and a few minutes for relatively small jobs completed on modern HPC systems such as ORNL's Jaguar.
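
    As a small serial illustration of the kind of reduction ParCAT parallelizes, the sketch below computes a temporal mean and variance of one variable in a NetCDF file. The file name and variable name are placeholders, and the snippet uses the serial netCDF4 package rather than ParCAT's parallel-netcdf backend.

```python
# Temporal mean/variance of a (time, lat, lon) variable from a NetCDF file.
import numpy as np
from netCDF4 import Dataset

with Dataset("model_run_output.nc") as ds:               # placeholder file name
    data = ds.variables["surface_temperature"][:]         # placeholder variable name

time_mean = data.mean(axis=0)   # map of the mean over the time dimension
time_var = data.var(axis=0)     # map of the variance over the time dimension
print(time_mean.shape, float(np.max(time_mean)))
```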

  18. Atropos: specific, sensitive, and speedy trimming of sequencing reads.

    PubMed

    Didion, John P; Martin, Marcel; Collins, Francis S

    2017-01-01

    A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos make it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos.

  19. Atropos: specific, sensitive, and speedy trimming of sequencing reads

    PubMed Central

    Collins, Francis S.

    2017-01-01

    A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos make it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos. PMID:28875074

  20. Proteinortho: detection of (co-)orthologs in large-scale analysis.

    PubMed

    Lechner, Marcus; Findeiss, Sven; Steiner, Lydia; Marz, Manja; Stadler, Peter F; Prohaska, Sonja J

    2011-04-28

    Orthology analysis is an important part of data analysis in many areas of bioinformatics such as comparative genomics and molecular phylogenetics. The ever-increasing flood of sequence data, and hence the rapidly increasing number of genomes that can be compared simultaneously, calls for efficient software tools as brute-force approaches with quadratic memory requirements become infeasible in practice. The rapid pace at which new data become available, furthermore, makes it desirable to compute genome-wide orthology relations for a given dataset rather than relying on relations listed in databases. The program Proteinortho described here is a stand-alone tool that is geared towards large datasets and makes use of distributed computing techniques when run on multi-core hardware. It implements an extended version of the reciprocal best alignment heuristic. We apply Proteinortho to compute orthologous proteins in the complete set of all 717 eubacterial genomes available at NCBI at the beginning of 2009. We identified thirty proteins present in 99% of all bacterial proteomes. Proteinortho significantly reduces the required amount of memory for orthology analysis compared to existing tools, allowing such computations to be performed on off-the-shelf hardware.
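
    The sketch below is a toy version of the reciprocal-best-hit criterion that Proteinortho's extended heuristic builds on: two proteins are called putative (co-)orthologs when each is the other's best alignment hit. The hit tables are invented stand-ins for real BLAST results.

```python
# Reciprocal best hits between two genomes from precomputed best-hit tables (toy data).
def reciprocal_best_hits(best_a_to_b, best_b_to_a):
    """Return pairs (a, b) where a's best hit is b and b's best hit is a."""
    return [(a, b) for a, b in best_a_to_b.items() if best_b_to_a.get(b) == a]

best_a_to_b = {"A1": "B7", "A2": "B3", "A3": "B9"}   # gene in genome A -> best hit in genome B
best_b_to_a = {"B7": "A1", "B3": "A5", "B9": "A3"}   # gene in genome B -> best hit in genome A
print(reciprocal_best_hits(best_a_to_b, best_b_to_a))  # [('A1', 'B7'), ('A3', 'B9')]
```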

  1. Advanced Tools for River Science: EAARL and MD_SWMS: Chapter 3

    USGS Publications Warehouse

    Kinzel, Paul J.

    2009-01-01

    Disruption of flow regimes and sediment supplies, induced by anthropogenic or climatic factors, can produce dramatic alterations in river form, vegetation patterns, and associated habitat conditions. To improve habitat in these fluvial systems, resource managers may choose from a variety of treatments including flow and/or sediment prescriptions, vegetation management, or engineered approaches. Monitoring protocols developed to assess the morphologic response of these treatments require techniques that can measure topographic changes above and below the water surface efficiently, accurately, and in a standardized, cost-effective manner. Similarly, modeling of flow, sediment transport, habitat, and channel evolution requires characterization of river morphology for model input and verification. Recent developments by the U.S. Geological Survey with regard to both remotely sensed methods (the Experimental Advanced Airborne Research LiDAR; EAARL) and computational modeling software (the Multi-Dimensional Surface-Water Modeling System; MD_SWMS) have produced advanced tools for spatially explicit monitoring and modeling in aquatic environments. In this paper, we present a pilot study conducted along the Platte River, Nebraska, that demonstrates the combined use of these river science tools.

  2. Individual and social learning processes involved in the acquisition and generalization of tool use in macaques

    PubMed Central

    Macellini, S.; Maranesi, M.; Bonini, L.; Simone, L.; Rozzi, S.; Ferrari, P. F.; Fogassi, L.

    2012-01-01

    Macaques can efficiently use several tools, but their capacity to discriminate the relevant physical features of a tool and the social factors contributing to their acquisition are still poorly explored. In a series of studies, we investigated macaques' ability to generalize the use of a stick as a tool to new objects having different physical features (study 1), or to new contexts, requiring them to adapt the previously learned motor strategy (study 2). We then assessed whether the observation of a skilled model might facilitate tool-use learning by naive observer monkeys (study 3). Results of study 1 and study 2 showed that monkeys trained to use a tool generalize this ability to tools of different shape and length, and learn to adapt their motor strategy to a new task. Study 3 demonstrated that observing a skilled model increases the observers' manipulations of a stick, thus facilitating the individual discovery of the relevant properties of this object as a tool. These findings support the view that in macaques, the motor system can be modified through tool use and that it has a limited capacity to adjust the learnt motor skills to a new context. Social factors, although important to facilitate the interaction with tools, are not crucial for tool-use learning. PMID:22106424

  3. ORAC: 21st Century Observing at UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, A.; Wright, G. S.; Tan, M.; Pickup, D. A.; Economou, F.; Currie, M. J.; Adamson, A. J.; Rees, N. P.; Purves, M. H.

    The Observatory Reduction and Acquisition Control system replaces all of the existing software which interacts with the observers at UKIRT. The aim is to improve observing efficiency with a set of integrated tools that take the user from pre-observing preparation, through the acquisition of observations to the reduction using a data-driven pipeline. ORAC is designed to be flexible and extensible, and is intended for use with all future UKIRT instruments, as well as existing telescope hardware and ``legacy'' instruments. It is also designed to allow integration with phase-1 and queue-scheduled observing tools in anticipation of possible future requirements. A brief overview of the project and its relationship to other systems is given. ORAC also re-uses much code from other systems and we discuss issues relating to the trade-off between reuse and the generation of new software specific to our requirements.

  4. Quality tracing in meat supply chains

    PubMed Central

    Mack, Miriam; Dittmer, Patrick; Veigt, Marius; Kus, Mehmet; Nehmiz, Ulfert; Kreyenschmidt, Judith

    2014-01-01

    The aim of this study was the development of a quality tracing model for vacuum-packed lamb that is applicable in different meat supply chains. Based on the development of relevant sensory parameters, the predictive model was developed by combining a linear primary model and the Arrhenius model as the secondary model. Then a process analysis was conducted to define general requirements for the implementation of the temperature-based model into a meat supply chain. The required hardware and software for continuous temperature monitoring were developed in order to use the model under practical conditions. Furthermore, a decision support tool was developed so that the model, in combination with the temperature monitoring equipment, can be used as an effective tool to improve quality and storage management within the meat logistics network. Over the long term, this overall procedure will support the reduction of food waste and will improve the resource efficiency of food production. PMID:24797136

  5. Quality tracing in meat supply chains.

    PubMed

    Mack, Miriam; Dittmer, Patrick; Veigt, Marius; Kus, Mehmet; Nehmiz, Ulfert; Kreyenschmidt, Judith

    2014-06-13

    The aim of this study was the development of a quality tracing model for vacuum-packed lamb that is applicable in different meat supply chains. Based on the development of relevant sensory parameters, the predictive model was developed by combining a linear primary model and the Arrhenius model as the secondary model. Then a process analysis was conducted to define general requirements for the implementation of the temperature-based model into a meat supply chain. The required hardware and software for continuous temperature monitoring were developed in order to use the model under practical conditions. Furthermore, a decision support tool was developed so that the model, in combination with the temperature monitoring equipment, can be used as an effective tool to improve quality and storage management within the meat logistics network. Over the long term, this overall procedure will support the reduction of food waste and will improve the resource efficiency of food production.
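
    The following sketch shows how a linear primary model can be combined with an Arrhenius secondary model to track a quality index along a recorded temperature history, as the abstract describes. All numerical parameters (initial quality, pre-exponential factor, activation energy, temperature history) are invented for illustration and are not the calibrated values from the study.

```python
# Linear primary model with Arrhenius temperature dependence (illustrative parameters only).
import math

R = 8.314  # universal gas constant, J mol^-1 K^-1

def spoilage_rate(temp_celsius, A=4.0e11, Ea=70_000.0):
    """Arrhenius secondary model: quality loss per hour at a given temperature."""
    return A * math.exp(-Ea / (R * (temp_celsius + 273.15)))

def quality_index(temps_celsius, Q0=100.0):
    """Linear primary model: quality decreases by the hourly rate at each recorded temperature."""
    return Q0 - sum(spoilage_rate(T) for T in temps_celsius)

history = [2.0] * 48 + [7.0] * 24        # 48 h at 2 degC followed by 24 h at 7 degC
print(round(quality_index(history), 2))  # remaining quality after 72 h of storage
```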

  6. Fast algorithms for Quadrature by Expansion I: Globally valid expansions

    NASA Astrophysics Data System (ADS)

    Rachh, Manas; Klöckner, Andreas; O'Neil, Michael

    2017-09-01

    The use of integral equation methods for the efficient numerical solution of PDE boundary value problems requires two main tools: quadrature rules for the evaluation of layer potential integral operators with singular kernels, and fast algorithms for solving the resulting dense linear systems. Classically, these tools were developed separately. In this work, we present a unified numerical scheme based on coupling Quadrature by Expansion, a recent quadrature method, to a customized Fast Multipole Method (FMM) for the Helmholtz equation in two dimensions. The method allows the evaluation of layer potentials in linear-time complexity, anywhere in space, with a uniform, user-chosen level of accuracy as a black-box computational method. Providing this capability requires geometric and algorithmic considerations beyond the needs of standard FMMs as well as careful consideration of the accuracy of multipole translations. We illustrate the speed and accuracy of our method with various numerical examples.

  7. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    NASA Astrophysics Data System (ADS)

    Kyriacou, S.; Kontoleontos, E.; Weissenberger, S.; Mangani, L.; Casartelli, E.; Skouteropoulou, I.; Gattringer, M.; Gehrer, A.; Buchmayr, M.

    2014-03-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA(PCA)) that can be used in both single- (SOO) and multi-objective optimization (MOO) problems. In MAEAs, low-cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem-specific evaluation, here the solution of the Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, provides a considerable gain in computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on B-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
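
    A minimal sketch of the metamodel-assisted idea follows: a cheap surrogate screens the offspring of each generation, and only the most promising candidates receive the expensive evaluation (here a simple analytic stand-in, not a Navier-Stokes solve). The surrogate, operators and parameters are invented for illustration and are not those of the EASY software.

```python
# Metamodel-assisted evolutionary loop: surrogate screening before exact evaluation.
import random

def expensive_eval(x):                      # stand-in for a CFD evaluation
    return sum((xi - 0.3) ** 2 for xi in x)

def surrogate(x, archive):                  # crude nearest-neighbour prediction from past exact evaluations
    return min(archive, key=lambda rec: sum((a - b) ** 2 for a, b in zip(rec[0], x)))[1]

dim, pop_size = 4, 20
population = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
archive = [(ind, expensive_eval(ind)) for ind in population]      # initial exact evaluations

for generation in range(10):
    offspring = [[min(max(x + random.gauss(0.0, 0.1), 0.0), 1.0)
                  for x in random.choice(population)] for _ in range(3 * pop_size)]
    promising = sorted(offspring, key=lambda o: surrogate(o, archive))[:pop_size // 2]
    archive += [(ind, expensive_eval(ind)) for ind in promising]  # exact evaluation only for the screened few
    population = [rec[0] for rec in sorted(archive, key=lambda r: r[1])[:pop_size]]

print("best objective found:", round(min(r[1] for r in archive), 4))
```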

  8. DyNAVacS: an integrative tool for optimized DNA vaccine design.

    PubMed

    Harish, Nagarajan; Gupta, Rekha; Agarwal, Parul; Scaria, Vinod; Pillai, Beena

    2006-07-01

    DNA vaccines have slowly emerged as keystones in preventive immunology due to their versatility in inducing both cell-mediated as well as humoral immune responses. The design of an efficient DNA vaccine involves choice of a suitable expression vector, ensuring optimal expression by codon optimization, engineering CpG motifs for enhancing immune responses and providing additional sequence signals for efficient translation. DyNAVacS is a web-based tool created for rapid and easy design of DNA vaccines. It follows a step-wise design flow, which guides the user through the various sequential steps in the design of the vaccine. Further, it allows restriction enzyme mapping, design of primers spanning user-specified sequences and provides information regarding the vectors currently used for generation of DNA vaccines. The web version uses the Apache HTTP server. The interface was written in HTML and uses Common Gateway Interface (CGI) scripts written in Perl for functionality. DyNAVacS is an integrated tool consisting of user-friendly programs, which require minimal information from the user. The software is available free of cost, as a web-based application at URL: http://miracle.igib.res.in/dynavac/.

  9. BEST Winery Guidebook: Benchmarking and Energy and Water SavingsTool for the Wine Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galitsky, Christina; Worrell, Ernst; Radspieler, Anthony

    2005-10-15

    Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities is an important barrier to improving efficiency. Benchmarking has been demonstrated to help energy users understand energy use and the potential for energy efficiency improvement, reducing the information barrier. In California, the wine making industry is not only one of the pillars of the economy; it is also a large energy consumer, with a considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed an integrated benchmarking and self-assessment tool for the California wine industry called BEST (Benchmarking and Energy and water Savings Tool) Winery. BEST Winery enables a winery to compare its energy efficiency to a best practice winery, accounting for differences in product mix and other characteristics of the winery. The tool enables the user to evaluate the impact of implementing energy and water efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs and savings. BEST Winery is available as a software tool in an Excel environment. This report serves as background material, documenting assumptions and information on the included energy and water efficiency measures. It also serves as a user guide for the software package.

  10. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
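
    For reference, the discrete-adjoint sensitivity relations that underpin this approach can be written in their standard textbook form as follows (the notation is generic, not taken from the report): J(D, Q) is the output of interest, R(D, Q) = 0 the discretized flow residual, D the design variables and Q the flow state.

```latex
% One adjoint solve for \lambda yields the sensitivity of J with respect to all design variables D.
\begin{align}
\left(\frac{\partial R}{\partial Q}\right)^{\!T} \lambda &= -\left(\frac{\partial J}{\partial Q}\right)^{\!T}, \\[4pt]
\frac{\mathrm{d}J}{\mathrm{d}D} &= \frac{\partial J}{\partial D} \;+\; \lambda^{T}\,\frac{\partial R}{\partial D}.
\end{align}
```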

  11. Sugar microanalysis by HPLC with benzoylation: improvement via introduction of a C-8 cartridge and a high efficiency ODS column.

    PubMed

    Miyagi, Michiko; Yokoyama, Hirokazu; Hibi, Toshifumi

    2007-07-01

    An HPLC protocol for sugar microanalysis based on the formation of ultraviolet-absorbing benzoyl chloride derivatives was improved. Here, samples were prepared with a C-8 cartridge and analyzed with a high efficiency ODS column, in which porous spherical silica particles 3 microm in diameter were packed. These devices allowed us to simultaneously quantify multiple sugars and sugar alcohols up to 10 ng/ml and to provide satisfactory separations of some sugars, such as fructose and myo-inositol and sorbitol and mannitol. This protocol, which does not require special apparatuses, should become a powerful tool in sugar research.

  12. Advanced Solar Power Systems

    NASA Technical Reports Server (NTRS)

    Atkinson, J. H.; Hobgood, J. M.

    1984-01-01

    The Advanced Solar Power System (ASPS) concentrator uses a technically sophisticated design and extensive tooling to produce very efficient (80 to 90%) and versatile energy supply equipment which is inexpensive to manufacture and requires little maintenance. The advanced optical design has two 10th order, generalized aspheric surfaces in a Cassegrainian configuration which gives outstanding performance and is relatively insensitive to temperature changes and wind loading. Manufacturing tolerances also have been achieved. The key to the ASPS is the direct absorption of concentrated sunlight in the working fluid by radiative transfers in a black body cavity. The basic ASPS design concepts, efficiency, optical system, and tracking and focusing controls are described.

  13. Statistical Capability Study of a Helical Grinding Machine Producing Screw Rotors

    NASA Astrophysics Data System (ADS)

    Holmes, C. S.; Headley, M.; Hart, P. W.

    2017-08-01

    Screw compressors depend for their efficiency and reliability on the accuracy of the rotors, and therefore on the machinery used in their production. The machinery has evolved over more than half a century in response to customer demands for production accuracy, efficiency, and flexibility, and is now at a high level on all three criteria. Production equipment and processes must be capable of maintaining accuracy over a production run, and this must be assessed statistically under strictly controlled conditions. This paper gives numerical data from such a study of an innovative machine tool and shows that it is possible to meet the demanding statistical capability requirements.

  14. An implicit higher-order spatially accurate scheme for solving time dependent flows on unstructured meshes

    NASA Astrophysics Data System (ADS)

    Tomaro, Robert F.

    1998-07-01

    The present research is aimed at developing a higher-order, spatially accurate scheme for both steady and unsteady flow simulations using unstructured meshes. The resulting scheme must work on a variety of general problems to ensure the creation of a flexible, reliable and accurate aerodynamic analysis tool. To calculate the flow around complex configurations, unstructured grids and the associated flow solvers have been developed. Efficient simulations require the minimum use of computer memory and computational times. Unstructured flow solvers typically require more computer memory than a structured flow solver due to the indirect addressing of the cells. The approach taken in the present research was to modify an existing three-dimensional unstructured flow solver to first decrease the computational time required for a solution and then to increase the spatial accuracy. The terms required to simulate flow involving non-stationary grids were also implemented. First, an implicit solution algorithm was implemented to replace the existing explicit procedure. Several test cases, including internal and external, inviscid and viscous, two-dimensional, three-dimensional and axi-symmetric problems, were simulated for comparison between the explicit and implicit solution procedures. The increased efficiency and robustness of modified code due to the implicit algorithm was demonstrated. Two unsteady test cases, a plunging airfoil and a wing undergoing bending and torsion, were simulated using the implicit algorithm modified to include the terms required for a moving and/or deforming grid. Secondly, a higher than second-order spatially accurate scheme was developed and implemented into the baseline code. Third- and fourth-order spatially accurate schemes were implemented and tested. The original dissipation was modified to include higher-order terms and modified near shock waves to limit pre- and post-shock oscillations. The unsteady cases were repeated using the higher-order spatially accurate code. The new solutions were compared with those obtained using the second-order spatially accurate scheme. Finally, the increased efficiency of using an implicit solution algorithm in a production Computational Fluid Dynamics flow solver was demonstrated for steady and unsteady flows. A third- and fourth-order spatially accurate scheme has been implemented creating a basis for a state-of-the-art aerodynamic analysis tool.

  15. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  16. Algorithms for the Fractional Calculus: A Selection of Numerical Methods

    NASA Technical Reports Server (NTRS)

    Diethelm, K.; Ford, N. J.; Freed, A. D.; Luchko, Yu.

    2003-01-01

    Many recently developed models in areas like viscoelasticity, electrochemistry, diffusion processes, etc. are formulated in terms of derivatives (and integrals) of fractional (non-integer) order. In this paper we present a collection of numerical algorithms for the solution of the various problems arising in this context. We believe that this will give the engineer the necessary tools required to work with fractional models in an efficient way.
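
    Two standard building blocks of such algorithms are the Caputo definition of a fractional derivative and its Grünwald-Letnikov finite-difference approximation; they are reproduced below in textbook form for orientation and are not quoted from the paper itself.

```latex
% Caputo fractional derivative of order \alpha (n-1 < \alpha < n) and the
% Grunwald-Letnikov approximation with step size h.
\begin{align}
{}^{C}\!D^{\alpha} f(t) &= \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha+1-n}}\,\mathrm{d}\tau, \\[4pt]
D^{\alpha} f(t) &\approx h^{-\alpha} \sum_{k=0}^{\lfloor t/h \rfloor} (-1)^{k} \binom{\alpha}{k}\, f(t - kh).
\end{align}
```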

  17. Mass Transfer from Entrapped DNAPL Sources Undergoing Remediation: Characterization Methods and Prediction Tools

    DTIC Science & Technology

    2006-08-31

    volumetric depletion efficiency ( VDE ) considers how much DNAPL is depleted from the system, relative to the total volume of solution flushed through the...aqueous phase contaminant. VDE is important to consider, as conditions that result in the fastest mass transfer, highest enhancement, or best MTE, may...volumes of flushing fluid, maximizing DNAPL depletion while minimizing flushing volume requirements may be desirable from a remediation standpoint. VDE

  18. Clustering methods applied in the detection of Ki67 hot-spots in whole tumor slide images: an efficient way to characterize heterogeneous tissue-based biomarkers.

    PubMed

    Lopez, Xavier Moles; Debeir, Olivier; Maris, Calliope; Rorive, Sandrine; Roland, Isabelle; Saerens, Marco; Salmon, Isabelle; Decaestecker, Christine

    2012-09-01

    Whole-slide scanners allow the digitization of an entire histological slide at very high resolution. This new acquisition technique opens a wide range of possibilities for addressing challenging image analysis problems, including the identification of tissue-based biomarkers. In this study, we use whole-slide scanner technology for imaging the proliferating activity patterns in tumor slides based on Ki67 immunohistochemistry. Faced with large images, pathologists require tools that can help them identify tumor regions that exhibit high proliferating activity, called "hot-spots" (HSs). Pathologists need tools that can quantitatively characterize these HS patterns. To respond to this clinical need, the present study investigates various clustering methods with the aim of identifying Ki67 HSs in whole tumor slide images. This task requires a method capable of identifying an unknown number of clusters, which may be highly variable in terms of shape, size, and density. We developed a hybrid clustering method, referred to as Seedlink. Compared to manual HS selections by three pathologists, we show that Seedlink provides an efficient way of detecting Ki67 HSs and improves the agreement among pathologists when identifying HSs. Copyright © 2012 International Society for Advancement of Cytometry.

  19. Integration of agronomic practices with herbicides for sustainable weed management in aerobic rice.

    PubMed

    Anwar, M P; Juraimi, A S; Mohamed, M T M; Uddin, M K; Samedani, B; Puteh, A; Man, Azmi

    2013-01-01

    Till now, herbicide seems to be a cost-effective tool from an agronomic viewpoint to control weeds. But long term efficacy and sustainability issues are the driving forces behind the reconsideration of herbicide-dependent weed management strategy in rice. This demands the reappearance of physical and cultural management options combined with judicious herbicide application in a more comprehensive and integrated way. Keeping this in mind, some agronomic tools along with different manual weeding and herbicide combinations were evaluated for their weed control efficacy in rice under aerobic soil conditions. Combination of competitive variety, higher seeding rate, and seed priming resulted in a more competitive cropping system in favor of rice, which was reflected in lower weed pressure, higher weed control efficiency, and better yield. Most of the herbicides exhibited excellent weed control efficiency. Treatments comprising only herbicides required less cost involvement but produced higher net benefit. On the contrary, treatments comprising both herbicide and manual weeding required high cost involvement and thus produced lower net benefit. Therefore, adoption of competitive rice variety, higher seed rate, and seed priming along with spraying different early-postemergence herbicides in rotation at 10 days after seeding (DAS), followed by a manual weeding at 30 DAS, may be recommended from a sustainability viewpoint.

  20. Integration of Agronomic Practices with Herbicides for Sustainable Weed Management in Aerobic Rice

    PubMed Central

    Anwar, M. P.; Juraimi, A. S.; Mohamed, M. T. M.; Uddin, M. K.; Samedani, B.; Puteh, A.; Man, Azmi

    2013-01-01

    Till now, herbicide seems to be a cost-effective tool from an agronomic viewpoint to control weeds. But long term efficacy and sustainability issues are the driving forces behind the reconsideration of herbicide-dependent weed management strategy in rice. This demands the reappearance of physical and cultural management options combined with judicious herbicide application in a more comprehensive and integrated way. Keeping this in mind, some agronomic tools along with different manual weeding and herbicide combinations were evaluated for their weed control efficacy in rice under aerobic soil conditions. Combination of competitive variety, higher seeding rate, and seed priming resulted in a more competitive cropping system in favor of rice, which was reflected in lower weed pressure, higher weed control efficiency, and better yield. Most of the herbicides exhibited excellent weed control efficiency. Treatments comprising only herbicides required less cost involvement but produced higher net benefit. On the contrary, treatments comprising both herbicide and manual weeding required high cost involvement and thus produced lower net benefit. Therefore, adoption of competitive rice variety, higher seed rate, and seed priming along with spraying different early-postemergence herbicides in rotation at 10 days after seeding (DAS), followed by a manual weeding at 30 DAS, may be recommended from a sustainability viewpoint. PMID:24223513

  1. Simple Biological Systems for Assessing the Activity of Superoxide Dismutase Mimics

    PubMed Central

    Tovmasyan, Artak; Reboucas, Julio S.

    2014-01-01

    Significance: Half a century of research provided unambiguous proof that superoxide and species derived from it—reactive oxygen species (ROS)—play a central role in many diseases and degenerative processes. This stimulated the search for pharmaceutical agents that are capable of preventing oxidative damage, and methods of assessing their therapeutic potential. Recent Advances: The limitations of superoxide dismutase (SOD) as a therapeutic tool directed attention to small molecules, SOD mimics, that are capable of catalytically scavenging superoxide. Several groups of compounds, based on either metal complexes, including metalloporphyrins, metallocorroles, Mn(II) cyclic polyamines, and Mn(III) salen derivatives, or non-metal based compounds, such as fullerenes, nitrones, and nitroxides, have been developed and studied in vitro and in vivo. Very few entered clinical trials. Critical Issues and Future Directions: Development of SOD mimics requires in-depth understanding of their mechanisms of biological action. Elucidation of both molecular features, essential for efficient ROS-scavenging in vivo, and factors limiting the potential side effects requires biologically relevant and, at the same time, relatively simple testing systems. This review discusses the advantages and limitations of genetically engineered SOD-deficient unicellular organisms, Escherichia coli and Saccharomyces cerevisiae, as tools for investigating the efficacy and mechanisms of biological actions of SOD mimics. These simple systems allow the scrutiny of the minimal requirements for a functional SOD mimic: the association of a high catalytic activity for superoxide dismutation, low toxicity, and an efficient cellular uptake/biodistribution. Antioxid. Redox Signal. 20, 2416–2436. PMID:23964890

  2. Comparison Of Human Modelling Tools For Efficiency Of Prediction Of EVA Tasks

    NASA Technical Reports Server (NTRS)

    Dischinger, H. Charles, Jr.; Loughead, Tomas E.

    1998-01-01

    Construction of the International Space Station (ISS) will require extensive extravehicular activity (EVA, spacewalks), and estimates of the actual time needed continue to rise. As recently as September 1996, the amount of time to be spent in EVA was believed to be about 400 hours, excluding spacewalks on the Russian segment. This estimate has recently risen to over 1100 hours, and it could go higher before assembly begins in the summer of 1998. These activities are extremely expensive and hazardous, so any design tools which help assure mission success and improve the efficiency of the astronaut in task completion can pay off in reduced design and EVA costs and increased astronaut safety. The tasks which astronauts can accomplish in EVA are limited by spacesuit mobility. They are therefore relatively simple, from an ergonomic standpoint, requiring gross movements rather than fine motor skills. The actual tasks include driving bolts, mating and demating electric and fluid connectors, and actuating levers; the important characteristics to be considered in design improvement include the ability of the astronaut to see and reach the item to be manipulated and the clearance required to accomplish the manipulation. This makes the tasks amenable to simulation in a Computer-Aided Design (CAD) environment. For EVA, the spacesuited astronaut must have his or her feet attached to a work platform called a foot restraint to obtain a purchase against which work forces may be actuated. An important component of the design is therefore the proper placement of foot restraints.

  3. a Restoration Oriented Hbim System for Cultural Heritage Documentation: the Case Study of Parma Cathedral

    NASA Astrophysics Data System (ADS)

    Bruno, N.; Roncella, R.

    2018-05-01

    The need to safeguard and preserve Cultural Heritage (CH) is increasing, especially in Italy, where the amount of historical buildings is considerable and having efficient and standardized processes of CH management and conservation becomes strategic. At the time being, there are no tools capable of fulfilling all the specific functions required by Cultural Heritage documentation and, due to the complexity of historical assets, there is no solution as flexible and customizable as CH-specific needs require. Nevertheless, BIM methodology can represent the most effective solution, on condition that proper methodologies, tools and functions are made available. The paper describes an ongoing research project on the implementation of a Historical BIM (HBIM) system for Parma cathedral, aimed at its maintenance, conservation and restoration. Its main goal was to give a concrete answer to the lack of specific tools required by Cultural Heritage documentation: organized and coordinated storage and management of historical data, easy analysis and query, time management, 3D modelling of irregular shapes, flexibility, user-friendliness, etc. The paper will describe the project and the implemented methodology, focusing mainly on the survey and modelling phases. In describing the methodology, critical issues about the creation of an HBIM will be highlighted, trying to outline a workflow applicable also in other similar contexts.

  4. Maximizing Efficiency and Reducing Robotic Surgery Costs Using the NASA Task Load Index.

    PubMed

    Walters, Carrie; Webb, Paula J

    2017-10-01

    Perioperative leaders at our facility were struggling to meet efficiency targets for robotic surgery procedures while also maintaining the satisfaction of the surgical team. We developed a human resources time and motion study tool and used it in conjunction with the NASA Task Load Index to observe and analyze the required workload of personnel assigned to 25 robotic surgery procedures. The time and motion study identified opportunities to enlist the help of nonlicensed support personnel to ensure safe patient care and improve OR efficiency. Using the NASA Task Load Index demonstrated that high temporal, effort, and physical demands existed for personnel assisting with and performing robotic surgery. We believe that this process could be used to develop cost-effective staffing models, resulting in safe and efficient care for all surgical patients. Copyright © 2017 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  5. Study on detection geometry and detector shielding for portable PGNAA system using PHITS

    NASA Astrophysics Data System (ADS)

    Ithnin, H.; Dahing, L. N. S.; Lip, N. M.; Rashid, I. Q. Abd; Mohamad, E. J.

    2018-01-01

    Prompt gamma-ray neutron activation analysis (PGNAA) measurements require efficient detectors for gamma-ray detection. Apart from experimental studies, the Monte Carlo (MC) method has become one of the most popular tools in detector studies. The absolute efficiency of a 2 × 2 inch cylindrical sodium iodide (NaI) detector has been modelled using the PHITS software and compared with previous studies in the literature. In the present work, the PHITS code is used for optimization of a portable PGNAA system using the validated NaI detector. The detection geometry is optimized by moving the detector along the sample to find the highest intensity of the prompt gamma rays generated from the sample. Shielding material for the validated NaI detector is also studied to find the best option for the PGNAA system setup. The results show that the optimum detector position is on the surface of the sample, around 15 cm from the source. They also indicate that this process can be followed to determine the best setup of a PGNAA system for different sample sizes and detector types. It can be concluded that the PHITS code is a strong tool not only for efficiency studies but also for the optimization of PGNAA systems.

  6. Optimized RNP transfection for highly efficient CRISPR/Cas9-mediated gene knockout in primary T cells.

    PubMed

    Seki, Akiko; Rutz, Sascha

    2018-03-05

    CRISPR (clustered, regularly interspaced, short palindromic repeats)/Cas9 (CRISPR-associated protein 9) has become the tool of choice for generating gene knockouts across a variety of species. The ability for efficient gene editing in primary T cells not only represents a valuable research tool to study gene function but also holds great promise for T cell-based immunotherapies, such as next-generation chimeric antigen receptor (CAR) T cells. Previous attempts to apply CRISPR/Cas9 for gene editing in primary T cells have resulted in highly variable knockout efficiency and required T cell receptor (TCR) stimulation, thus largely precluding the study of genes involved in T cell activation or differentiation. Here, we describe an optimized approach for Cas9/RNP transfection of primary mouse and human T cells without TCR stimulation that results in near complete loss of target gene expression at the population level, mitigating the need for selection. We believe that this method will greatly extend the feasibility of target gene discovery and validation in primary T cells and simplify the gene editing process for next-generation immunotherapies. © 2018 Genentech.

  7. An efficient 3-D eddy-current solver using an independent impedance method for transcranial magnetic stimulation.

    PubMed

    De Geeter, Nele; Crevecoeur, Guillaume; Dupre, Luc

    2011-02-01

    In many important bioelectromagnetic problem settings, eddy-current simulations are required. Examples are the reduction of eddy-current artifacts in magnetic resonance imaging, and techniques whereby the eddy currents interact with the biological system, such as the alteration of neurophysiology due to transcranial magnetic stimulation (TMS). TMS has become an important tool for the diagnosis and treatment of neurological diseases and psychiatric disorders. A widely applied method for simulating the eddy currents is the impedance method (IM). However, this method has to contend with an ill-conditioned problem and consequently a long convergence time. When dealing with optimal design problems and sensitivity control, the convergence rate becomes even more crucial since the eddy-current solver needs to be evaluated in an iterative loop. Therefore, we introduce an independent IM (IIM), which improves the conditioning and speeds up the numerical convergence. This paper shows how IIM is based on IM and what its advantages are. Moreover, the method is applied to the efficient simulation of TMS. The proposed IIM achieves superior convergence properties with high time efficiency compared to the traditional IM, and is therefore a useful tool for accurate and fast TMS simulations.

  8. Quality assurance planning for lunar Mars exploration

    NASA Technical Reports Server (NTRS)

    Myers, Kay

    1991-01-01

    A review is presented of the tools and techniques required to meet the challenge of total quality in the goal of traveling to Mars and returning to the Moon. One program used by NASA to ensure the integrity of baselined requirements documents is configuration management (CM). CM is defined as an integrated management process that documents and identifies the functional and physical characteristics of a facility's systems, structures, computer software, and components. It also ensures that changes to these characteristics are properly assessed, developed, approved, implemented, verified, recorded, and incorporated into the facility's documentation. Three principal areas that will realize significant efficiencies and enhanced effectiveness are discussed: change assessment, change avoidance, and requirements management.

  9. DaMold: A data-mining platform for variant annotation and visualization in molecular diagnostics research.

    PubMed

    Pandey, Ram Vinay; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2017-07-01

    Next-generation sequencing (NGS) has become a powerful and efficient tool for routine mutation screening in clinical research. As each NGS test yields hundreds of variants, the current challenge is to meaningfully interpret the data and select potential candidates. Analyzing each variant while manually investigating several relevant databases to collect specific information is a cumbersome and time-consuming process, and it requires expertise and familiarity with these databases. Thus, a tool that can seamlessly annotate variants with clinically relevant databases under one common interface would be of great help for variant annotation, cross-referencing, and visualization. This tool would allow variants to be processed in an automated and high-throughput manner and facilitate the investigation of variants in several genome browsers. Several analysis tools are available for raw sequencing-read processing and variant identification, but an automated variant filtering, annotation, cross-referencing, and visualization tool is still lacking. To fulfill these requirements, we developed DaMold, a Web-based, user-friendly tool that can filter and annotate variants and can access and compile information from 37 resources. It is easy to use, provides flexible input options, and accepts variants from NGS and Sanger sequencing as well as hotspots in VCF and BED formats. DaMold is available as an online application at http://damold.platomics.com/index.html, and as a Docker container and virtual machine at https://sourceforge.net/projects/damold/. © 2017 Wiley Periodicals, Inc.

  10. Adaptive bill morphology for enhanced tool manipulation in New Caledonian crows

    PubMed Central

    Matsui, Hiroshi; Hunt, Gavin R.; Oberhofer, Katja; Ogihara, Naomichi; McGowan, Kevin J.; Mithraratne, Kumar; Yamasaki, Takeshi; Gray, Russell D.; Izawa, Ei-Ichi

    2016-01-01

    Early increased sophistication of human tools is thought to be underpinned by adaptive morphology for efficient tool manipulation. Such adaptive specialisation is unknown in nonhuman primates but may have evolved in the New Caledonian crow, which has sophisticated tool manufacture. The straightness of its bill, for example, may be adaptive for enhanced visually-directed use of tools. Here, we examine in detail the shape and internal structure of the New Caledonian crow’s bill using Principal Components Analysis and Computed Tomography within a comparative framework. We found that the bill has a combination of interrelated shape and structural features unique within Corvus, and possibly birds generally. The upper mandible is relatively deep and short with a straight cutting edge, and the lower mandible is strengthened and upturned. These novel combined attributes would be functional for (i) counteracting the unique loading patterns acting on the bill when manipulating tools, (ii) a strong precision grip to hold tools securely, and (iii) enhanced visually-guided tool use. Our findings indicate that the New Caledonian crow’s innovative bill has been adapted for tool manipulation to at least some degree. Early increased sophistication of tools may require the co-evolution of morphology that provides improved manipulatory skills. PMID:26955788

  11. A modified operational sequence methodology for zoo exhibit design and renovation: conceptualizing animals, staff, and visitors as interdependent coworkers.

    PubMed

    Kelling, Nicholas J; Gaalema, Diann E; Kelling, Angela S

    2014-01-01

    Human factors analyses have been used to improve efficiency and safety in various work environments. Although generally limited to humans, the universality of these analyses allows for their formal application to a much broader domain. This paper outlines a model for the use of human factors to enhance zoo exhibits and optimize spaces for all user groups: zoo animals, zoo visitors, and zoo staff members. Zoo exhibits are multi-faceted, and each user group has a distinct set of requirements that can clash or complement each other. Careful analysis and a reframing of the three groups as interdependent coworkers can enhance safety, efficiency, and experience for all user groups. This paper details the general creation, and gives specific examples of the use, of the modified human factors tools of function allocation, operational sequence diagrams, and needs assessment. These tools allow for adaptability and ease of understanding in the design or renovation of exhibits. © 2014 Wiley Periodicals, Inc.

  12. Assessment of the Charging Policy in Energy Efficiency of the Enterprise

    NASA Astrophysics Data System (ADS)

    Shutov, E. A.; Turukina, T. E.; Anisimov, T. S.

    2017-04-01

    Forecasting is currently one of the main problems for energy facilities with a power exceeding 670 kW. Under the rules of the retail electricity market, such customers also pay for deviations of actual energy consumption from the planned value. In line with the hierarchical structure of the electricity market, a guaranteeing supplier must respect the interests of distribution and generation companies, which require load leveling. For an industrial enterprise this can be addressed only within the technological process, through the implementation of energy-efficient processing chains with an adaptive function and a forecasting tool. In this setting, the primary objective of forecasting is to reduce energy consumption costs by taking into account the correlation of energy cost over 24 hours when forming the pumping unit work schedule. A virtual model of a pumping unit with a variable frequency drive is considered. The forecasting tool and the optimizer are integrated into a typical control circuit. An economic assessment of the optimization method was carried out.
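
    A minimal sketch of the cost-aware scheduling idea (not the authors' adaptive forecasting model): given 24 hourly energy prices and a fixed-rate pump, run the pump in the cheapest hours that still deliver the required daily volume. All parameter names and values are illustrative.

        # Schedule a fixed-rate pump against 24 hourly tariffs so that a required daily
        # volume is delivered at minimum energy cost. Greedy choice of the cheapest hours.
        def schedule_pump(hourly_price, required_volume_m3, pump_rate_m3_h, pump_power_kw):
            hours_needed = int(-(-required_volume_m3 // pump_rate_m3_h))  # ceiling division
            cheapest = sorted(range(24), key=lambda h: hourly_price[h])[:hours_needed]
            schedule = [h in cheapest for h in range(24)]     # True = pump runs in that hour
            cost = sum(hourly_price[h] * pump_power_kw for h in cheapest)
            return schedule, cost

        # Example with a hypothetical two-rate tariff (cheaper at night):
        # prices = [1.5] * 7 + [3.2] * 13 + [1.5] * 4
        # schedule, cost = schedule_pump(prices, required_volume_m3=600,
        #                                pump_rate_m3_h=50, pump_power_kw=75)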

  13. PLUM: Parallel Load Balancing for Unstructured Adaptive Meshes. Degree awarded by Colorado Univ.

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid

    1998-01-01

    Dynamic mesh adaption on unstructured grids is a powerful tool for computing large-scale problems that require grid modifications to efficiently resolve solution features. By locally refining and coarsening the mesh to capture physical phenomena of interest, such procedures make standard computational methods more cost effective. Unfortunately, an efficient parallel implementation of these adaptive methods is rather difficult to achieve, primarily due to the load imbalance created by the dynamically-changing nonuniform grid. This requires significant communication at runtime, leading to idle processors and adversely affecting the total execution time. Nonetheless, it is generally thought that unstructured adaptive-grid techniques will constitute a significant fraction of future high-performance supercomputing. Various dynamic load balancing methods have been reported to date; however, most of them either lack a global view of loads across processors or do not apply their techniques to realistic large-scale applications.
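
    For context, the quantity that such dynamic load balancers try to drive toward 1 is the load-imbalance factor of the current partition; a minimal sketch (with an illustrative per-processor work metric) is:

        # Load-imbalance factor of a partition: maximum processor load over mean load.
        def load_imbalance(loads):
            """loads: work units (e.g., element counts) currently assigned to each processor."""
            mean = sum(loads) / len(loads)
            return max(loads) / mean if mean > 0 else 1.0

        # load_imbalance([1200, 950, 1010, 840]) -> 1.2, i.e. ~20% of cycles idle at the fastest ranks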

  14. SandiaMRCR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-01-05

    SandiaMCR was developed to identify pure components and their concentrations from spectral data. This software efficiently implements the multivariate calibration regression alternating least squares (MCR-ALS), principal component analysis (PCA), and singular value decomposition (SVD). Version 3.37 also includes the PARAFAC-ALS Tucker-1 (for trilinear analysis) algorithms. The alternating least squares methods can be used to determine the composition without or with incomplete prior information on the constituents and their concentrations. It allows the specification of numerous preprocessing, initialization, data selection, and compression options for the efficient processing of large data sets. The software includes numerous options, including the definition of equality and non-negativity constraints to realistically restrict the solution set, various normalization or weighting options based on the statistics of the data, several initialization choices, and data compression. The software has been designed to provide a practicing spectroscopist the tools required to routinely analyze data in a reasonable time and without requiring expert intervention.
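
    A minimal sketch of the alternating-least-squares idea behind MCR-ALS (naive non-negativity by clipping; this is not the constraint handling or preprocessing of the actual software):

        # Naive MCR-ALS: factor a data matrix D (samples x channels) into non-negative
        # concentration profiles C and pure-component spectra S, so that D ~ C @ S.T.
        import numpy as np

        def mcr_als(D, n_components, n_iter=200, seed=0):
            rng = np.random.default_rng(seed)
            S = rng.random((D.shape[1], n_components))                     # initial spectra guess
            for _ in range(n_iter):
                C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0.0, None)    # update concentrations
                S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0.0, None)  # update spectra
            return C, S

        # C, S = mcr_als(D, n_components=3); D is approximately reconstructed by C @ S.T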

  15. Old and new techniques mixed up into optical photomask measurement method

    NASA Astrophysics Data System (ADS)

    Fukui, Jumpei; Tachibana, Yusaku; Osanai, Makoto

    2017-07-01

    A cost-efficient, easy-to-operate solution for fully automated CD measurement of line widths from about 500 nm up to 5 μm on photomasks is still in high demand, because such photomasks are frequently used in manufacturing MEMS sensors for IoT and devices made in BCD (Bipolar CMOS DMOS) processes. In response to this demand from the photomask manufacturing field, we incorporate recently developed low-noise digital camera technology and an i-line LED light source into a new measuring tool in order to achieve 1 nm (3σ) repeatability for line width measurements between 300 nm and 10 μm. In addition, for fully automated operation it is very important to locate the initial target line within a dense pattern. To achieve such automatic line detection precisely, we improved the accuracy of the high-precision stage (20 nm at 3σ) and the alignment algorithm of the MEMS stepper combined with this tool. As for the user interface, Windows-based software supports not only operation but also recipe creation and editing in Excel. In the MEMS manufacturing process there are various photomasks that need to be checked and measured frequently, so recipe files must also be created and edited frequently. To meet these requirements in photomask management, we combined old and new techniques into one system: a fully automated, cost-efficient tool with 1 nm repeatability in CD measurement.

  16. Measurement of W + bb and a search for MSSM Higgs bosons with the CMS detector at the LHC

    NASA Astrophysics Data System (ADS)

    O'Connor, Alexander Pinpin

    Tooling used to cure composite laminates in the aerospace and automotive industries must provide a dimensionally stable geometry throughout the thermal cycle applied during the part curing process. This requires that the Coefficient of Thermal Expansion (CTE) of the tooling materials match that of the composite being cured. The traditional tooling material for production applications is a nickel alloy. Poor machinability and high material costs increase the expense of metallic tooling made from nickel alloys such as 'Invar 36' or 'Invar 42'. Currently, metallic tooling is unable to meet the needs of applications requiring rapid affordable tooling solutions. In applications where the tooling is not required to have the durability provided by metals, such as for small area repair, an opportunity exists for non-metallic tooling materials like graphite, carbon foams, composites, or ceramics and machinable glasses. Nevertheless, efficient machining of brittle, non-metallic materials is challenging due to low ductility, porosity, and high hardness. The machining of a layup tool comprises a large portion of the final cost. Achieving maximum process economy requires optimization of the machining process in the given tooling material. Therefore, machinability of the tooling material is a critical aspect of the overall cost of the tool. In this work, three commercially available, brittle/porous, non-metallic candidate tooling materials were selected, namely: (AAC) Autoclaved Aerated Concrete, CB1100 ceramic block and Cfoam carbon foam. Machining tests were conducted in order to evaluate the machinability of these materials using end milling. Chip formation, cutting forces, cutting tool wear, machining induced damage, surface quality and surface integrity were investigated using High Speed Steel (HSS), carbide, diamond abrasive and Polycrystalline Diamond (PCD) cutting tools. Cutting forces were found to be random in magnitude, which was a result of material porosity. The abrasive nature of Cfoam produced rapid tool wear when using HSS and PCD type cutting tools. However, tool wear was not significant in AAC or CB1100 regardless of the type of cutting edge. Machining induced damage was observed in the form of macro-scale chipping and fracture in combination with micro-scale cracking. Transverse rupture test results revealed significant reductions in residual strength and damage tolerance in CB1100. In contrast, AAC and Cfoam showed no correlation between machining induced damage and a reduction in surface integrity. Cutting forces in machining were modeled for all materials. Cutting force regression models were developed based on Design of Experiment and Analysis of Variance. A mechanistic cutting force model was proposed based upon conventional end milling force models and statistical distributions of material porosity. In order to validate the model, predicted cutting forces were compared to experimental results. Predicted cutting forces agreed well with experimental measurements. Furthermore, over the range of cutting conditions tested, the proposed model was shown to have comparable predictive accuracy to empirically produced regression models; greatly reducing the number of cutting tests required to simulate cutting forces. Further, this work demonstrates a key adaptation of metallic cutting force models to brittle porous material; a vital step in the research into the machining of these materials using end milling.

  17. Ultra-low loss fully-etched grating couplers for perfectly vertical coupling compatible with DUV lithography tools

    NASA Astrophysics Data System (ADS)

    Dabos, G.; Pleros, N.; Tsiokos, D.

    2016-03-01

    Hybrid integration of VCSELs onto silicon-on-insulator (SOI) substrates has emerged as an attractive approach for bridging the gap between cost-effective and energy-efficient directly modulated laser sources and silicon-based PICs by leveraging flip-chip (FC) bonding techniques and silicon grating couplers (GCs). In this context, silicon GCs should comply with the process requirements imposed by complementary metal-oxide-semiconductor manufacturing tools while addressing in parallel the challenges originating from the perfectly vertical incidence. Firstly, fully etched GCs compatible with deep-ultraviolet lithography tools offering high coupling efficiencies are imperatively needed to maintain low fabrication cost. Secondly, the GC's tolerance to VCSEL bonding misalignment errors is a prerequisite for practical deployment. Finally, a major challenge originating from the perfectly vertical coupling scheme is the minimization of the direct back-reflection to the VCSEL's outgoing facet, which may destabilize its operation. Motivated by these challenges, we used numerical simulation tools to design an ultra-low loss, bidirectional VCSEL-to-SOI optical coupling scheme for either TE or TM polarization, based on low-cost fully etched GCs with a 340 nm Si layer, without employing bottom reflectors or optimizing the buried-oxide layer. Comprehensive 2D Finite-Difference-Time-Domain simulations have been performed. The reported GC layout remains fully compatible with the back-end-of-line (BEOL) stack associated with 3D integration technology, exploiting all the inter-metal-dielectric (IMD) layers of the CMOS fab. Simulation results predicted, for the first time in fully etched structures, a coupling efficiency of -0.87 dB at 1548 nm and -1.47 dB at 1560 nm, with a minimum direct back-reflection of -27.4 dB and -14.2 dB for TE and TM polarization, respectively.

  18. A finite element analysis modeling tool for solid oxide fuel cell development: coupled electrochemistry, thermal and flow analysis in MARC®

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khaleel, Mohammad A.; Lin, Zijing; Singh, Prabhakar

    2004-05-03

    A 3D simulation tool for modeling solid oxide fuel cells is described. The tool combines the versatility and efficiency of a commercial finite element analysis code, MARC®, with an in-house developed robust and flexible electrochemical (EC) module. Based upon characteristic parameters obtained experimentally and assigned by the user, the EC module calculates the current density distribution, heat generation, and fuel and oxidant species concentration, taking the temperature profile provided by MARC® and operating conditions such as the fuel and oxidant flow rate and the total stack output voltage or current as the input. MARC® performs flow and thermal analyses based on the initial and boundary thermal and flow conditions and the heat generation calculated by the EC module. The main coupling between MARC® and EC is for MARC® to supply the temperature field to EC and for EC to give the heat generation profile to MARC®. The loosely coupled, iterative scheme is advantageous in terms of memory requirement, numerical stability and computational efficiency. The coupling is iterated to self-consistency for a steady-state solution. Sample results for steady states as well as the startup process for stacks with different flow designs are presented to illustrate the modeling capability and numerical performance characteristic of the simulation tool.
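
    A minimal sketch of the loosely coupled iteration described above: a thermal/flow solver and an electrochemistry (EC) module exchange fields until the temperature profile stops changing. Both solver callables are stand-ins for MARC and the EC module; the tolerance and field shapes are illustrative assumptions.

        # Loosely coupled iteration: ec_solve maps temperature -> (heat generation, current
        # density); thermal_solve maps heat generation -> temperature.
        import numpy as np

        def couple_to_self_consistency(thermal_solve, ec_solve, T0, tol=1e-3, max_iter=50):
            """thermal_solve(q) -> T ; ec_solve(T) -> (q, j). Iterate until T stops changing."""
            T = np.asarray(T0, dtype=float)
            for iteration in range(1, max_iter + 1):
                q, j = ec_solve(T)           # heat generation and current density from EC
                T_new = thermal_solve(q)     # temperature field from the finite element solve
                if np.max(np.abs(T_new - T)) < tol:
                    return T_new, q, j, iteration
                T = T_new
            raise RuntimeError("coupling did not converge within max_iter iterations")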

  19. Technical Requirements Analysis and Control Systems (TRACS) Initial Operating Capability (IOC) documentation

    NASA Technical Reports Server (NTRS)

    Hammond, Dana P.

    1991-01-01

    The Technical Requirements Analysis and Control Systems (TRACS) software package is described. TRACS offers supplemental tools for the analysis, control, and interchange of project requirements. This package provides the fundamental capability to analyze and control requirements, serves as a focal point for project requirements, and integrates a system that supports efficient and consistent operations. TRACS uses relational database technology (ORACLE) in a stand-alone or distributed environment that can be used to coordinate the activities required to support a project through its entire life cycle. TRACS uses a set of keyword- and mouse-driven screens (HyperCard) which impose adherence through a controlled user interface. The user interface provides an interactive capability to interrogate the database and to display or print project requirement information. TRACS has a limited report capability, but can be extended with PostScript conventions.

  20. CFD - Mature Technology?

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2005-01-01

    Over the past 30 years, numerical methods and simulation tools for fluid dynamic problems have advanced as a new discipline, namely, computational fluid dynamics (CFD). Although a wide spectrum of flow regimes is encountered in many areas of science and engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to a large demand for predicting the aerodynamic performance characteristics of flight vehicles, such as commercial, military, and space vehicles. As flow analysis is required to be more accurate and computationally efficient for both commercial and mission-oriented applications (such as those encountered in meteorology, aerospace vehicle development, general fluid engineering and biofluid analysis), CFD tools for engineering become increasingly important for predicting safety, performance and cost. This paper presents the author's perspective on the maturity of CFD, especially from an aerospace engineering point of view.

  1. Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool

    PubMed Central

    Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.

    2011-01-01

    Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150

  2. From field notes to data portal - An operational QA/QC framework for tower networks

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.

    2016-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.
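
    Two of the automated plausibility checks that such a framework typically standardizes can be sketched as follows (thresholds are illustrative examples, not network-specified values):

        # Two simple automated quality tests: a physical-range test and a spike test.
        import numpy as np

        def range_flag(x, lo, hi):
            """Flag samples outside physically plausible bounds."""
            x = np.asarray(x, dtype=float)
            return (x < lo) | (x > hi)

        def spike_flag(x, max_step):
            """Flag samples whose jump from the previous sample exceeds max_step."""
            x = np.asarray(x, dtype=float)
            flags = np.zeros(x.shape, dtype=bool)
            flags[1:] = np.abs(np.diff(x)) > max_step
            return flags

        # Example for a 1-D air-temperature stream (illustrative thresholds):
        # bad = range_flag(air_temp, -60.0, 60.0) | spike_flag(air_temp, 5.0)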

  3. Points of attention in designing tools for regional brownfield prioritization.

    PubMed

    Limasset, Elsa; Pizzol, Lisa; Merly, Corinne; Gatchett, Annette M; Le Guern, Cécile; Martinát, Stanislav; Klusáček, Petr; Bartke, Stephan

    2018-05-01

    The regeneration of brownfields has been increasingly recognized as a key instrument in sustainable land management, since free developable land (or so called "greenfields") has become a scare and more expensive resource, especially in densely populated areas. However, the complexity of these sites requires considerable efforts to successfully complete their revitalization projects, thus requiring the development and application of appropriate tools to support decision makers in the selection of promising sites where efficiently allocate the limited financial resources. The design of effective prioritization tools is a complex process, which requires the analysis and consideration of critical points of attention (PoAs) which has been identified considering the state of the art in literature, and lessons learned from previous developments of regional brownfield (BF) prioritization processes, frameworks and tools. Accordingly, we identified 5 PoAs, namely 1) Assessing end user needs and orientation discussions, 2) Availability and quality of the data needed for the BF prioritization tool, 3) Communication and stakeholder engagement 4) Drivers of regeneration success, and 5) Financing and application costs. To deepen and collate the most recent knowledge on the topics from scientists and practitioners, we organized a focus group discussion within a special session at the AquaConSoil (ACS) conference 2017, where participants were asked to add their experience and thoughts to the discussion in order to identify the most significant and urgent points of attention in BF prioritization tool design. The result of this assessment is a comprehensive table (Table 2), which can support problem owners, investors, service providers, regulators, public and private land managers, decision makers etc. in the identification of the main aspects (sub-topics) to be considered and their relative influences and in the comprehension of the general patterns and challenges to be faced when dealing with the development of BF prioritization tools. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Validating Signs and Symptoms From An Actual Mass Casualty Incident to Characterize An Irritant Gas Syndrome Agent (IGSA) Exposure: A First Step in The Development of a Novel IGSA Triage Algorithm.

    PubMed

    Culley, Joan M; Richter, Jane; Donevant, Sara; Tavakoli, Abbas; Craig, Jean; DiNardi, Salvatore

    2017-07-01

    • Chemical exposures daily pose a significant threat to life. Rapid assessment by first responders/emergency nurses is required to reduce death and disability. Currently, no informatics tools for Irritant Gas Syndrome Agents (IGSA) exposures exist to process victims efficiently, continuously monitor for latent signs/symptoms, or make triage recommendations. • This study uses actual patient data from a chemical incident to characterize and validate signs/symptoms of an IGSA Syndrome. Validating signs/symptoms is the first step in developing new emergency department informatics tools with the potential to revolutionize the process by which emergency nurses triage victims of chemical incidents. Chemical exposures can pose a significant threat to life. Rapid assessment by first responders/emergency nurses is required to reduce death and disability. Currently, no informatics tools for irritant gas syndrome agents (IGSA) exposures exist to process victims efficiently, continuously monitor for latent signs/symptoms, or make triage recommendations. This study describes the first step in developing ED informatics tools for chemical incidents: validation of signs/symptoms that characterize an IGSA syndrome. Data abstracted from 146 patients treated for chlorine exposure in one emergency department during a 2005 train derailment and 152 patients not exposed to chlorine (a comparison group) were mapped to 93 possible signs/symptoms within 2 tools (WISER and CHEMM-IST) designed to assist emergency responders/emergency nurses with managing hazardous material exposures. Inferential statistics (χ²/Fisher's exact test) and diagnostic tests were used to examine mapped signs/symptoms of persons who were and were not exposed to chlorine. Three clusters of signs/symptoms are statistically associated with an IGSA syndrome (P < .01): respiratory (shortness of breath, wheezing, coughing, and choking); chest discomfort (tightness, pain, and burning); and eye, nose and/or throat (pain, irritation, and burning). The syndrome requires the presence of signs/symptoms from at least 2 of these clusters. The latency period must also be considered for exposed/potentially exposed persons. This study uses actual patient data from a chemical incident to characterize and validate signs/symptoms of an IGSA syndrome. Validating signs/symptoms is the first step in developing new ED informatics tools with the potential to revolutionize the process by which emergency nurses triage victims of chemical incidents. Copyright © 2017 Emergency Nurses Association. Published by Elsevier Inc. All rights reserved.
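
    The decision rule described above (signs/symptoms from at least 2 of the 3 validated clusters) can be expressed directly; the sketch below uses illustrative symptom labels, not the study's coding scheme:

        # The ">= 2 of 3 clusters" rule for an IGSA syndrome. Symptom labels are illustrative.
        CLUSTERS = {
            "respiratory": {"shortness_of_breath", "wheezing", "coughing", "choking"},
            "chest_discomfort": {"chest_tightness", "chest_pain", "chest_burning"},
            "eye_nose_throat": {"ent_pain", "ent_irritation", "ent_burning"},
        }

        def meets_igsa_syndrome(observed_symptoms):
            """True if the observed symptoms touch at least two of the three clusters."""
            observed = set(observed_symptoms)
            clusters_hit = sum(1 for members in CLUSTERS.values() if observed & members)
            return clusters_hit >= 2

        # meets_igsa_syndrome({"wheezing", "ent_irritation"})  -> True
        # meets_igsa_syndrome({"coughing"})                    -> False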

  5. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.

  6. Evaluation of a Dispatcher's Route Optimization Decision Aid to Avoid Aviation Weather Hazards

    NASA Technical Reports Server (NTRS)

    Dorneich, Michael C.; Olofinboba, Olu; Pratt, Steve; Osborne, Dannielle; Feyereisen, Thea; Latorella, Kara

    2003-01-01

    This document describes the results and analysis of the formal evaluation plan for the Honeywell software tool developed under the NASA AWIN (Aviation Weather Information) 'Weather Avoidance using Route Optimization as a Decision Aid' project. The software tool aims to provide airline dispatchers with a decision aid for selecting optimal routes that avoid weather and other hazards. This evaluation compares and contrasts route selection performance with the AWIN tool to that of subjects using a more traditional dispatcher environment. The evaluation assesses gains in safety, in fuel efficiency of planned routes, and in time efficiency in the pre-flight dispatch process through the use of the AWIN decision aid. In addition, we are interested in how this AWIN tool affects constructs that can be related to performance. The constructs of Situation Awareness (SA), workload, trust in an information system, and operator acceptance are assessed using established scales, where these exist, as well as through the evaluation of questionnaire responses and subject comments. The intention of the experiment is to set up a simulated operations area for the dispatchers to work in. They will be given scenarios in which they are presented with stored company routes for a particular city-pair and aircraft type. A diverse set of external weather information sources is represented by a stand-alone display (MOCK), containing the actual historical weather data typically used by dispatchers. There is also the possibility of presenting selected weather data on the route visualization tool. The company routes have not been modified to avoid the weather except in the case of one additional route generated by the Honeywell prototype flight planning system. The dispatcher will be required to choose the most appropriate and efficient flight plan route in the displayed weather conditions. The route may be modified manually or may be chosen from those automatically displayed.

  7. EconoMe-Develop - a calculation tool for multi-risk assessment and benefit-cost-analysis

    NASA Astrophysics Data System (ADS)

    Bründl, M.

    2012-04-01

    Public money is used to finance the protection of human life, material assets and the environment against natural hazards. This limited resource should be used in a way that achieves the maximum possible effect by minimizing as many risks as possible. Hence, decision-makers face the question of which mitigation measures should be prioritised. Benefit-Cost-Analysis (BCA) is a recognized method for determining the economic efficiency of investments in mitigation measures. In Switzerland, the Federal Office for the Environment (FOEN) judges the benefit-cost-ratio of mitigation projects on the basis of the results of the calculation tool "EconoMe" [1]. Checking the economic efficiency of mitigation projects with an investment of more than 1 million CHF (800,000 EUR) using "EconoMe" has been mandatory in Switzerland since 2008. Within "EconoMe", most calculation parameters cannot be changed by the user, allowing for comparable results. Based on the risk guideline "RIKO" [2], an extended version of the operational "EconoMe", called "EconoMe-Develop", was developed. "EconoMe-Develop" is able to deal with various natural hazard processes and thus allows multi-risk assessments, since the restrictions of the operational version of "EconoMe" (e.g., on the number of scenarios and expositions, vulnerability, spatial probability of processes, and probability of presence of objects) do not apply. Additionally, the influence of uncertainty in calculation factors, such as vulnerability, on the final results can be determined. "EconoMe-Develop" offers import and export of data, e.g. results of GIS analysis. The possibility of adapting the tool to user-specific requirements makes "EconoMe-Develop" an easy-to-use tool for risk assessment and for assessing the economic efficiency of mitigation projects for risk experts. In the paper we present the most important features of the tool and illustrate its application with a practical example.
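
    A minimal sketch of the benefit-cost logic such a tool implements, with illustrative field names (annual collective risk summed over scenarios and objects, before and after mitigation, divided by the annualized cost of the measure):

        # Annual collective risk and the benefit-cost ratio of a mitigation measure.
        # Field names are illustrative, not the RIKO schema.
        def annual_risk(scenarios):
            """Each scenario: frequency (1/yr), spatial_prob, presence_prob, vulnerability, value."""
            return sum(s["frequency"] * s["spatial_prob"] * s["presence_prob"]
                       * s["vulnerability"] * s["value"] for s in scenarios)

        def benefit_cost_ratio(scenarios_before, scenarios_after, annual_cost):
            """Risk reduction achieved by the measure divided by its annualized cost."""
            return (annual_risk(scenarios_before) - annual_risk(scenarios_after)) / annual_cost

        # bcr = benefit_cost_ratio(before, after, annual_cost=120_000)  # > 1 means efficient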

  8. Cooperative problem solving with personal mobile information tools in hospitals.

    PubMed

    Buchauer, A; Werner, R; Haux, R

    1998-01-01

    Health-care professionals have a broad range of needs for information and cooperation while working at different points of care (e.g., outpatient departments, wards, and functional units such as operating theaters). Patient-related data and medical knowledge have to be widely available to support high-quality patient care. Furthermore, due to the increased specialization of health-care professionals, efficient collaboration is required. Personal mobile information tools have considerable potential to realize almost ubiquitous information and collaborative support. They make it possible to unite the functionality of conventional tools such as paper forms, dictating machines, and pagers into one tool. Moreover, they can extend the support already provided by clinical workstations. An approach is described for the integration of mobile information tools with heterogeneous hospital information systems. This approach includes the identification of functions which should be provided on mobile tools. Major functions are the presentation of medical records and reports, electronic mailing to support interpersonal communication, and the provision of editors for structured clinical documentation. To realize these functions on mobile tools, we propose a document-based client-server architecture that enables mobile information tools to interoperate with existing computer-based application systems. Open application systems and powerful, partially wireless, hospital-wide networks are the prerequisites for the introduction of mobile information tools.

  9. CrossTalk. The Journal of Defense Software Engineering. Volume 13, Number 6, June 2000

    DTIC Science & Technology

    2000-06-01

    Techniques for Efficiently Generating and Testing Software This paper presents a proven process that uses advanced tools to design, develop and test... optimal software. by Keith R. Wegner Large Software Systems—Back to Basics Development methods that work on small problems seem to not scale well to...Ability Requirements for Teamwork: Implications for Human Resource Management, Journal of Management, Vol. 20, No. 2, 1994. 11. Ferguson, Pat, Watts S

  10. REopt Improves the Operations of Alcatraz's Solar PV-Battery-Diesel Hybrid System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olis, Daniel R; Walker, H. A; Van Geet, Otto D

    This poster identifies operations improvement strategies for a photovoltaic (PV)-battery-diesel hybrid system at the National Park Service's Alcatraz Island using NREL's REopt analysis tool. The current 'cycle charging' strategy results in significant curtailing of energy production from the PV array, requiring excessive diesel use, while also incurring high wear on batteries without benefit of improved efficiency. A simple 'load following' strategy results in near optimal operating cost reduction.
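
    The difference between the two dispatch strategies named above can be sketched with a toy hourly simulation (the battery model, limits, and unit handling are illustrative; REopt's actual optimization is far more detailed):

        # Toy hourly dispatch for a PV-battery-diesel system under two strategies:
        # "load_following" (generator follows the remaining load) vs "cycle_charging"
        # (generator runs at rated power and the surplus recharges the battery).
        def dispatch(load_kw, pv_kw, soc_kwh, batt_cap_kwh, gen_rated_kw, strategy):
            diesel_kwh, curtailed_kwh = 0.0, 0.0
            for load, pv in zip(load_kw, pv_kw):
                net = load - pv                              # net load after PV (1-hour steps)
                if net <= 0:                                 # PV surplus: charge, curtail the rest
                    charge = min(-net, batt_cap_kwh - soc_kwh)
                    soc_kwh += charge
                    curtailed_kwh += -net - charge
                    continue
                discharge = min(net, soc_kwh)                # battery serves what it can
                soc_kwh -= discharge
                remaining = net - discharge
                if remaining <= 0:
                    continue
                if strategy == "load_following":
                    gen = min(remaining, gen_rated_kw)
                else:                                        # "cycle_charging"
                    gen = gen_rated_kw
                    soc_kwh = min(batt_cap_kwh, soc_kwh + max(gen - remaining, 0.0))
                diesel_kwh += gen
            return diesel_kwh, curtailed_kwh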

  11. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
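
    One of the metrics named above, a rank-based probability-of-detection curve, can be sketched as follows (the detection/ground-truth data layout is an assumption for illustration):

        # Rank-based probability of detection: the fraction of ground-truth targets found
        # within the top-N detections when detections are sorted by confidence.
        def pd_at_rank(detections, truth_ids, max_rank):
            """detections: iterable of (confidence, matched_truth_id_or_None)."""
            ranked = sorted(detections, key=lambda d: d[0], reverse=True)
            found, curve = set(), []
            for rank in range(1, max_rank + 1):
                if rank <= len(ranked) and ranked[rank - 1][1] is not None:
                    found.add(ranked[rank - 1][1])
                curve.append(len(found) / len(truth_ids))
            return curve

        # pd_at_rank([(0.9, "t1"), (0.7, None), (0.6, "t2")], truth_ids={"t1", "t2", "t3"}, max_rank=5)
        # -> [1/3, 1/3, 2/3, 2/3, 2/3]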

  12. Open Source GIS based integrated watershed management

    NASA Astrophysics Data System (ADS)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high resolution and process based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate greatly increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address challenging resource management issues in industry, government and nongovernmental agencies. Current research and analysis tools were developed to manage meteorological, climatological, and land and water resource data efficiently at high resolution in space and time. The deliverable for this work is a Whitebox-GENESYS open-source resource management capacity with routines for GIS-based watershed management, including water in agriculture and food production. We are adding urban water management routines through GENESYS in 2013-15 with an engineering PhD candidate. Both Whitebox-GAT and GENESYS are already well-established tools. The proposed research will combine these products to create an open-source geomatics-based water resource management tool that is revolutionary in both capacity and availability to a wide array of Canadian and global users.

  13. A new framework for sustainable hydropower development project

    NASA Astrophysics Data System (ADS)

    Johan, Kartina; Turan, Faiz Mohd; Gani, Nur Syazwani Abdul

    2018-03-01

    This project studies the establishment of a new framework for sustainable hydropower development. A hydropower development is listed as one of the prescribed activities under the Environmental Quality Order 1987. Thus, the Environmental Impact Assessment (EIA) guidelines must be followed to comply with the Department of Environment (DoE) requirements. In order to execute the EIA, an assessment tool to be utilized in the final evaluation phase must be determined. The selected assessment tool is the Systematic Sustainability Assessment (SSA), a new integrated tool to evaluate sustainability performance. A pilot run is conducted in five different departments within the Energy Company to validate the efficiency of the SSA tool. The parameters to be evaluated are constructed in alignment with the Sustainable Development Goals (SDG) to maintain the sustainability features. Consequently, the level of sustainability performance with respect to People, Planet and Profit (3P's) can be determined during the evaluation phase of the hydropower development for continuous improvement.

  14. ARX - A Comprehensive Tool for Anonymizing Biomedical Data

    PubMed Central

    Prasser, Fabian; Kohlmayer, Florian; Lautenschläger, Ronald; Kuhn, Klaus A.

    2014-01-01

    Collaboration and data sharing have become core elements of biomedical research. Especially when sensitive data from distributed sources are linked, privacy threats have to be considered. Statistical disclosure control allows the protection of sensitive data by introducing fuzziness. Reduction of data quality, however, needs to be balanced against gains in protection. Therefore, tools are needed which provide a good overview of the anonymization process to those responsible for data sharing. These tools require graphical interfaces and the use of intuitive and replicable methods. In addition, extensive testing, documentation and openness to reviews by the community are important. Existing publicly available software is limited in functionality, and often active support is lacking. We present ARX, an anonymization tool that i) implements a wide variety of privacy methods in a highly efficient manner, ii) provides an intuitive cross-platform graphical interface, iii) offers a programming interface for integration into other software systems, and iv) is well documented and actively supported. PMID:25954407
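
    As an illustration of one privacy model implemented by tools of this kind, the sketch below checks k-anonymity over a set of generalized quasi-identifiers; it is a minimal example, not ARX's API:

        # k-anonymity check: after generalization, every record must share its combination
        # of quasi-identifier values with at least k-1 other records.
        from collections import Counter

        def is_k_anonymous(records, quasi_identifiers, k):
            """records: list of dicts; quasi_identifiers: column names treated as re-identifying."""
            groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
            return all(count >= k for count in groups.values())

        # rows = [{"age": "30-40", "zip": "531**", "diagnosis": "flu"}, ...]   # generalized data
        # is_k_anonymous(rows, quasi_identifiers=["age", "zip"], k=5)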

  15. Environmental Requirements Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cusack, Laura J.; Bramson, Jeffrey E.; Archuleta, Jose A.

    2015-01-08

    CH2M HILL Plateau Remediation Company (CH2M HILL) is the U.S. Department of Energy (DOE) prime contractor responsible for the environmental cleanup of the Hanford Site Central Plateau. As part of this responsibility, CH2M HILL is faced with the task of complying with thousands of environmental requirements which originate from over 200 federal, state, and local laws and regulations, DOE Orders, waste management and effluent discharge permits, Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) response and Resource Conservation and Recovery Act (RCRA) corrective action documents, and official regulatory agency correspondence. The challenge is to manage this vast number of requirements to ensure they are appropriately and effectively integrated into CH2M HILL operations. Ensuring compliance with a large number of environmental requirements relies on an organization's ability to identify, evaluate, communicate, and verify those requirements. To ensure that compliance is maintained, all changes need to be tracked. CH2M HILL identified that the existing system used to manage environmental requirements was difficult to maintain and that improvements should be made to increase functionality. CH2M HILL established an environmental requirements management procedure and tools to assure that all environmental requirements are effectively and efficiently managed. Having a complete and accurate set of environmental requirements applicable to CH2M HILL operations will promote a more efficient approach to: • Communicating requirements • Planning work • Maintaining work controls • Maintaining compliance

  16. A communication channel model of the software process

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1988-01-01

    Reported here is beginning research into a noisy communication channel analogy of software development process productivity, in order to establish quantifiable behavior and theoretical bounds. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. Also derived is an upper bound to productivity that shows that software reuse is the only means than can lead to unbounded productivity growth; practical considerations of size and cost of reusable components may reduce this to a finite bound.

  17. A communication channel model of the software process

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1988-01-01

    Beginning research into a noisy communication channel analogy of the software development process, undertaken to establish quantifiable behavior and theoretical bounds on productivity, is discussed. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. An upper bound to productivity is derived that shows that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of size and cost of reusable components may reduce this to a finite bound.

  18. Electron-muon ranger: performance in the MICE muon beam

    NASA Astrophysics Data System (ADS)

    Adams, D.; Alekou, A.; Apollonio, M.; Asfandiyarov, R.; Barber, G.; Barclay, P.; de Bari, A.; Bayes, R.; Bayliss, V.; Bene, P.; Bertoni, R.; Blackmore, V. J.; Blondel, A.; Blot, S.; Bogomilov, M.; Bonesini, M.; Booth, C. N.; Bowring, D.; Boyd, S.; Bradshaw, T. W.; Bravar, U.; Bross, A. D.; Cadoux, F.; Capponi, M.; Carlisle, T.; Cecchet, G.; Charnley, C.; Chignoli, F.; Cline, D.; Cobb, J. H.; Colling, G.; Collomb, N.; Coney, L.; Cooke, P.; Courthold, M.; Cremaldi, L. M.; Debieux, S.; DeMello, A.; Dick, A.; Dobbs, A.; Dornan, P.; Drielsma, F.; Filthaut, F.; Fitzpatrick, T.; Franchini, P.; Francis, V.; Fry, L.; Gallagher, A.; Gamet, R.; Gardener, R.; Gourlay, S.; Grant, A.; Graulich, J. S.; Greis, J.; Griffiths, S.; Hanlet, P.; Hansen, O. M.; Hanson, G. G.; Hart, T. L.; Hartnett, T.; Hayler, T.; Heidt, C.; Hills, M.; Hodgson, P.; Hunt, C.; Husi, C.; Iaciofano, A.; Ishimoto, S.; Kafka, G.; Kaplan, D. M.; Karadzhov, Y.; Kim, Y. K.; Kuno, Y.; Kyberd, P.; Lagrange, J.-B.; Langlands, J.; Lau, W.; Leonova, M.; Li, D.; Lintern, A.; Littlefield, M.; Long, K.; Luo, T.; Macwaters, C.; Martlew, B.; Martyniak, J.; Masciocchi, F.; Mazza, R.; Middleton, S.; Moretti, A.; Moss, A.; Muir, A.; Mullacrane, I.; Nebrensky, J. J.; Neuffer, D.; Nichols, A.; Nicholson, R.; Nicola, L.; Noah Messomo, E.; Nugent, J. C.; Oates, A.; Onel, Y.; Orestano, D.; Overton, E.; Owens, P.; Palladino, V.; Pasternak, J.; Pastore, F.; Pidcott, C.; Popovic, M.; Preece, R.; Prestemon, S.; Rajaram, D.; Ramberger, S.; Rayner, M. A.; Ricciardi, S.; Roberts, T. J.; Robinson, M.; Rogers, C.; Ronald, K.; Rothenfusser, K.; Rubinov, P.; Rucinski, P.; Sakamato, H.; Sanders, D. A.; Sandström, R.; Santos, E.; Savidge, T.; Smith, P. J.; Snopok, P.; Soler, F. J. P.; Speirs, D.; Stanley, T.; Stokes, G.; Summers, D. J.; Tarrant, J.; Taylor, I.; Tortora, L.; Torun, Y.; Tsenov, R.; Tunnell, C. D.; Uchida, M. A.; Vankova-Kirilova, G.; Virostek, S.; Vretenar, M.; Warburton, P.; Watson, S.; White, C.; Whyte, C. G.; Wilson, A.; Wisting, H.; Yang, X.; Young, A.; Zisman, M.

    2015-12-01

    The Muon Ionization Cooling Experiment (MICE) will perform a detailed study of ionization cooling to evaluate the feasibility of the technique. To carry out this program, MICE requires an efficient particle-identification (PID) system to identify muons. The Electron-Muon Ranger (EMR) is a fully-active tracking-calorimeter that forms part of the PID system and tags muons that traverse the cooling channel without decaying. The detector is capable of identifying electrons with an efficiency of 98.6%, providing a purity for the MICE beam that exceeds 99.8%. The EMR also proved to be a powerful tool for the reconstruction of muon momenta in the range 100-280 MeV/c.

  19. Demonstration Of Ultra HI-FI (UHF) Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.

    2004-01-01

    Computational aero-acoustics (CAA) requires efficient, high-resolution simulation tools. Most current techniques utilize finite-difference approaches because high order accuracy is considered too difficult or expensive to achieve with finite volume or finite element methods. However, a novel finite volume approach (Ultra HI-FI or UHF) which utilizes Hermite fluxes is presented which can achieve both arbitrary accuracy and fidelity in space and time. The technique can be applied to unstructured grids with some loss of fidelity or with multi-block structured grids for maximum efficiency and resolution. In either paradigm, it is possible to resolve ultra-short waves (less than 2 PPW). This is demonstrated here by solving the 4th CAA workshop Category 1 Problem 1.

  20. Design Optimization of a Variable-Speed Power Turbine

    NASA Technical Reports Server (NTRS)

    Hendricks, Eric S.; Jones, Scott M.; Gray, Justin S.

    2014-01-01

    NASA's Rotary Wing Project is investigating technologies that will enable the development of revolutionary civil tilt rotor aircraft. Previous studies have shown that for large tilt rotor aircraft to be viable, the rotor speeds need to be slowed significantly during the cruise portion of the flight. This requirement to slow the rotors during cruise presents an interesting challenge to the propulsion system designer as efficient engine performance must be achieved at two drastically different operating conditions. One potential solution to this challenge is to use a transmission with multiple gear ratios and shift to the appropriate ratio during flight. This solution will require a large transmission that is likely to be maintenance intensive and will require a complex shifting procedure to maintain power to the rotors at all times. An alternative solution is to use a fixed gear ratio transmission and require the power turbine to operate efficiently over the entire speed range. This concept is referred to as a variable-speed power-turbine (VSPT) and is the focus of the current study. This paper explores the design of a variable speed power turbine for civil tilt rotor applications using design optimization techniques applied to NASA's new meanline tool, the Object-Oriented Turbomachinery Analysis Code (OTAC).

  1. A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition

    DTIC Science & Technology

    2015-10-05

    simulation tool, CREATE™-AV Helios [12-14], a high fidelity rotary wing vehicle simulation tool, and CREATE™-AV DaVinci [15-16], a conceptual through... Scott A. Morton and David R... multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows

  2. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variation of experimental data sets, such as UH60-A data, DNW test data and HART II test data.

  3. Computer modeling in the practice of acoustical consulting: An evolving variety of uses from marketing and diagnosis through design to eventually research

    NASA Astrophysics Data System (ADS)

    Madaras, Gary S.

    2002-05-01

    The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.

  4. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    RIngenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
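
    One low-cost online monitoring idea of the kind discussed above can be sketched as follows: re-run the exact computation on a small random sample of inputs and compare it with the approximate output, so the audit itself stays cheap. Function names and the sampling rate are illustrative, not the thesis's tools.

        # Sampled quality audit: run the cheap approximate function on every input, but
        # re-check a small random fraction against the exact function to estimate error.
        import random

        def run_with_quality_monitor(inputs, approx_fn, exact_fn, sample_rate=0.01, seed=0):
            rng = random.Random(seed)
            outputs, errors = [], []
            for x in inputs:
                y = approx_fn(x)
                outputs.append(y)
                if rng.random() < sample_rate:           # audit only a small sample of the work
                    errors.append(abs(exact_fn(x) - y))
            mean_error = sum(errors) / len(errors) if errors else 0.0
            return outputs, mean_error

        # outputs, err = run_with_quality_monitor(data, approx_fn=fast_version,
        #                                         exact_fn=precise_version, sample_rate=0.02)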

  5. Detailed Modeling of Physical Processes in Electron Sources for Accelerator Applications

    NASA Astrophysics Data System (ADS)

    Chubenko, Oksana; Afanasev, Andrei

    2017-01-01

    At present, electron sources are essential in a wide range of applications - from common technical use to exploring the nature of matter. Depending on the application requirements, different methods and materials are used to generate electrons. State-of-the-art accelerator applications set a number of often-conflicting requirements for electron sources (e.g., quantum efficiency vs. polarization, current density vs. lifetime, etc). Development of advanced electron sources includes modeling and design of cathodes, material growth, fabrication of cathodes, and cathode testing. The detailed simulation and modeling of physical processes is required in order to shed light on the exact mechanisms of electron emission and to develop new-generation electron sources with optimized efficiency. The purpose of the present work is to study physical processes in advanced electron sources and develop scientific tools, which could be used to predict electron emission from novel nano-structured materials. In particular, the area of interest includes bulk/superlattice gallium arsenide (bulk/SL GaAs) photo-emitters and nitrogen-incorporated ultrananocrystalline diamond ((N)UNCD) photo/field-emitters. Work supported by The George Washington University and Euclid TechLabs LLC.

  6. How to Compute a Slot Marker - Calculation of Controller Managed Spacing Tools for Efficient Descents with Precision Scheduling

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas

    2012-01-01

    This paper describes the underlying principles and algorithms for computing the primary controller managed spacing (CMS) tools developed at NASA for precisely spacing aircraft along efficient descent paths. The trajectory-based CMS tools include slot markers, delay indications, and speed advisories. These tools are one of three core NASA technologies integrated in NASA's ATM Technology Demonstration-1 (ATD-1), which will operationally demonstrate the feasibility of fuel-efficient, high-throughput arrival operations using Automatic Dependent Surveillance-Broadcast (ADS-B) and ground-based and airborne NASA technologies for precision scheduling and spacing.
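    The paper's operational algorithms are not reproduced here. As a rough illustration of the slot marker concept only, the marker can be thought of as the along-path position an aircraft would occupy if it were flying the nominal descent profile and arriving exactly at its scheduled time; the sketch below looks that position up on a nominal time-to-fly profile. The profile values and function names are illustrative assumptions.

    ```python
    import numpy as np

    def slot_marker_distance(nominal_time_to_go, nominal_dist_to_fix, sta, now):
        """Return the along-path distance to the meter fix at which a slot
        marker would be drawn: the point on the nominal descent profile whose
        remaining time-to-fly equals the time remaining until the scheduled
        time of arrival (STA). Illustrative sketch, not the CMS algorithm."""
        time_remaining = max(sta - now, 0.0)
        # nominal_time_to_go must be monotonically increasing for np.interp.
        return float(np.interp(time_remaining, nominal_time_to_go, nominal_dist_to_fix))

    # Example nominal profile: 0-900 s of flying time maps to 0-60 NM from the fix.
    t_to_go = np.linspace(0.0, 900.0, 10)     # seconds to the meter fix
    d_to_fix = np.linspace(0.0, 60.0, 10)     # nautical miles to the meter fix
    print(slot_marker_distance(t_to_go, d_to_fix, sta=600.0, now=120.0))  # ~32 NM
    ```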

  7. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information system construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, the people participating in an integration project usually communicate via free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that uses Business Process Model and Notation (BPMN) to model integration requirements and automatically transforms them into an executable integration configuration. Based on the method, a tool was developed to model integration requirements and transform them into an integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.
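    The paper's transformation rules are not given in the abstract; the sketch below only illustrates the general idea of reading BPMN task and sequence-flow elements and emitting an ordered list of integration steps. It matches tags by local name so it is namespace-agnostic, assumes a single path with no gateways, and every field other than the standard BPMN element names is an assumption.

    ```python
    import xml.etree.ElementTree as ET

    def local(tag):
        """Strip the XML namespace, e.g. '{...}serviceTask' -> 'serviceTask'."""
        return tag.rsplit('}', 1)[-1]

    def bpmn_to_config(path):
        """Minimal sketch: collect BPMN tasks and sequence flows and emit a
        list of integration steps ordered along a single, gateway-free path."""
        root = ET.parse(path).getroot()
        tasks, flows = {}, {}
        for el in root.iter():
            name = local(el.tag)
            if name.endswith('ask'):              # task, serviceTask, userTask, ...
                tasks[el.get('id')] = el.get('name', el.get('id'))
            elif name == 'sequenceFlow':
                flows[el.get('sourceRef')] = el.get('targetRef')
        # Start from a task that no flow points to, then walk the chain.
        targets = set(flows.values())
        current = next((t for t in tasks if t not in targets), None)
        steps = []
        while current in tasks:
            steps.append({'step': tasks[current], 'endpoint': None})  # endpoint filled by hand
            current = flows.get(current)
        return steps
    ```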

  8. Novel combined patient instruction and discharge summary tool improves timeliness of documentation and outpatient provider satisfaction

    PubMed Central

    Gilliam, Meredith; Krein, Sarah L; Belanger, Karen; Fowler, Karen E; Dimcheff, Derek E; Solomon, Gabriel

    2017-01-01

    Background: Incomplete or delayed access to discharge information by outpatient providers and patients contributes to discontinuity of care and poor outcomes. Objective: To evaluate the effect of a new electronic discharge summary tool on the timeliness of documentation and communication with outpatient providers. Methods: In June 2012, we implemented an electronic discharge summary tool at our 145-bed university-affiliated Veterans Affairs hospital. The tool facilitates completion of a comprehensive discharge summary note that is available for patients and outpatient medical providers at the time of hospital discharge. Discharge summary note availability, outpatient provider satisfaction, and time between the decision to discharge a patient and discharge note completion were all evaluated before and after implementation of the tool. Results: The percentage of discharge summary notes completed by the time of first post-discharge clinical contact improved from 43% in February 2012 to 100% in September 2012 and was maintained at 100% in 2014. A survey of 22 outpatient providers showed that 90% preferred the new summary and 86% found it comprehensive. Despite increasing required documentation, the time required to discharge a patient, from physician decision to discharge note completion, improved from 5.6 h in 2010 to 4.1 h in 2012 (p = 0.04), and to 2.8 h in 2015 (p < 0.001). Conclusion: The implementation of a novel discharge summary tool improved the timeliness and comprehensiveness of discharge information as needed for the delivery of appropriate, high-quality follow-up care, without adversely affecting the efficiency of the discharge process. PMID:28491308

  9. Novel combined patient instruction and discharge summary tool improves timeliness of documentation and outpatient provider satisfaction.

    PubMed

    Gilliam, Meredith; Krein, Sarah L; Belanger, Karen; Fowler, Karen E; Dimcheff, Derek E; Solomon, Gabriel

    2017-01-01

    Incomplete or delayed access to discharge information by outpatient providers and patients contributes to discontinuity of care and poor outcomes. To evaluate the effect of a new electronic discharge summary tool on the timeliness of documentation and communication with outpatient providers. In June 2012, we implemented an electronic discharge summary tool at our 145-bed university-affiliated Veterans Affairs hospital. The tool facilitates completion of a comprehensive discharge summary note that is available for patients and outpatient medical providers at the time of hospital discharge. Discharge summary note availability, outpatient provider satisfaction, and time between the decision to discharge a patient and discharge note completion were all evaluated before and after implementation of the tool. The percentage of discharge summary notes completed by the time of first post-discharge clinical contact improved from 43% in February 2012 to 100% in September 2012 and was maintained at 100% in 2014. A survey of 22 outpatient providers showed that 90% preferred the new summary and 86% found it comprehensive. Despite increasing required documentation, the time required to discharge a patient, from physician decision to discharge note completion, improved from 5.6 h in 2010 to 4.1 h in 2012 (p = 0.04), and to 2.8 h in 2015 (p < 0.001). The implementation of a novel discharge summary tool improved the timeliness and comprehensiveness of discharge information as needed for the delivery of appropriate, high-quality follow-up care, without adversely affecting the efficiency of the discharge process.

  10. Optical modeling of waveguide coupled TES detectors towards the SAFARI instrument for SPICA

    NASA Astrophysics Data System (ADS)

    Trappe, N.; Bracken, C.; Doherty, S.; Gao, J. R.; Glowacka, D.; Goldie, D.; Griffin, D.; Hijmering, R.; Jackson, B.; Khosropanah, P.; Mauskopf, P.; Morozov, D.; Murphy, A.; O'Sullivan, C.; Ridder, M.; Withington, S.

    2012-09-01

    The next generation of space missions targeting far-infrared wavelengths will require large-format arrays of extremely sensitive detectors. Transition Edge Sensor (TES) array technology is being developed for future far-infrared (FIR) space applications such as the SAFARI instrument for SPICA, where low noise and high sensitivity are required to achieve ambitious science goals. In this paper we describe a modal analysis of multi-moded horn antennas feeding integrating cavities housing TES detectors with superconducting film absorbers. In high-sensitivity TES detector technology, the ability to control the electromagnetic and thermo-mechanical environment of the detector is critical. Simulating and understanding the optical behaviour of such detectors at far-IR wavelengths is difficult and requires development of existing analysis tools. The proposed modal approach offers a computationally efficient technique to describe the partially coherent response of the full pixel in terms of optical efficiency and power leakage between pixels. Initial work carried out as part of an ESA technical research project on optical analysis is described, and a prototype SAFARI pixel design is analyzed in which the optical coupling between the incoming field and the pixel containing the horn, the cavity with an air gap, and the thin absorber layer are all included in the model to allow a comprehensive optical characterization. The modal approach described is based on the mode-matching technique, where the horn and cavity are described in the traditional way while a new technique was developed to include the absorber. Radiation leakage between pixels is also included, making this a powerful analysis tool.

  11. Proteinortho: Detection of (Co-)orthologs in large-scale analysis

    PubMed Central

    2011-01-01

    Background Orthology analysis is an important part of data analysis in many areas of bioinformatics such as comparative genomics and molecular phylogenetics. The ever-increasing flood of sequence data, and hence the rapidly increasing number of genomes that can be compared simultaneously, calls for efficient software tools as brute-force approaches with quadratic memory requirements become infeasible in practice. The rapid pace at which new data become available, furthermore, makes it desirable to compute genome-wide orthology relations for a given dataset rather than relying on relations listed in databases. Results The program Proteinortho described here is a stand-alone tool that is geared towards large datasets and makes use of distributed computing techniques when run on multi-core hardware. It implements an extended version of the reciprocal best alignment heuristic. We apply Proteinortho to compute orthologous proteins in the complete set of all 717 eubacterial genomes available at NCBI at the beginning of 2009. We identified thirty proteins present in 99% of all bacterial proteomes. Conclusions Proteinortho significantly reduces the required amount of memory for orthology analysis compared to existing tools, allowing such computations to be performed on off-the-shelf hardware. PMID:21526987
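    Proteinortho's extended heuristic adds filtering and graph clustering that are not shown here; the sketch below illustrates only the core reciprocal-best-hit test on two precomputed best-hit tables, with made-up gene identifiers and alignment scores.

    ```python
    def reciprocal_best_hits(best_a_to_b, best_b_to_a):
        """Return pairs (a, b) where a's best hit in genome B is b and b's best
        hit in genome A is a. Inputs are dicts: gene -> (best_hit, score)."""
        pairs = []
        for a, (b, _) in best_a_to_b.items():
            back = best_b_to_a.get(b)
            if back is not None and back[0] == a:
                pairs.append((a, b))
        return pairs

    # Toy example with hypothetical gene IDs and scores.
    best_a_to_b = {"geneA1": ("geneB7", 310.0), "geneA2": ("geneB3", 120.0)}
    best_b_to_a = {"geneB7": ("geneA1", 305.0), "geneB3": ("geneA9", 98.0)}
    print(reciprocal_best_hits(best_a_to_b, best_b_to_a))  # [('geneA1', 'geneB7')]
    ```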

  12. Creating an effort tracking tool to improve therapeutic cancer clinical trials workload management and budgeting.

    PubMed

    James, Pam; Bebee, Patty; Beekman, Linda; Browning, David; Innes, Mathew; Kain, Jeannie; Royce-Westcott, Theresa; Waldinger, Marcy

    2011-11-01

    Quantifying data management and regulatory workload for clinical research is a difficult task that would benefit from a robust tool to assess and allocate effort. As in most clinical research environments, The University of Michigan Comprehensive Cancer Center (UMCCC) Clinical Trials Office (CTO) struggled to effectively allocate data management and regulatory time with frequently inaccurate estimates of how much time was required to complete the specific tasks performed by each role. In a dynamic clinical research environment in which volume and intensity of work ebbs and flows, determining requisite effort to meet study objectives was challenging. In addition, a data-driven understanding of how much staff time was required to complete a clinical trial was desired to ensure accurate trial budget development and effective cost recovery. Accordingly, the UMCCC CTO developed and implemented a Web-based effort-tracking application with the goal of determining the true costs of data management and regulatory staff effort in clinical trials. This tool was developed, implemented, and refined over a 3-year period. This article describes the process improvement and subsequent leveling of workload within data management and regulatory that enhanced the efficiency of UMCCC's clinical trials operation.

  13. Dashboard systems: implementing pharmacometrics from bench to bedside.

    PubMed

    Mould, Diane R; Upton, Richard N; Wojciechowski, Jessica

    2014-09-01

    In recent years, there has been increasing interest in the development of medical decision-support tools, including dashboard systems. Dashboard systems are software packages that integrate information and calculations about therapeutics from multiple components into a single interface for use in the clinical environment. Given the high cost of medical care, and the increasing need to demonstrate positive clinical outcomes for reimbursement, dashboard systems may become an important tool for improving patient outcome, improving clinical efficiency and containing healthcare costs. Similarly the costs associated with drug development are also rising. The use of model-based drug development (MBDD) has been proposed as a tool to streamline this process, facilitating the selection of appropriate doses and making informed go/no-go decisions. However, complete implementation of MBDD has not always been successful owing to a variety of factors, including the resources required to provide timely modeling and simulation updates. The application of dashboard systems in drug development reduces the resource requirement and may expedite updating models as new data are collected, allowing modeling results to be available in a timely fashion. In this paper, we present some background information on dashboard systems and propose the use of these systems both in the clinic and during drug development.

  14. SU-E-T-211: Comparison of Seven New TrueBeam Linacs with Enhanced Beam Data Conformance Using a Beam Comparison Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grzetic, S; Hessler, J; Gupta, N

    2015-06-15

    Purpose: To develop an independent software tool to assist in commissioning linacs with enhanced beam conformance, as well as perform ongoing QA for dosimetrically equivalent linacs. Methods: Linac manufacturers offer enhanced beam conformance as an option to allow for clinics to complete commissioning efficiently, as well as implement dosimetrically equivalent linacs. The specification for enhanced conformance includes PDD as well as profiles within 80% FWHM. Recently, we commissioned seven Varian TrueBeam linacs with enhanced beam conformance. We developed a software tool in Visual Basic to allow us to load the reference beam data and compare our beam data during commissioning to evaluate enhanced beam conformance. This tool also allowed us to upload our beam data used for commissioning our dosimetrically equivalent beam models to compare and tweak each of our linac beams to match our modelled data in Varian’s Eclipse TPS. This tool will also be used during annual QA of the linacs to compare our beam data to our baseline data, as required by TG-142. Results: Our software tool was used to check beam conformance for seven TrueBeam linacs that we commissioned in the past six months. Using our tool we found that the factory conformed linacs showed up to 3.82% difference in their beam profile data upon installation. Using our beam comparison tool, we were able to adjust the energy and profiles of our beams to accomplish a better than 1.00% point by point data conformance. Conclusion: The availability of quantitative comparison tools is essential to accept and commission linacs with enhanced beam conformance, as well as to beam match multiple linacs. We further intend to use the same tool to ensure our beam data conforms to the commissioning beam data during our annual QA in keeping with the requirements of TG-142.
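    The abstract does not give the comparison algorithm, and the authors' tool is written in Visual Basic; the Python sketch below only illustrates one plausible form of a point-by-point profile comparison, with the reference data resampled onto the measured positions and differences expressed relative to the central-axis value. All array contents and tolerances are placeholders.

    ```python
    import numpy as np

    def max_profile_difference(pos_meas, dose_meas, pos_ref, dose_ref):
        """Resample the reference profile onto the measured positions and return
        the maximum point-by-point difference as a percentage of the reference
        central-axis (position = 0) value. Illustrative sketch only."""
        ref_on_meas = np.interp(pos_meas, pos_ref, dose_ref)
        cax = np.interp(0.0, pos_ref, dose_ref)
        return float(np.max(np.abs(dose_meas - ref_on_meas)) / cax * 100.0)

    # Placeholder profile data (positions in cm, relative dose).
    pos = np.linspace(-10, 10, 41)
    reference = 100.0 * np.exp(-(pos / 12.0) ** 4)
    measured = reference * (1.0 + 0.005 * np.sin(pos))   # synthetic ~0.5% ripple
    print(round(max_profile_difference(pos, measured, pos, reference), 2), "%")
    ```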

  15. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, John C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The collaboration includes work by NASA research engineers; CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL-type aircraft is focusing on geometries that depend on advanced flow control technologies that include Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.

  16. Online tools for individuals with depression and neurologic conditions: A scoping review.

    PubMed

    Lukmanji, Sara; Pham, Tram; Blaikie, Laura; Clark, Callie; Jetté, Nathalie; Wiebe, Samuel; Bulloch, Andrew; Holroyd-Leduc, Jayna; Macrodimitris, Sophia; Mackie, Aaron; Patten, Scott B

    2017-08-01

    Patients with neurologic conditions commonly have depression. Online tools have the potential to improve outcomes in these patients in an efficient and accessible manner. We aimed to identify evidence-informed online tools for patients with comorbid neurologic conditions and depression. A scoping review of online tools (free, publicly available, and not requiring a facilitator) for patients with depression and epilepsy, Parkinson disease (PD), multiple sclerosis (MS), traumatic brain injury (TBI), or migraine was conducted. MEDLINE, EMBASE, PsycINFO, Cochrane Database of Systematic Reviews, and Cochrane CENTRAL Register of Controlled Trials were searched from database inception to January 2017 for all 5 neurologic conditions. Gray literature using Google and Google Scholar as well as app stores for both Android and Apple devices were searched. Self-management or self-efficacy online tools were not included unless they were specifically targeted at depression and one of the neurologic conditions and met the other eligibility criteria. Only 4 online tools were identified. Of these 4 tools, 2 were web-based self-management programs for patients with migraine or MS and depression. The other 2 were mobile apps for patients with PD or TBI and depression. No online tools were found for epilepsy. There are limited depression tools for people with neurologic conditions that are evidence-informed, publicly available, and free. Future research should focus on the development of high-quality, evidence-based online tools targeted at neurologic patients.

  17. Initial Investigations of Controller Tools and Procedures for Schedule-Based Arrival Operations with Mixed Flight-Deck Interval Management Equipage

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Cabrall, Christopher; Kupfer, Michael; Omar, Faisal G.; Prevot, Thomas

    2012-01-01

    NASA's Air Traffic Management Demonstration-1 (ATD-1) is a multi-year effort to demonstrate high-throughput, fuel-efficient arrivals at a major U.S. airport using NASA-developed scheduling automation, controller decision-support tools, and ADS-B-enabled Flight-Deck Interval Management (FIM) avionics. First-year accomplishments include the development of a concept of operations for managing scheduled arrivals flying Optimized Profile Descents with equipped aircraft conducting FIM operations, and the integration of laboratory prototypes of the core ATD-1 technologies. Following each integration phase, a human-in-the-loop simulation was conducted to evaluate and refine controller tools, procedures, and clearance phraseology. From a ground-side perspective, the results indicate the concept is viable and the operations are safe and acceptable. Additional training is required for smooth operations that yield notable benefits, particularly in the areas of FIM operations and clearance phraseology.

  18. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  19. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  20. The Simple Video Coder: A free tool for efficiently coding social video data.

    PubMed

    Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C

    2017-08-01

    Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad-hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder: free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.
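    The Simple Video Coder's own output format is not described in the abstract; the following generic sketch only shows how a list of coded events (label, onset, offset in seconds) can be reduced to the timing, frequency, and duration measures mentioned above. The event labels are made up.

    ```python
    from collections import defaultdict

    def summarize_events(events):
        """events: iterable of (label, onset_s, offset_s) tuples.
        Returns {label: {'frequency', 'total_duration', 'mean_duration'}}."""
        durations = defaultdict(list)
        for label, onset, offset in events:
            durations[label].append(offset - onset)
        return {
            label: {
                "frequency": len(d),
                "total_duration": sum(d),
                "mean_duration": sum(d) / len(d),
            }
            for label, d in durations.items()
        }

    coded = [("approach", 1.0, 4.5), ("approach", 10.2, 12.0), ("withdraw", 5.0, 6.5)]
    print(summarize_events(coded))
    ```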

  1. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module “miRNA identification” includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module “mRNA identification” includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module “Target screening” provides expression profiling analyses and graphic visualization. The module “Self-testing” offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program’s functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  2. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity of performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.
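    eRNA itself wraps established tools (Bowtie, miRDeep2, miRspring) behind its GUIs; the sketch below is not eRNA code, only a toy illustration of two of the steps those GUIs expose, 3' adapter removal and read counting, on an uncompressed FASTQ file. The adapter string and length cutoff are example parameters, not eRNA defaults.

    ```python
    from collections import Counter

    def trim_and_count(fastq_path, adapter="TGGAATTCTCGGGTGCCAAGG", min_len=16):
        """Strip a 3' adapter from each read and count identical trimmed
        sequences (a crude stand-in for small-RNA read collapsing)."""
        counts = Counter()
        with open(fastq_path) as fh:
            for i, line in enumerate(fh):
                if i % 4 != 1:          # sequence lines are the 2nd of each 4-line record
                    continue
                seq = line.strip()
                cut = seq.find(adapter)
                if cut != -1:
                    seq = seq[:cut]
                if len(seq) >= min_len:
                    counts[seq] += 1
        return counts
    ```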

  3. Demonstration of Lightweight Engineering Solutions for a Low-Cost Safe Explosive Ordnance Destruct Tool

    DTIC Science & Technology

    2007-12-01

    Staff) and Mr. Doug Learned ( Intercity Manufacturing), whose efficiency and expertise was vital in manufacturing the parts required for our tests...detonation products caused by the hollow cavity. Upon initiation of a hollow lined charge, the resulting high pressure shock wave travels outward...5.6 km/s for the brass encased charge at 2 and 3 CD. This indicates that the jet must be traveling at velocities greater than the estimates, which

  4. Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.

    2016-01-01

    A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
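    Pycycle's OpenMDAO-based implementation is not shown here; the toy SciPy example below only illustrates why analytic derivatives help: supplying an exact gradient lets a gradient-based optimizer avoid the extra function evaluations that finite differencing requires. The objective is an arbitrary smooth test function, not an engine cycle model.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        # Rosenbrock test objective, standing in for an expensive cycle model.
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

    def grad_f(x):
        # Analytic gradient of f.
        return np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2),
        ])

    x0 = np.array([-1.2, 1.0])
    fd = minimize(f, x0, method="BFGS")               # gradient by finite differences
    an = minimize(f, x0, method="BFGS", jac=grad_f)   # analytic gradient supplied
    print("finite-difference objective evaluations:", fd.nfev)
    print("analytic-gradient objective evaluations:", an.nfev)
    ```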

  5. The value of job analysis, job description and performance.

    PubMed

    Wolfe, M N; Coggins, S

    1997-01-01

    All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.

  6. [Enriching the diagnosis announcement system with the coordination passport].

    PubMed

    Bertrand, Nathalie

    2016-05-01

    The personalised care plan of a person with cancer requires proper coordination between the various professionals involved in their care at the different stages of their illness. In order to organise this coordination efficiently, for the patient as well as for the health professionals, an oncology hospital team has developed a practical and modular tool. The coordination passport enriches the diagnosis announcement system used in the hospital. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  7. Application of the Booth-Kautzmann method for the determination of N-2 packing leakage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burkhart, D.M.; Milton, J.W.; Fawcett, S.T.

    1995-06-01

    To accurately determine turbine cycle heat rate, leakage past the N-2 steam seal packing must be determined on turbines with both HP and IP turbines contained within a common high-pressure casing. N-2 packing leakage can be determined by the Booth-Kautzmann Method with instrumentation commonly used to determine the HP and IP turbine efficiency. The only additional requirements are changes to the main steam and/or hot reheat steam conditions. This paper discusses the actual test results using the Booth-Kautzmann test procedure on three natural gas fired units. The test results demonstrate the added advantage of having at least three N-2 test runs, stability requirements for repeatable test runs, and test procedures used to determine leakage results. The sensitivity of the results to the assumed N-2 enthalpy is also addressed. Utilizing Martin's formula with a series of N-2 leakage test runs is shown to be a leakage prediction tool and a packing clearance approximation tool. It is concluded that the Booth-Kautzmann Method for determination of N-2 packing leakage should be utilized whenever HP and IP turbine efficiency is determined. The two or three additional hours invested in the test runs is well worth the information gained on the performance of the N-2 packing.

  8. A Fixed Point VHDL Component Library for a High Efficiency Reconfigurable Radio Design Methodology

    NASA Technical Reports Server (NTRS)

    Hoy, Scott D.; Figueiredo, Marco A.

    2006-01-01

    Advances in Field Programmable Gate Array (FPGA) technologies enable the implementation of reconfigurable radio systems for both ground and space applications. The development of such systems challenges the current design paradigms and requires more robust design techniques to meet the increased system complexity. Among these techniques is the development of component libraries to reduce design cycle time and to improve design verification, consequently increasing the overall efficiency of the project development process while increasing design success rates and reducing engineering costs. This paper describes the reconfigurable radio component library developed at the Software Defined Radio Applications Research Center (SARC) at Goddard Space Flight Center (GSFC) Microwave and Communications Branch (Code 567). The library is a set of fixed-point VHDL components that link the Digital Signal Processing (DSP) simulation environment with the FPGA design tools. This provides a direct synthesis path based on the latest developments of the VHDL tools as proposed by the IEEE VHDL 2004 effort, which allows for the simulation and synthesis of fixed-point math operations while maintaining bit and cycle accuracy. The VHDL Fixed Point Reconfigurable Radio Component library does not require the use of FPGA vendor-specific automatic component generators and provides a generic path from high-level DSP simulations implemented in Mathworks Simulink to any FPGA device. Access to the components' synthesizable source code provides full design verification capability.

  9. Challenges to Integrating Geographically-Dispersed Data and Expertise at U.S. Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Murray, T. L.; Ewert, J. W.

    2010-12-01

    During the past 10 years the data and information available to volcano observatories to assess hazards and forecast activity has grown dramatically, a trend that will likely continue. Similarly, the ability of observatories to draw upon external specialists who can provide needed expertise is also increasing. Though technology easily provides the ability to move large amounts of information to the observatory, the challenge remains to efficiently and quickly integrate useful information and expertise into the decision-making process. The problem is further exacerbated by the use of new research techniques during times of heightened activity. Eruptive periods typically accelerate research into volcanic processes as scientists use the opportunity to test new hypotheses and develop new tools. Such experimental methods can be extremely insightful, but may be less easily integrated into the normal data streams that inform decisions. Similarly, there is an increased need for collaborative tools that allow efficient and effective communication between the observatory and external experts. Observatories will continue to be the central focus for integrating information, assessing hazards, and communicating with the public, but will increasingly draw on experts at other observatories, government agencies, academia and even the private sector, both foreign and domestic, to provide analysis and assistance. Fostering efficient communication among such a diverse and geographically dispersed group is a challenge. Addressing these challenges is one of the goals of the U.S. National Volcano Early Warning System, falling under the effort to improve interoperability among the five U.S. volcano observatories and their collaborators. In addition to providing the mechanisms to handle the flow of data, efforts will be directed at simplifying - though retaining the required nuance - information and merging data streams while developing tools that enable observatory staff to quickly integrate the data into the decision-making process. Also, advances in the use of collaborative tools and organizational structure will be required if observatories are to tap into the intellectual resources throughout the volcanological community. The last 10 years saw a continuing explosion in the quantity and quality of data and expertise available to address volcano hazards and volcanic activity; the challenge over the next 10 years will be for us to make the best use of it.

  10. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
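    The paper's notebooks are site- and application-specific; the generic illustration below only shows the approach of exposing a single model parameter as a slider inside an IPython/Jupyter notebook using the ipywidgets library. The plotted "model" is a placeholder recession curve, and the parameter range is arbitrary.

    ```python
    # Run inside a Jupyter/IPython notebook with ipywidgets and matplotlib installed.
    import numpy as np
    import matplotlib.pyplot as plt
    from ipywidgets import interact, FloatSlider

    def plot_recession(k=0.05):
        """Placeholder 'model': an exponential recession curve whose rate
        constant k is exposed to the user as a slider."""
        t = np.linspace(0, 100, 200)
        plt.plot(t, 10.0 * np.exp(-k * t))
        plt.xlabel("time (d)")
        plt.ylabel("discharge (m$^3$/s)")
        plt.show()

    interact(plot_recession, k=FloatSlider(min=0.01, max=0.2, step=0.01, value=0.05))
    ```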

  11. On the use of satellite data to implement a parsimonious ecohydrological model in the upper Ewaso Ngiro river basin

    NASA Astrophysics Data System (ADS)

    Ruiz-Pérez, G.

    2015-12-01

    Drylands are extensive, covering 30% of the Earth's land surface and 50% of Africa. Projections of the IPCC (Intergovernmental Panel on Climate Change, 2007) indicate that the extent of these regions has a high probability of increasing, with a considerable additional impact on water resources that should be taken into account by water management plans. In these water-controlled areas, vegetation plays a key role in the water cycle. Ecohydrological models provide a tool to investigate the relationships between vegetation and water resources. However, studies in Africa often face the problem that many ecohydrological models have quite extensive parametric requirements, while available data are scarce. Therefore, there is a need for assessments using models whose requirements match the data availability. In that context, parsimonious models, together with available remote sensing information, can be valuable tools for ecohydrological studies. For this reason, we have focused on the use of a parsimonious model based on the amount of photosynthetically active radiation absorbed by green vegetation (APAR) and the Light Use Efficiency index (the efficiency with which that radiation is converted to plant biomass increment) in order to compute the gross primary production (GPP). This model has been calibrated using only remote sensing data (particularly, NDVI data from MODIS products) in order to explore the potential of satellite information in implementing a simple distributed model. The model has subsequently been validated against streamflow data with the aim of defining a tool able to account for land-use characteristics in describing the water budget.
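    The abstract describes a standard light-use-efficiency formulation, GPP = ε × fPAR × PAR. The sketch below implements that relation with an illustrative linear NDVI-to-fPAR conversion; the coefficients and units are placeholders, not the calibrated values of the study.

    ```python
    import numpy as np

    def gpp_lue(ndvi, par, epsilon=1.2, a=1.25, b=-0.10):
        """Light-use-efficiency model: GPP = epsilon * fPAR * PAR.
        fPAR is approximated by a clipped linear function of NDVI.
        Illustrative units: PAR in MJ m-2 d-1, epsilon in gC MJ-1."""
        fpar = np.clip(a * np.asarray(ndvi) + b, 0.0, 0.95)
        return epsilon * fpar * np.asarray(par)

    # Example: three MODIS-like NDVI values and daily PAR totals (made up).
    ndvi = np.array([0.25, 0.40, 0.55])
    par = np.array([9.0, 10.5, 11.0])
    print(gpp_lue(ndvi, par))   # gC m-2 d-1
    ```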

  12. An opportunity cost approach to sample size calculation in cost-effectiveness analysis.

    PubMed

    Gafni, A; Walter, S D; Birch, S; Sendi, P

    2008-01-01

    The inclusion of economic evaluations as part of clinical trials has led to concerns about the adequacy of trial sample size to support such analysis. The analytical tool of cost-effectiveness analysis is the incremental cost-effectiveness ratio (ICER), which is compared with a threshold value (lambda) as a method to determine the efficiency of a health-care intervention. Accordingly, many of the methods suggested for calculating the sample size requirements for the economic component of clinical trials are based on the properties of the ICER. However, use of the ICER and a threshold value as a basis for determining efficiency has been shown to be inconsistent with the economic concept of opportunity cost. As a result, the validity of the ICER-based approaches to sample size calculations can be challenged. Alternative methods for determining improvements in efficiency that do not depend upon ICER values have been presented in the literature. In this paper, we develop an opportunity cost approach to calculating sample size for economic evaluations alongside clinical trials, and illustrate the approach using a numerical example. We compare the sample size requirement of the opportunity cost method with the ICER threshold method. In general, either method may yield the larger required sample size. However, the opportunity cost approach, although simple to use, has additional data requirements. We believe that the additional data requirements represent a small price to pay for being able to perform an analysis consistent with both the concept of opportunity cost and the problem faced by decision makers. Copyright (c) 2007 John Wiley & Sons, Ltd.
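    For readers less familiar with the notation, the ICER decision rule that the authors contrast with their opportunity-cost approach is the standard one:

    ```latex
    \mathrm{ICER} = \frac{\Delta C}{\Delta E} = \frac{C_1 - C_0}{E_1 - E_0},
    \qquad \text{adopt the new intervention if } \mathrm{ICER} < \lambda
    \quad (\text{for } \Delta E > 0).
    ```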

  13. COMMUNICATING THE PARAMETER UNCERTAINTY IN THE IQWIG EFFICIENCY FRONTIER TO DECISION-MAKERS

    PubMed Central

    Stollenwerk, Björn; Lhachimi, Stefan K; Briggs, Andrew; Fenwick, Elisabeth; Caro, Jaime J; Siebert, Uwe; Danner, Marion; Gerber-Grote, Andreas

    2015-01-01

    The Institute for Quality and Efficiency in Health Care (IQWiG) developed—in a consultation process with an international expert panel—the efficiency frontier (EF) approach to satisfy a range of legal requirements for economic evaluation in Germany's statutory health insurance system. The EF approach is distinctly different from other health economic approaches. Here, we evaluate established tools for assessing and communicating parameter uncertainty in terms of their applicability to the EF approach. Among these are tools that perform the following: (i) graphically display overall uncertainty within the IQWiG EF (scatter plots, confidence bands, and contour plots) and (ii) communicate the uncertainty around the reimbursable price. We found that, within the EF approach, most established plots were not always easy to interpret. Hence, we propose the use of price reimbursement acceptability curves—a modification of the well-known cost-effectiveness acceptability curves. Furthermore, it emerges that the net monetary benefit allows an intuitive interpretation of parameter uncertainty within the EF approach. This research closes a gap for handling uncertainty in the economic evaluation approach of the IQWiG methods when using the EF. However, the precise consequences of uncertainty when determining prices are yet to be defined. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24590819
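    As a reminder of the notation, the net monetary benefit referred to above is, in its generic form, a rescaling of incremental effects and costs onto a single monetary axis (with the willingness-to-pay value in the EF approach derived from the efficiency frontier rather than taken as a fixed external threshold):

    ```latex
    \mathrm{NMB} = \lambda \,\Delta E - \Delta C ,
    \qquad \mathrm{NMB} > 0 \;\Longleftrightarrow\; \frac{\Delta C}{\Delta E} < \lambda
    \quad (\Delta E > 0).
    ```

    Unlike the ratio, NMB is a linear function of the incremental costs and effects, which is what makes its parameter uncertainty straightforward to summarize and display.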

  14. 3Drefine: an interactive web server for efficient protein structure refinement

    PubMed Central

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-01-01

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of hydrogen bonding network combined with atomic-level energy minimization on the optimized model using a composite physics and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, provided example submission and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet that is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371

  15. FORAGES AND PASTURES SYMPOSIUM: Improving efficiency of production in pasture- and range-based beef and dairy systems.

    PubMed

    Mulliniks, J T; Rius, A G; Edwards, M A; Edwards, S R; Hobbs, J D; Nave, R L G

    2015-06-01

    Despite overall increased production in the last century, it is critical that grazing production systems focus on improving beef and dairy efficiency to meet current and future global food demands. For livestock producers, production efficiency is essential to maintain long-term profitability and sustainability. This continued viability of production systems using pasture- and range-based grazing systems requires more rapid adoption of innovative management practices and selection tools that increase profitability by optimizing grazing management and increasing reproductive performance. Understanding the genetic variation in cow herds will provide the ability to select cows that require less energy for maintenance, which can potentially reduce total energy utilization or energy required for production, consequently improving production efficiency and profitability. In the United States, pasture- and range-based grazing systems vary tremendously across various unique environments that differ in climate, topography, and forage production. This variation in environmental conditions contributes to the challenges of developing or targeting specific genetic components and grazing systems that lead to increased production efficiency. However, across these various environments and grazing management systems, grazable forage remains the least expensive nutrient source to maintain productivity of the cow herd. Beef and dairy cattle can capitalize on their ability to utilize these feed resources that are not usable for other production industries. Therefore, lower-cost alternatives to feeding harvested and stored feedstuffs have the opportunity to provide to livestock producers a sustainable and efficient forage production system. However, increasing production efficiency within a given production environment would vary according to genetic potential (i.e., growth and milk potential), how that genetic potential fits the respective production environment, and how the grazing management fits within those genetic parameters. Therefore, matching cow type or genetic potential to the production environment is and will be more important as cost of production increases.

  16. Radiology metrics for safe use and regulatory compliance with CT imaging

    NASA Astrophysics Data System (ADS)

    Paden, Robert; Pavlicek, William

    2018-03-01

    The MACRA Act creates a Merit-Based Payment System, with monitoring patient exposure from CT providing one possible quality metric for meeting merit requirements. Quality metrics are also required by The Joint Commission, ACR, and CMS as facilities are tasked to perform reviews of CT irradiation events outside of expected ranges, review protocols for appropriateness, and validate parameters for low dose lung cancer screening. In order to efficiently collect and analyze irradiation events and associated DICOM tags, all clinical CT devices were DICOM connected to a parser which extracted dose related information for storage into a database. Dose data from every exam is compared to the appropriate external standard exam type. AAPM recommended CTDIvol values for head and torso, adult and pediatrics, coronary and perfusion exams are used for this study. CT doses outside the expected range were automatically formatted into a report for analysis and review documentation. CT Technologist textual content, the reason for proceeding with an irradiation above the recommended threshold, is captured for inclusion in the follow up reviews by physics staff. The use of a knowledge based approach in labeling individual protocol and device settings is a practical solution resulting in efficiency of analysis and review. Manual methods would require approximately 150 person-hours for our facility, exclusive of travel time and independent of device availability. An efficiency of 89% time savings occurs through use of this informatics tool including a low dose CT comparison review and low dose lung cancer screening requirements set forth by CMS.
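    The site's parser and database schema are not described in detail here; a minimal illustration of the extraction step, reading the CTDIvol attribute from CT image headers with the pydicom library and flagging values above an exam-specific review threshold, could look like the following. The threshold numbers are placeholders to be filled in from the AAPM-recommended values, not figures from this paper.

    ```python
    import pydicom

    # Placeholder review thresholds in mGy, keyed by exam category
    # (replace with the AAPM-recommended values used at your site).
    REVIEW_THRESHOLDS = {"adult_head": 80.0, "adult_torso": 50.0}

    def flag_high_dose(dicom_path, exam_category):
        """Return (ctdi_vol, flagged) for one CT image header.
        CTDIvol is DICOM tag (0018,9345); it is not present in every object."""
        ds = pydicom.dcmread(dicom_path, stop_before_pixels=True)
        ctdi = getattr(ds, "CTDIvol", None)
        if ctdi is None:
            return None, False
        return float(ctdi), float(ctdi) > REVIEW_THRESHOLDS[exam_category]
    ```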

  17. Developing a Collection of Composable Data Translation Software Units to Improve Efficiency and Reproducibility in Ecohydrologic Modeling Workflows

    NASA Astrophysics Data System (ADS)

    Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.

    2017-12-01

    Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools to assess impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings like precipitation, temperature, humidity, windspeed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation measures, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane but critical step in the application workflows. Translation steps can introduce errors, misrepresentations of data, slow execution time, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the Parflow integrated hydrologic model to demonstrate the impact of translation tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community-approved collection of data transformation components. The components should be self-contained composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target minute tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement. Composing components within a single streaming process reduces data movement. Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of bytes to be read as the number of computations it performs. When several components' executions are coordinated, the overall arithmetic intensity increases, leading to increased efficiency.
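    The authors' components target WRF-to-ParFlow forcing preparation and are not reproduced here; the sketch below only illustrates the design pattern being advocated: small, self-contained translation units composed into a single pipeline so data are read once and moved as little as possible. The variable names and conversions are illustrative.

    ```python
    from functools import reduce
    import numpy as np

    # Each translation unit is a small, self-contained function: dict -> dict.
    def subset_hours(fields, start, stop):
        return {k: v[start:stop] for k, v in fields.items()}

    def kelvin_to_celsius(fields):
        out = dict(fields)
        out["T2"] = out["T2"] - 273.15
        return out

    def rename(fields, mapping):
        return {mapping.get(k, k): v for k, v in fields.items()}

    def compose(*units):
        """Compose translation units into one pipeline, applied left to right."""
        return lambda fields: reduce(lambda acc, unit: unit(acc), units, fields)

    # Illustrative forcing: 24 hourly values of 2 m temperature and precipitation.
    forcing = {"T2": np.full(24, 290.0), "RAINNC": np.zeros(24)}
    pipeline = compose(
        lambda f: subset_hours(f, 6, 18),
        kelvin_to_celsius,
        lambda f: rename(f, {"RAINNC": "precip"}),
    )
    print(pipeline(forcing)["T2"][:3])
    ```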

  18. Exploring the Role of Social Memory of Floods for Designing Flood Early Warning Operations

    NASA Astrophysics Data System (ADS)

    Girons Lopez, Marc; Di Baldassarre, Giuliano; Grabs, Thomas; Halldin, Sven; Seibert, Jan

    2016-04-01

    Early warning systems are an important tool for natural disaster mitigation practices, especially for flooding events. Warnings rely on near-future forecasts to provide time to take preventive actions before a flood occurs, thus reducing potential losses. However, on top of the technical capacities, successful warnings require an efficient coordination and communication among a range of different actors and stakeholders. The complexity of integrating the technical and social spheres of warning systems has, however, resulted in system designs neglecting a number of important aspects such as social awareness of floods thus leading to suboptimal results. A better understanding of the interactions and feedbacks among the different elements of early warning systems is therefore needed to improve their efficiency and therefore social resilience. When designing an early warning system two important decisions need to be made regarding (i) the hazard magnitude at and from which a warning should be issued and (ii) the degree of confidence required for issuing a warning. The first decision is usually taken based on the social vulnerability and climatic variability while the second one is related to the performance (i.e. accuracy) of the forecasting tools. Consequently, by estimating the vulnerability and the accuracy of the forecasts, these two variables can be optimized to minimize the costs and losses. Important parameters with a strong influence on the efficiency of warning systems such as social awareness are however not considered in their design. In this study we present a theoretical exploration of the impact of social awareness on the design of early warning systems. For this purpose we use a definition of social memory of flood events as a proxy for flood risk awareness and test its effect on the optimization of the warning system design variables. Understanding the impact of social awareness on warning system design is important to make more robust warnings that can better adapt to different social settings and more efficiently reduce vulnerability.

  19. CFD Fuel Slosh Modeling of Fluid-Structure Interaction in Spacecraft Propellant Tanks with Diaphragms

    NASA Technical Reports Server (NTRS)

    Sances, Dillon J.; Gangadharan, Sathya N.; Sudermann, James E.; Marsell, Brandon

    2010-01-01

    Liquid sloshing within spacecraft propellant tanks causes rapid energy dissipation at resonant modes, which can result in attitude destabilization of the vehicle. Identifying resonant slosh modes currently requires experimental testing and mechanical pendulum analogs to characterize the slosh dynamics. Computational Fluid Dynamics (CFD) techniques have recently been validated as an effective tool for simulating fuel slosh within free-surface propellant tanks. Propellant tanks often incorporate an internal flexible diaphragm to separate ullage and propellant which increases modeling complexity. A coupled fluid-structure CFD model is required to capture the damping effects of a flexible diaphragm on the propellant. ANSYS multidisciplinary engineering software employs a coupled solver for analyzing two-way Fluid Structure Interaction (FSI) cases such as the diaphragm propellant tank system. Slosh models generated by ANSYS software are validated by experimental lateral slosh test results. Accurate data correlation would produce an innovative technique for modeling fuel slosh within diaphragm tanks and provide an accurate and efficient tool for identifying resonant modes and the slosh dynamic response.

  20. Sterilization of medical equipment and contaminated articles by making use of a resistive barrier discharge

    NASA Astrophysics Data System (ADS)

    Uhm, Han S.; Kang, Jung G.; Choi, Eun H.; Cho, Guang S.

    2012-08-01

    Presented here is an apparatus consisting of an atmospheric resistive-barrier discharge for the sterilization of medical tools wrapped in typical hospital cloths, for the sterilization of manufactured drugs in typical packaging materials, and for the sterilization of biologically-contaminated articles. The sterilization apparatus consists of layers of the resistive-barrier discharge device operating at room temperature, a sterilization chamber, and an ozone destruction device. An electrical discharge in the resistive-barrier discharge system generates an atmospheric plasma in oxygen gas, generating ozone, which in turn efficiently sterilizes medical tools and biologically contaminated articles at room temperature. A sterilization experiment was carried out at an apparatus volume of 100 liters, with a sterilization chamber volume of 60 liters, and a discharge device volume of 40 liters. The sterilization in this experiment required 60 W of power for 5 hours of residence time. For a given sterilization time, the required electrical power was proportional to the apparatus volume. Ozone in the sterilization chamber was destroyed safely after sterilization.

  1. Using naturalistic driving films as a design tool for investigating driver requirements in HMI design for ADAS.

    PubMed

    Wang, Minjuan; Sun, Dong; Chen, Fang

    2012-01-01

    In recent years, many naturalistic driving projects have been conducted, such as the 100-Car Project (a naturalistic driving study in the United States), EuroFOT (European Large-Scale Field Operational Tests on Vehicle Systems), and SeMiFOT (Sweden-Michigan Naturalistic Field Operational Test). However, these valuable naturalistic driving data have not yet been applied to Human-Machine Interaction (HMI) design for Advanced Driver Assistance Systems (ADAS); a good HMI design for ADAS requires a deep understanding of the driving environment and the interactions between the driven car and other road users in different situations. The results demonstrated the benefits of using naturalistic driving films as a means of enhancing focus group discussions for better understanding drivers' needs and traffic environment constraints. It provided an efficient tool for designers to gain inside knowledge about driving and the needs for information presentation. Recommendations for how to apply this method are discussed in the paper.

  2. Data base architecture for instrument characteristics critical to spacecraft conceptual design

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Allen, Cheryl L.

    1990-01-01

    Spacecraft designs are driven by the payloads and mission requirements that they support. Many of the payload characteristics, such as mass, power requirements, communication requirements, moving parts, and so forth directly affect the choices for the spacecraft structural configuration and its subsystem design and component selection. The conceptual design process, which translates mission requirements into early spacecraft concepts, must be tolerant of frequent changes in the payload complement and resource requirements. A computer data base was designed and implemented for the purposes of containing the payload characteristics pertinent for spacecraft conceptual design, tracking the evolution of these payloads over time, and enabling the integration of the payload data with engineering analysis programs for improving the efficiency in producing spacecraft designs. In-house tools were used for constructing the data base and for performing the actual integration with an existing program for optimizing payload mass locations on the spacecraft.

  3. Philosophies Applied in the Selection of Space Suit Joint Range of Motion Requirements

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsway; Ross, Amy; Matty, Jennifer

    2009-01-01

    Space suits are the most important tool for astronauts working in harsh space and planetary environments; suits keep crewmembers alive and allow them to perform exploration, construction, and scientific tasks on a routine basis over a period of several months. The efficiency with which the tasks are performed is largely dictated by the mobility features of the space suit. For previous space suit development programs, the mobility requirements were written as pure functional mobility requirements that did not separate joint ranges of motion from the joint torques. The Constellation Space Suit Element has the goal of writing more quantitative mobility requirements that focus on the individual components of mobility, to enable future suit designers to build and test systems more effectively. This paper details the test planning and selection process for the Constellation space suit pressure garment range of motion requirements.

  4. Environmental DNA from Residual Saliva for Efficient Noninvasive Genetic Monitoring of Brown Bears (Ursus arctos)

    PubMed Central

    Wheat, Rachel E.; Allen, Jennifer M.; Miller, Sophie D. L.; Wilmers, Christopher C.; Levi, Taal

    2016-01-01

    Noninvasive genetic sampling is an important tool in wildlife ecology and management, typically relying on hair snaring or scat sampling techniques, but hair snaring is labor and cost intensive, and scats yield relatively low quality DNA. New approaches utilizing environmental DNA (eDNA) may provide supplementary, cost-effective tools for noninvasive genetic sampling. We tested whether eDNA from residual saliva on partially-consumed Pacific salmon (Oncorhynchus spp.) carcasses might yield suitable DNA quality for noninvasive monitoring of brown bears (Ursus arctos). We compared the efficiency of monitoring brown bear populations using both fecal DNA and salivary eDNA collected from partially-consumed salmon carcasses in Southeast Alaska. We swabbed a range of tissue types from 156 partially-consumed salmon carcasses from a midseason run of lakeshore-spawning sockeye (O. nerka) and a late season run of stream-spawning chum (O. keta) salmon in 2014. We also swabbed a total of 272 scats from the same locations. Saliva swabs collected from the braincases of salmon had the best amplification rate, followed by swabs taken from individual bite holes. Saliva collected from salmon carcasses identified unique individuals more quickly and required much less labor to locate than scat samples. Salmon carcass swabbing is a promising method to aid in efficient and affordable monitoring of bear populations, and suggests that the swabbing of food remains or consumed baits from other animals may be an additional cost-effective and valuable tool in the study of the ecology and population biology of many elusive and/or wide-ranging species. PMID:27828988

  5. Software Requirements Analysis as Fault Predictor

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Waiting until the integration and system test phase to discover errors leads to more costly rework than resolving those same errors earlier in the lifecycle. Costs increase even more significantly once a software system has become operational. We can assess the quality of system requirements, but do little to correlate this information either to system assurance activities or long-term reliability projections - both of which remain unclear and anecdotal. Extending earlier work on requirements accomplished by the ARM tool, measuring requirements quality information against code complexity and test data for the same system may be used to predict specific software modules containing high-impact or deeply embedded faults now escaping into operational systems. Such knowledge would lead to more effective and efficient test programs. It may enable insight into whether a program should be maintained or started over.

  6. biobambam: tools for read pair collation based algorithms on BAM files

    PubMed Central

    2014-01-01

    Background Sequence alignment data is often ordered by coordinate (id of the reference sequence plus position on the sequence where the fragment was mapped) when stored in BAM files, as this simplifies the extraction of variants between the mapped data and the reference or of variants within the mapped data. In this order, paired reads are usually separated in the file, which complicates some other applications, like duplicate marking or conversion to the FastQ format, which require access to the full information of the pairs. Results In this paper we introduce biobambam, a set of tools based on the efficient collation of alignments in BAM files by read name. The employed collation algorithm avoids time- and space-consuming sorting of alignments by read name where this is possible without using more than a specified amount of main memory. Using this algorithm, tasks like duplicate marking in BAM files and conversion of BAM files to the FastQ format can be performed very efficiently with limited resources. We also make the collation algorithm available in the form of an API for other projects. This API is part of the libmaus package. Conclusions In comparison with previous approaches to problems involving the collation of alignments by read name, like the BAM to FastQ or duplicate marking utilities, our approach can often perform an equivalent task more efficiently in terms of the required main memory and run-time. Our BAM to FastQ conversion is faster than all widely known alternatives including Picard and bamUtil. Our duplicate marking is about as fast as the closest competitor bamUtil for small data sets and faster than all known alternatives on large and complex data sets.
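    The core idea of collating mates by read name can be sketched with a simple in-memory hash join; the Python sketch below (assuming the pysam library, which biobambam itself does not use) omits the bounded-memory spill-to-disk behaviour that makes biobambam efficient on large files.

    ```python
    import pysam  # assumption: pysam is available; biobambam itself is a C++ tool

    def bam_to_fastq_pairs(bam_path):
        """Collate mates by read name with an in-memory hash join and yield FASTQ pairs.

        Simplified illustration of name collation: unlike biobambam, all unmatched
        mates are kept in memory instead of spilling partial runs to disk when a
        memory limit is reached.
        """
        pending = {}  # read name -> first primary mate seen
        with pysam.AlignmentFile(bam_path, "rb") as bam:
            for read in bam:
                # Only primary records of paired reads form the pair.
                if read.is_secondary or read.is_supplementary or not read.is_paired:
                    continue
                mate = pending.pop(read.query_name, None)
                if mate is None:
                    pending[read.query_name] = read
                    continue
                first, second = (mate, read) if mate.is_read1 else (read, mate)
                yield (fastq_record(first, "/1"), fastq_record(second, "/2"))

    def fastq_record(read, suffix):
        quals = "".join(chr(q + 33) for q in read.query_qualities)
        return "@%s%s\n%s\n+\n%s\n" % (read.query_name, suffix, read.query_sequence, quals)
    ```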

  7. Simulating long-term effectiveness and efficiency of management scenarios for an invasive grass

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Holcombe, Tracy R.; Cullinane Thomas, Catherine; Frid, Leonardo; Olsson, Aaryn D.

    2015-01-01

    Resource managers are often faced with trade-offs in allocating limited resources to manage plant invasions. These decisions must often be made with uncertainty about the location of infestations, their rate of spread and effectiveness of management actions. Landscape level simulation tools such as state-and-transition simulation models (STSMs) can be used to evaluate the potential long term consequences of alternative management strategies and help identify those strategies that make efficient use of resources. We analyzed alternative management scenarios for African buffelgrass (Pennisetum ciliare syn. Cenchrus ciliaris) at Ironwood Forest National Monument, Arizona using a spatially explicit STSM implemented in the Tool for Exploratory Landscape Scenario Analyses (TELSA). Buffelgrass is an invasive grass that is spreading rapidly in the Sonoran Desert, affecting multiple habitats and jurisdictions. This invasion is creating a novel fire risk and transforming natural ecosystems. The model used in this application incorporates buffelgrass dispersal and establishment and management actions and effectiveness including inventory, treatment and post-treatment maintenance. We simulated 11 alternative scenarios developed in consultation with buffelgrass managers and other stakeholders. The scenarios vary according to the total budget allocated for management and the allocation of that budget between different kinds of management actions. Scenario results suggest that to achieve an actual reduction and stabilization of buffelgrass populations, management unconstrained by fiscal restrictions and across all jurisdictions and private lands is required; without broad and aggressive management, buffelgrass populations are expected to increase over time. However, results also suggest that large upfront investments can achieve control results that require relatively minimal spending in the future. Investing the necessary funds upfront to control the invasion results in the most efficient use of resources to achieve lowest invaded acreage in the long-term.

  8. On extending parallelism to serial simulators

    NASA Technical Reports Server (NTRS)

    Nicol, David; Heidelberger, Philip

    1994-01-01

    This paper describes an approach to discrete event simulation modeling that appears to be effective for developing portable and efficient parallel execution of models of large distributed systems and communication networks. In this approach, the modeler develops submodels using an existing sequential simulation modeling tool, using the full expressive power of the tool. A set of modeling language extensions permits automatically synchronized communication between submodels; however, the automation requires that any such communication must take a nonzero amount of simulation time. Within this modeling paradigm, a variety of conservative synchronization protocols can transparently support conservative execution of submodels on potentially different processors. A specific implementation of this approach, U.P.S. (Utilitarian Parallel Simulator), is described, along with performance results on the Intel Paragon.

  9. Social Media As a Leadership Tool for Pharmacists

    PubMed Central

    Toney, Blake; Goff, Debra A.; Weber, Robert J.

    2015-01-01

    The profession of pharmacy is currently experiencing transformational change in health system practice models with pharmacists’ provider status. Gaining buy-in and support of stakeholders in medicine, nursing, and other advocates for patient care is critical. To this end, building momentum to advance the profession will require experimentation with and utilization of more efficient ways to disseminate relevant information. Traditional methods to communicate can be inefficient and painstakingly slow. Health care providers are turning to social media to network, connect, engage, educate, and learn. Pharmacy leaders can use social media as an additional tool in the leadership toolkit. This article of the Director’s Forum shows how social media can assist pharmacy leaders in further developing patient-centered pharmacy services. PMID:26448676

  10. A Global Review of Incentive Programs to Accelerate Energy-Efficient Appliances and Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de la Rue du Can, Stephane; Phadke, Amol; Leventis, Greg

    Incentive programs are an essential policy tool to move the market toward energy-efficient products. They offer a favorable complement to mandatory standards and labeling policies by accelerating the market penetration of energy-efficient products above equipment standard requirements and by preparing the market for increased future mandatory requirements. They sway purchase decisions and in some cases production decisions and retail stocking decisions toward energy-efficient products. Incentive programs are structured according to their regulatory environment, the way they are financed, by how the incentive is targeted, and by who administers them. This report categorizes the main elements of incentive programs, using case studies from the Major Economies Forum to illustrate their characteristics. To inform future policy and program design, it seeks to recognize design advantages and disadvantages through a qualitative overview of the variety of programs in use around the globe. Examples range from rebate programs administered by utilities under an Energy-Efficiency Resource Standards (EERS) regulatory framework (California, USA) to the distribution of Eco-Points that reward customers for buying efficient appliances under a government recovery program (Japan). We found that evaluations have demonstrated that financial incentives programs have greater impact when they target highly efficient technologies that have a small market share. We also found that the benefits and drawbacks of different program design aspects depend on the market barriers addressed, the target equipment, and the local market context and that no program design surpasses the others. The key to successful program design and implementation is a thorough understanding of the market and effective identification of the most important local factors hindering the penetration of energy-efficient technologies.

  11. Adapting practice-based intervention research to electronic environments: opportunities and complexities at two institutions.

    PubMed

    Stille, Christopher J; Lockhart, Steven A; Maertens, Julie A; Madden, Christi A; Darden, Paul M

    2015-01-01

    Primary care practice-based research has become more complex with increased use of electronic health records (EHRs). Little has been reported about changes in study planning and execution that are required as practices change from paper-based to electronic-based environments. We describe the evolution of a pediatric practice-based intervention study as it was adapted for use in the electronic environment, to enable other practice-based researchers to plan efficient, effective studies. We adapted a paper-based pediatric office-level intervention to enhance parent-provider communication about subspecialty referrals for use in two practice-based research networks (PBRNs) with partially and fully electronic environments. We documented the process of adaptation and its effect on study feasibility and efficiency, resource use, and administrative and regulatory complexities, as the study was implemented in the two networks. Considerable time and money was required to adapt the paper-based study to the electronic environment, requiring extra meetings with institutional EHR-, regulatory-, and administrative teams, and increased practice training. Institutional unfamiliarity with using EHRs in practice-based research, and the consequent need to develop new policies, were major contributors to delays. Adapting intervention tools to the EHR and minimizing practice disruptions was challenging, but resulted in several efficiencies as compared with a paper-based project. In particular, recruitment and tracking of subjects and data collection were easier and more efficient. Practice-based intervention research in an electronic environment adds considerable cost and time at the outset of a study, especially for centers unfamiliar with such research. Efficiencies generated have the potential of easing the work of study enrollment, subject tracking, and data collection.

  12. A novel tool to standardize rheology testing of molten polymers for pharmaceutical applications.

    PubMed

    Treffer, Daniel; Troiss, Alexander; Khinast, Johannes

    2015-11-10

    Melt rheology provides information about material properties that are of great importance for equipment design and simulations, especially for novel pharmaceutical manufacturing operations, including extrusion, injection molding or 3d printing. To that end, homogeneous samples must be prepared, most commonly via compression or injection molding, both of which require costly equipment and might not be applicable for shear- and heat-sensitive pharmaceutical materials. Our study introduces a novel vacuum compression molding (VCM) tool for simple preparation of thermoplastic specimens using standard laboratory equipment: a hot plate and a vacuum source. Sticking is eliminated by applying polytetrafluoroethylene (PTFE) coated separation foils. The evacuation of the tool leads to compression of the sample chamber, which is cost-efficient compared to conventional methods, such as compression molding or injection molding that require special equipment. In addition, this compact design reduces the preparation time and the heat load. The VCM tool was used to prepare samples for a rheological study of three pharmaceutical polymers (Soluplus(®), Eudragit(®)E, EVA Rowalit(®) 300-1/28). The prepared samples were without any air inclusions or voids, and the measurements had a high reproducibility. All relative standard deviations were below 3%. The obtained data were fitted to the Carreau-Yasuda model and time-temperature superposition was applied. Copyright © 2015 Elsevier B.V. All rights reserved.
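    The Carreau-Yasuda fit mentioned above is a routine nonlinear regression; a minimal sketch with SciPy, using synthetic flow-curve data in place of the paper's measurements, might look as follows.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def carreau_yasuda(shear_rate, eta_0, eta_inf, lam, a, n):
        """Carreau-Yasuda viscosity model:
        eta = eta_inf + (eta_0 - eta_inf) * [1 + (lam * gamma_dot)^a]^((n - 1)/a)
        """
        return eta_inf + (eta_0 - eta_inf) * (1.0 + (lam * shear_rate) ** a) ** ((n - 1.0) / a)

    # Hypothetical flow-curve data (shear rate in 1/s, viscosity in Pa*s);
    # real measurements would come from the rheometer used in the study.
    gamma_dot = np.logspace(-1, 3, 25)
    eta_meas = carreau_yasuda(gamma_dot, 5e3, 10.0, 0.8, 2.0, 0.35) * (
        1.0 + 0.02 * np.random.default_rng(1).standard_normal(gamma_dot.size))

    p0 = (1e3, 1.0, 1.0, 2.0, 0.5)                      # initial guesses for the five parameters
    popt, _ = curve_fit(carreau_yasuda, gamma_dot, eta_meas,
                        p0=p0, bounds=(0, np.inf), maxfev=20000)
    print("eta_0=%.1f eta_inf=%.2f lambda=%.3f a=%.2f n=%.3f" % tuple(popt))
    ```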

  13. The JASMIN Analysis Platform - bridging the gap between traditional climate data practices and data-centric analysis paradigms

    NASA Astrophysics Data System (ADS)

    Pascoe, Stephen; Iwi, Alan; Kershaw, Philip; Stephens, Ag; Lawrence, Bryan

    2014-05-01

    The advent of large-scale data and the consequential analysis problems have led to two new challenges for the research community: how to share such data to get the maximum value and how to carry out efficient analysis. Solving both challenges requires a form of parallelisation: the first is social parallelisation (involving trust and information sharing), the second data parallelisation (involving new algorithms and tools). The JASMIN infrastructure supports both kinds of parallelism by providing a multi-tenant environment with petabyte-scale storage, VM provisioning and batch cluster facilities. The JASMIN Analysis Platform (JAP) is an analysis software layer for JASMIN which emphasises ease of transition from a researcher's local environment to JASMIN. JAP brings together tools traditionally used by multiple communities and configures them to work together, enabling users to move analysis from their local environment to JASMIN without rewriting code. JAP also provides facilities to exploit JASMIN's parallel capabilities whilst maintaining their familiar analysis environment wherever possible. Modern open-source analysis tools typically have multiple dependent packages, increasing the installation burden on system administrators. When you consider a suite of tools, often with both common and conflicting dependencies, analysis pipelines can become locked to a particular installation simply because of the effort required to reconstruct the dependency tree. JAP addresses this problem by providing a consistent suite of RPMs compatible with RedHat Enterprise Linux and CentOS 6.4. Researchers can install JAP locally, either as RPMs or through a pre-built VM image, giving them the confidence that moving analysis to JASMIN will not disrupt their environment. Analysis parallelisation is in its infancy in climate sciences, with few tools capable of exploiting any parallel environment beyond manual scripting of the use of multiple processors. JAP begins to bridge this gap through a variety of higher-level tools for parallelisation and job scheduling such as IPython-parallel and MPI support for interactive analysis languages. We find that enabling even simple parallelisation of workflows, together with the state-of-the-art I/O performance of JASMIN storage, provides many users with the large increases in efficiency they need to scale their analyses to contemporary data volumes and tackle new, previously inaccessible, problems.

  14. Comprehensive evaluation of contemporary assisted reproduction technology laboratory operations to determine staffing levels that promote patient safety and quality care.

    PubMed

    Alikani, Mina; Go, Kathryn J; McCaffrey, Caroline; McCulloh, David H

    2014-11-01

    To consider how staffing requirements have changed with evolving and increasingly more complex assisted reproduction technology (ART) laboratory practice. Analysis by four laboratory directors from three different ART programs of the level of complexity and time requirements for contemporary ART laboratory activities to determine adequate staffing levels. University-based and private ART programs. None. None. Human resource requirements for ART procedures. Both complexity and time required for completion of a contemporary ART cycle have increased significantly compared with the same requirements for the "traditional cycle" of the past. The latter required roughly 9 personnel hours, but a contemporary cycle can require up to 20 hours for completion. Consistent with this increase, a quantitative analysis shows that the number of embryologists required for safe and efficient operation of the ART laboratory has also increased. This number depends on not only the volume but also the types of procedures performed: the higher the number of complex procedures, the more personnel required. An interactive Personnel Calculator is introduced that can help determine staffing needs. The increased complexity of the contemporary ART laboratory requires a new look at the allocation of human resources. Our work provides laboratory directors with a practical, individualized tool to determine their staffing requirements with a view to increasing the safety and efficiency of operations. The work could serve as the basis for revision of the 2008 American Society for Reproductive Medicine (ASRM) staffing guidelines. Copyright © 2014 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  15. Data-driven Applications for the Sun-Earth System

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.

    2016-12-01

    Advances in observational and data mining techniques allow extracting information from the large volume of Sun-Earth observational data that can be assimilated into first principles physical models. However, equations governing Sun-Earth phenomena are typically nonlinear, complex, and high-dimensional. The high computational demand of solving the full governing equations over a large range of scales precludes the use of a variety of useful assimilative tools that rely on applied mathematical and statistical techniques for quantifying uncertainty and predictability. Effective use of such tools requires the development of computationally efficient methods to facilitate fusion of data with models. This presentation will provide an overview of various existing as well as newly developed data-driven techniques adopted from atmospheric and oceanic sciences that proved to be useful for space physics applications, such as computationally efficient implementation of Kalman Filter in radiation belts modeling, solar wind gap-filling by Singular Spectrum Analysis, and low-rank procedure for assimilation of low-altitude ionospheric magnetic perturbations into the Lyon-Fedder-Mobarry (LFM) global magnetospheric model. Reduced-order non-Markovian inverse modeling and novel data-adaptive decompositions of Sun-Earth datasets will be also demonstrated.

  16. Discrete event simulation for healthcare organizations: a tool for decision making.

    PubMed

    Hamrock, Eric; Paige, Kerrie; Parks, Jennifer; Scheulen, James; Levin, Scott

    2013-01-01

    Healthcare organizations face challenges in efficiently accommodating increased patient demand with limited resources and capacity. The modern reimbursement environment prioritizes the maximization of operational efficiency and the reduction of unnecessary costs (i.e., waste) while maintaining or improving quality. As healthcare organizations adapt, significant pressures are placed on leaders to make difficult operational and budgetary decisions. In lieu of hard data, decision makers often base these decisions on subjective information. Discrete event simulation (DES), a computerized method of imitating the operation of a real-world system (e.g., healthcare delivery facility) over time, can provide decision makers with an evidence-based tool to develop and objectively vet operational solutions prior to implementation. DES in healthcare commonly focuses on (1) improving patient flow, (2) managing bed capacity, (3) scheduling staff, (4) managing patient admission and scheduling procedures, and (5) using ancillary resources (e.g., labs, pharmacies). This article describes applicable scenarios, outlines DES concepts, and describes the steps required for development. An original DES model developed to examine crowding and patient flow for staffing decision making at an urban academic emergency department serves as a practical example.
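    A minimal DES sketch of the emergency-department staffing question, assuming the SimPy package (the article does not prescribe a specific engine) and hypothetical arrival and service rates:

    ```python
    import random
    import simpy  # assumption: SimPy as the DES engine

    def patient(env, triage, results):
        arrival = env.now
        with triage.request() as req:                        # queue for a triage nurse/bed
            yield req
            yield env.timeout(random.expovariate(1 / 20.0))  # ~20 min mean service time
        results.append(env.now - arrival)                    # door-to-departure time

    def arrivals(env, triage, results, mean_interarrival=8.0):
        while True:
            yield env.timeout(random.expovariate(1 / mean_interarrival))
            env.process(patient(env, triage, results))

    # Compare two hypothetical staffing levels over a simulated 24-hour day (minutes).
    for servers in (2, 3):
        random.seed(42)
        env = simpy.Environment()
        triage = simpy.Resource(env, capacity=servers)
        results = []
        env.process(arrivals(env, triage, results))
        env.run(until=24 * 60)
        print(servers, "servers -> mean time in system: %.1f min" % (sum(results) / len(results)))
    ```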

  17. Ultrawidefield microscope for high-speed fluorescence imaging and targeted optogenetic stimulation.

    PubMed

    Werley, Christopher A; Chien, Miao-Ping; Cohen, Adam E

    2017-12-01

    The rapid increase in the number and quality of fluorescent reporters and optogenetic actuators has yielded a powerful set of tools for recording and controlling cellular state and function. To achieve the full benefit of these tools requires improved optical systems with high light collection efficiency, high spatial and temporal resolution, and patterned optical stimulation, in a wide field of view (FOV). Here we describe our 'Firefly' microscope, which achieves these goals in a Ø6 mm FOV. The Firefly optical system is optimized for simultaneous photostimulation and fluorescence imaging in cultured cells. All but one of the optical elements are commercially available, yet the microscope achieves 10-fold higher light collection efficiency at its design magnification than the comparable commercially available microscope using the same objective. The Firefly microscope enables all-optical electrophysiology ('Optopatch') in cultured neurons with a throughput and information content unmatched by other neuronal phenotyping systems. This capability opens possibilities in disease modeling and phenotypic drug screening. We also demonstrate applications of the system to voltage and calcium recordings in human induced pluripotent stem cell derived cardiomyocytes.

  18. Ultrawidefield microscope for high-speed fluorescence imaging and targeted optogenetic stimulation

    PubMed Central

    Werley, Christopher A.; Chien, Miao-Ping; Cohen, Adam E.

    2017-01-01

    The rapid increase in the number and quality of fluorescent reporters and optogenetic actuators has yielded a powerful set of tools for recording and controlling cellular state and function. To achieve the full benefit of these tools requires improved optical systems with high light collection efficiency, high spatial and temporal resolution, and patterned optical stimulation, in a wide field of view (FOV). Here we describe our ‘Firefly’ microscope, which achieves these goals in a Ø6 mm FOV. The Firefly optical system is optimized for simultaneous photostimulation and fluorescence imaging in cultured cells. All but one of the optical elements are commercially available, yet the microscope achieves 10-fold higher light collection efficiency at its design magnification than the comparable commercially available microscope using the same objective. The Firefly microscope enables all-optical electrophysiology (‘Optopatch’) in cultured neurons with a throughput and information content unmatched by other neuronal phenotyping systems. This capability opens possibilities in disease modeling and phenotypic drug screening. We also demonstrate applications of the system to voltage and calcium recordings in human induced pluripotent stem cell derived cardiomyocytes. PMID:29296505

  19. Conditions Database for the Belle II Experiment

    NASA Astrophysics Data System (ADS)

    Wood, L.; Elsethagen, T.; Schram, M.; Stephan, E.

    2017-10-01

    The Belle II experiment at KEK is preparing for first collisions in 2017. Processing the large amounts of data that will be produced will require conditions data to be readily available to systems worldwide in a fast and efficient manner that is straightforward for both the user and maintainer. The Belle II conditions database was designed with a straightforward goal: make it as easily maintainable as possible. To this end, HEP-specific software tools were avoided as much as possible and industry standard tools used instead. HTTP REST services were selected as the application interface, which provide a high-level interface to users through the use of standard libraries such as curl. The application interface itself is written in Java and runs in an embedded Payara-Micro Java EE application server. Scalability at the application interface is provided by use of Hazelcast, an open source In-Memory Data Grid (IMDG) providing distributed in-memory computing and supporting the creation and clustering of new application interface instances as demand increases. The IMDG provides fast and efficient access to conditions data via in-memory caching.

  20. Comparison of Implicit Schemes for the Incompressible Navier-Stokes Equations

    NASA Technical Reports Server (NTRS)

    Rogers, Stuart E.

    1995-01-01

    For a computational flow simulation tool to be useful in a design environment, it must be very robust and efficient. To develop such a tool for incompressible flow applications, a number of different implicit schemes are compared for several two-dimensional flow problems in the current study. The schemes include Point-Jacobi relaxation, Gauss-Seidel line relaxation, incomplete lower-upper decomposition, and the generalized minimum residual method preconditioned with each of the three other schemes. The efficiency of the schemes is measured in terms of the computing time required to obtain a steady-state solution for the laminar flow over a backward-facing step, the flow over a NACA 4412 airfoil, and the flow over a three-element airfoil using overset grids. The flow solver used in the study is the INS2D code that solves the incompressible Navier-Stokes equations using the method of artificial compressibility and upwind differencing of the convective terms. The results show that the generalized minimum residual method preconditioned with the incomplete lower-upper factorization outperforms all other methods by at least a factor of 2.
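    The best-performing combination reported here, GMRES preconditioned with an incomplete LU factorization, can be reproduced in miniature with SciPy; the sparse test matrix below merely stands in for the linearized flow operator.

    ```python
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Small sparse, diagonally dominant test system standing in for a linearized
    # flow operator (the INS2D Jacobian itself is not available here).
    n = 2000
    A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
    A = A + 0.3 * sp.random(n, n, density=5e-4, format="csc", random_state=0)
    b = np.ones(n)

    # Incomplete LU factorization used as a preconditioner for GMRES.
    ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
    M = spla.LinearOperator((n, n), matvec=ilu.solve)

    x_plain, info_plain = spla.gmres(A, b, restart=50)
    x_prec, info_prec = spla.gmres(A, b, M=M, restart=50)
    print("converged (0 = yes): plain =", info_plain, " ILU-preconditioned =", info_prec)
    ```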

  1. Simulation of silicon thin-film solar cells for oblique incident waves

    NASA Astrophysics Data System (ADS)

    Jandl, Christine; Hertel, Kai; Pflaum, Christoph; Stiebig, Helmut

    2011-05-01

    To optimize the quantum efficiency (QE) and short-circuit current density (JSC) of silicon thin-film solar cells, one has to study the behavior of sunlight in these solar cells. Simulations are an adequate and economic method to analyze the optical properties of light caused by absorption and reflection. To this end a simulation tool is developed to take several demands into account. These include the analysis of perpendicular and oblique incident waves under E-, H- and circularly polarized light. Furthermore, the topology of the nanotextured interfaces influences the efficiency and therefore also the short-circuit current density. It is well known that a rough transparent conductive oxide (TCO) layer increases the efficiency of solar cells. Therefore, it is indispensable that various roughness profiles at the interfaces of the solar cell layers can be modeled in such a way that atomic force microscope (AFM) scan data can be integrated. Numerical calculations of Maxwell's equations based on the finite integration technique (FIT) and Finite Difference Time Domain (FDTD) method are necessary to incorporate all these requirements. The simulations are performed in parallel on high performance computers (HPC) to meet the large computational requirements.

  2. Model-based optimal design of experiments - semidefinite and nonlinear programming formulations

    PubMed Central

    Duarte, Belmiro P.M.; Wong, Weng Kee; Oliveira, Nuno M.C.

    2015-01-01

    We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP)-based formulations to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters. Mathematical programming techniques are then applied to solve the optimization problems. Because such methods require the design space be discretized, we also evaluate the impact of the discretization scheme on the generated design. We demonstrate the techniques for finding D–, A– and E–optimal designs using design problems in biochemical engineering and show the method can also be directly applied to tackle additional issues, such as heteroscedasticity in the model. Our results show that the NLP formulation produces highly efficient D–optimal designs but is computationally less efficient than that required for the SDP formulation. The efficiencies of the generated designs from the two methods are generally very close and so we recommend the SDP formulation in practice. PMID:26949279
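    For the SDP route, a D-optimal design over a discretized design space can be written directly as a log-determinant maximization; the sketch below uses CVXPY on a hypothetical quadratic-regression example rather than the paper's chemical-engineering models.

    ```python
    import numpy as np
    import cvxpy as cp

    # Candidate design points for a quadratic response surface on [-1, 1].
    x = np.linspace(-1.0, 1.0, 21)
    F = np.column_stack([np.ones_like(x), x, x ** 2])   # regression functions f(x) = (1, x, x^2)

    w = cp.Variable(len(x), nonneg=True)                # design weights
    M = F.T @ cp.diag(w) @ F                            # information matrix, affine in w
    problem = cp.Problem(cp.Maximize(cp.log_det(M)), [cp.sum(w) == 1])
    problem.solve(solver=cp.SCS)

    support = [(xi, wi) for xi, wi in zip(x, w.value) if wi > 1e-3]
    print("D-optimal support points and weights:", support)
    # The classical answer puts weight ~1/3 on each of x = -1, 0, +1.
    ```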

  3. Improving Quality and Reducing Waste in Allied Health Workplace Education Programs: A Pragmatic Operational Education Framework Approach.

    PubMed

    Golder, Janet; Farlie, Melanie K; Sevenhuysen, Samantha

    2016-01-01

    Efficient utilisation of education resources is required for the delivery of effective learning opportunities for allied health professionals. This study aimed to develop an education framework to support delivery of high-quality education within existing education resources. This study was conducted in a large metropolitan health service. Homogenous and purposive sampling methods were utilised in Phase 1 (n=43) and 2 (n=14) consultation stages. Participants included 25 allied health professionals, 22 managers, 1 educator, and 3 executives. Field notes taken during 43 semi-structured interviews and 4 focus groups were member-checked, and semantic thematic analysis methods were utilised. Framework design was informed by existing published framework development guides. The framework model contains governance, planning, delivery, and evaluation and research elements and identifies performance indicators, practice examples, and support tools for a range of stakeholders. Themes integrated into framework content include improving quality of education and training provided and delivery efficiency, greater understanding of education role requirements, and workforce support for education-specific knowledge and skill development. This framework supports efficient delivery of allied health workforce education and training to the highest standard, whilst pragmatically considering current allied health education workforce demands.

  4. Model-based optimal design of experiments - semidefinite and nonlinear programming formulations.

    PubMed

    Duarte, Belmiro P M; Wong, Weng Kee; Oliveira, Nuno M C

    2016-02-15

    We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP)-based formulations to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters. Mathematical programming techniques are then applied to solve the optimization problems. Because such methods require the design space be discretized, we also evaluate the impact of the discretization scheme on the generated design. We demonstrate the techniques for finding D-, A- and E-optimal designs using design problems in biochemical engineering and show the method can also be directly applied to tackle additional issues, such as heteroscedasticity in the model. Our results show that the NLP formulation produces highly efficient D-optimal designs but is computationally less efficient than that required for the SDP formulation. The efficiencies of the generated designs from the two methods are generally very close and so we recommend the SDP formulation in practice.

  5. High accurate interpolation of NURBS tool path for CNC machine tools

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Liu, Huan; Yuan, Songmei

    2016-09-01

    Feedrate fluctuation caused by approximation errors of interpolation methods has great effects on machining quality in NURBS interpolation, but few methods can efficiently eliminate or reduce it to a satisfying level without sacrificing the computing efficiency at present. In order to solve this problem, a high accurate interpolation method for NURBS tool path is proposed. The proposed method can efficiently reduce the feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be efficiently solved by analytic methods in real-time. Theoretically, the proposed method can totally eliminate the feedrate fluctuation for any 2nd degree NURBS curves and can interpolate 3rd degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is also proposed to generate smooth tool motion with considering multiple constraints and scheduling errors by an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfying computing efficiency.
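    For orientation, the sketch below shows the common second-order Taylor parameter-increment baseline and how feedrate fluctuation is measured; the paper's contribution is a quartic-equation formulation that drives this residual fluctuation essentially to zero, which is not reproduced here. The planar B-spline path and feed parameters are made up for illustration.

    ```python
    import numpy as np
    from scipy.interpolate import BSpline

    # Planar cubic B-spline standing in for the NURBS tool path (weights omitted for brevity).
    ctrl = np.array([[0, 0], [10, 15], [25, 5], [40, 20], [55, 0], [70, 10]], dtype=float)
    k = 3
    t = np.concatenate([[0] * (k + 1), np.linspace(0, 1, len(ctrl) - k + 1)[1:-1], [1] * (k + 1)])
    curve = BSpline(t, ctrl, k)
    d1, d2 = curve.derivative(1), curve.derivative(2)

    F, T = 50.0, 0.001          # commanded feedrate (mm/s) and interpolation period (s)
    u, us = 0.0, [0.0]
    while u < 1.0:
        c1, c2 = d1(u), d2(u)
        v = np.linalg.norm(c1)
        # Second-order Taylor approximation of the parameter increment: a common
        # baseline whose truncation error causes the feedrate fluctuation discussed above.
        du = F * T / v - (F * T) ** 2 * np.dot(c1, c2) / (2.0 * v ** 4)
        u = min(u + du, 1.0)
        us.append(u)

    # Feedrate fluctuation: relative deviation of the realized chord speed from F.
    pts = curve(np.array(us))
    chord = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    fluct = (chord[:-1] / T - F) / F
    print("max feedrate fluctuation: %.3e" % np.abs(fluct).max())
    ```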

  6. Designing overall stoichiometric conversions and intervening metabolic reactions

    DOE PAGES

    Chowdhury, Anupam; Maranas, Costas D.

    2015-11-04

    Existing computational tools for de novo metabolic pathway assembly, either based on mixed integer linear programming techniques or graph-search applications, generally only find linear pathways connecting the source to the target metabolite. The overall stoichiometry of conversion along with alternate co-reactant (or co-product) combinations is not part of the pathway design. Therefore, global carbon and energy efficiency is in essence fixed with no opportunities to identify more efficient routes for recycling carbon flux closer to the thermodynamic limit. Here, we introduce a two-stage computational procedure that both identifies the optimum overall stoichiometry (i.e., optStoic) and selects for (non-)native reactions (i.e., minRxn/minFlux) that maximize carbon, energy or price efficiency while satisfying thermodynamic feasibility requirements. Implementation for recent pathway design studies identified non-intuitive designs with improved efficiencies. Specifically, multiple alternatives for non-oxidative glycolysis are generated and non-intuitive ways of co-utilizing carbon dioxide with methanol are revealed for the production of C2+ metabolites with higher carbon efficiency.

  7. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.

  8. High-power Broadband Organic THz Generator

    PubMed Central

    Jeong, Jae-Hyeok; Kang, Bong-Joo; Kim, Ji-Soo; Jazbinsek, Mojca; Lee, Seung-Heon; Lee, Seung-Chul; Baek, In-Hyung; Yun, Hoseop; Kim, Jongtaek; Lee, Yoon Sup; Lee, Jae-Hyeok; Kim, Jae-Ho; Rotermund, Fabian; Kwon, O-Pil

    2013-01-01

    The high-power broadband terahertz (THz) generator is an essential tool for a wide range of THz applications. Here, we present a novel highly efficient electro-optic quinolinium single crystal for THz wave generation. For obtaining intense and broadband THz waves by optical-to-THz frequency conversion, a quinolinium crystal was developed to fulfill all the requirements, which are in general extremely difficult to maintain simultaneously in a single medium, such as a large macroscopic electro-optic response and excellent crystal characteristics including a large crystal size with desired facets, good environmental stability, high optical quality, wide transparency range, and controllable crystal thickness. Compared to the benchmark inorganic and organic crystals, the new quinolinium crystal possesses excellent crystal properties and THz generation characteristics with broader THz spectral coverage and higher THz conversion efficiency at the technologically important pump wavelength of 800 nm. Therefore, the quinolinium crystal offers great potential for efficient and gap-free broadband THz wave generation. PMID:24220234

  9. A frequency dependent preconditioned wavelet method for atmospheric tomography

    NASA Astrophysics Data System (ADS)

    Yudytskiy, Mykhaylo; Helin, Tapio; Ramlau, Ronny

    2013-12-01

    Atmospheric tomography, i.e. the reconstruction of the turbulence in the atmosphere, is a main task for the adaptive optics systems of the next generation telescopes. For extremely large telescopes, such as the European Extremely Large Telescope, this problem becomes overly complex and an efficient algorithm is needed to reduce numerical costs. Recently, a conjugate gradient method based on wavelet parametrization of turbulence layers was introduced [5]. An iterative algorithm can only be numerically efficient when the number of iterations required for a sufficient reconstruction is low. A way to achieve this is to design an efficient preconditioner. In this paper we propose a new frequency-dependent preconditioner for the wavelet method. In the context of a multi conjugate adaptive optics (MCAO) system simulated on the official end-to-end simulation tool OCTOPUS of the European Southern Observatory we demonstrate robustness and speed of the preconditioned algorithm. We show that three iterations are sufficient for a good reconstruction.
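    The effect of a frequency-dependent preconditioner on conjugate-gradient iteration counts can be imitated on a synthetic system whose unknowns are grouped by wavelet scale; the diagonal scaling below is purely illustrative and is not the preconditioner proposed in the paper.

    ```python
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Synthetic SPD system standing in for the wavelet-domain tomography operator;
    # rows/columns are grouped by wavelet scale j = 0..J-1 (purely illustrative).
    rng = np.random.default_rng(0)
    J, per_scale = 6, 200
    n = J * per_scale
    scales = np.repeat(np.arange(J), per_scale)
    weights = 4.0 ** scales                              # coefficient magnitude grows with scale
    B = sp.random(n, n, density=2e-3, random_state=0)
    A = (B @ B.T + sp.diags(weights)).tocsr()            # SPD and badly scaled by construction
    b = rng.standard_normal(n)

    iters = {"none": 0, "frequency-dependent diagonal": 0}

    def counter(key):
        def cb(xk):
            iters[key] += 1
        return cb

    # Scale-dependent diagonal preconditioner: divide each block by its expected
    # magnitude, mimicking the idea of weighting per wavelet frequency band.
    M = spla.LinearOperator((n, n), matvec=lambda v: v / weights)

    spla.cg(A, b, callback=counter("none"), maxiter=2000)
    spla.cg(A, b, M=M, callback=counter("frequency-dependent diagonal"), maxiter=2000)
    print(iters)
    ```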

  10. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    NASA Astrophysics Data System (ADS)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluation and Hessian calculation of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
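    The symmetric rank-one (SR1) update used for the Hessian is standard; a minimal sketch, exercised on a toy quadratic rather than a probabilistic constraint, is given below.

    ```python
    import numpy as np

    def sr1_update(B, s, y, tol=1e-8):
        """Symmetric rank-one update of a Hessian approximation B.

        s = x_new - x_old, y = grad_new - grad_old. The update is skipped when the
        denominator is too small, which is the standard safeguard for SR1.
        """
        r = y - B @ s
        denom = float(r @ s)
        if abs(denom) <= tol * np.linalg.norm(r) * np.linalg.norm(s):
            return B
        return B + np.outer(r, r) / denom

    # Toy check on a quadratic with Hessian H: repeated SR1 updates recover H.
    H = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 0.5], [0.0, 0.5, 2.0]])
    B = np.eye(3)
    rng = np.random.default_rng(3)
    x = rng.standard_normal(3)
    for _ in range(6):
        x_new = x + rng.standard_normal(3) * 0.1
        B = sr1_update(B, x_new - x, H @ (x_new - x))   # y = H s for a quadratic
        x = x_new
    print("max |B - H| after 6 updates:", np.abs(B - H).max())
    ```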

  11. How effective is mandatory building energy disclosure program in Australia?

    NASA Astrophysics Data System (ADS)

    Kim, S.; Lim, B. T. H.

    2018-04-01

    Mandatory green building regulations are often considered the most effective tool to promote better energy efficiency and environmental protection. Nevertheless, their effectiveness compared to voluntary counterparts has not yet been fully explored. In addressing this gap, this study aims to examine the environmental performance of green building stocks affected by the Australian mandatory building energy disclosure program. To this end, the study analysed energy savings and carbon reduction efficiencies using a normalisation approach. The results show that the mandatory energy disclosure program did contribute to the reduction in energy usage and carbon emissions from the affected building stocks. More specifically, the affected green building stocks showed good efficiency, especially in carbon reductions. The results inform policymakers of the improvements required for the mandatory disclosure program to become more effective in dealing with the contemporary environmental issues arising from the building sector, especially from an energy savings perspective.

  12. Optimisation of a low cost SLM for diffraction efficiency and ghost order suppression

    NASA Astrophysics Data System (ADS)

    Bowman, R.; D'Ambrosio, V.; Rubino, E.; Jedrkiewicz, O.; di Trapani, P.; Padgett, M. J.

    2011-11-01

    Spatial Light Modulators (SLMs) are a powerful tool in many optics laboratories, but due to the technology required for their fabrication, they are usually very expensive. Recently, some inexpensive devices have been produced; however, their phase shift range is less than 2π, leading to a loss of diffraction efficiency for the SLM. We show how to improve the first order diffraction efficiency of such an SLM by adjusting the blazing function, and obtain a 1.5 times increase in first order diffracted power. Even a perfect SLM with 2π phase throw can produce undesired effects in some situations; for example, in holographic optical tweezers it is common to find unwanted "ghost spots" near the array of first-order spots. Modulating the amplitude, by spatially modulating the blazing function, allows us to suppress the ghost spots. This increases the contrast between desired and unwanted spots by more than an order of magnitude.

  13. EBF3 Design and Sustainability Considerations

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M. B.

    2015-01-01

    Electron beam freeform fabrication (EBF3) is a cross-cutting technology for producing structural metal parts using an electron beam and wire feed in a layer-additive fashion. This process was developed by researchers at NASA Langley to specifically address needs for aerospace applications. Additive manufacturing technologies like EBF3 enable efficient design of materials and structures by tailoring microstructures and chemistries at the local level to improve performance at the global level. Additive manufacturing also facilitates design freedom by integrating assemblies into complex single-piece components, eliminating flanges, fasteners and joints, resulting in reduced size and mass. These same efficiencies that permit new design paradigms also lend themselves to supportability and sustainability. Long duration space missions will require a high degree of self-sustainability. EBF3 is a candidate technology being developed to allow astronauts to conduct repairs and fabricate new components and tools on demand, with efficient use of feedstock materials and energy.

  14. High-power broadband organic THz generator.

    PubMed

    Jeong, Jae-Hyeok; Kang, Bong-Joo; Kim, Ji-Soo; Jazbinsek, Mojca; Lee, Seung-Heon; Lee, Seung-Chul; Baek, In-Hyung; Yun, Hoseop; Kim, Jongtaek; Lee, Yoon Sup; Lee, Jae-Hyeok; Kim, Jae-Ho; Rotermund, Fabian; Kwon, O-Pil

    2013-11-13

    The high-power broadband terahertz (THz) generator is an essential tool for a wide range of THz applications. Here, we present a novel highly efficient electro-optic quinolinium single crystal for THz wave generation. For obtaining intense and broadband THz waves by optical-to-THz frequency conversion, a quinolinium crystal was developed to fulfill all the requirements, which are in general extremely difficult to maintain simultaneously in a single medium, such as a large macroscopic electro-optic response and excellent crystal characteristics including a large crystal size with desired facets, good environmental stability, high optical quality, wide transparency range, and controllable crystal thickness. Compared to the benchmark inorganic and organic crystals, the new quinolinium crystal possesses excellent crystal properties and THz generation characteristics with broader THz spectral coverage and higher THz conversion efficiency at the technologically important pump wavelength of 800 nm. Therefore, the quinolinium crystal offers great potential for efficient and gap-free broadband THz wave generation.

  15. Bank supervision using the Threshold-Minimum Dominating Set

    NASA Astrophysics Data System (ADS)

    Gogas, Periklis; Papadimitriou, Theophilos; Matthaiou, Maria-Artemis

    2016-06-01

    An optimized, healthy and stable banking system resilient to financial crises is a prerequisite for sustainable growth. Minimization of (a) the associated systemic risk and (b) the propagation of contagion in the case of a banking crisis are necessary conditions to achieve this goal. Central Banks are in charge of this significant undertaking via a close and detailed monitoring of the banking network. In this paper, we propose the use of an auxiliary supervision/monitoring system that is both efficient with respect to the required resources and can promptly identify a set of banks that are in distress so that immediate and appropriate action can be taken by the supervising authority. We use the network defined by the interrelations between banking institutions employing tools from Complex Networks theory for an efficient management of the entire banking network. In doing so, we introduce the Threshold Minimum Dominating Set (T-MDS). The T-MDS is used to identify the smallest and most efficient subset of banks that can be used as (a) sensors of distress of a manifesting banking crisis and (b) provide a path of possible contagion. We propose the use of this method as a supplementary monitoring tool in the arsenal of a Central Bank. Our dataset includes the 122 largest American banks in terms of their interbank loans. The empirical results show that when the T-MDS methodology is applied, we can have an efficient supervision of the whole banking network, by monitoring just a subset of 47 banks.
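    The exact T-MDS construction is the authors' own; a greedy stand-in that first filters interbank edges by a weight threshold and then builds a dominating set conveys the idea (hypothetical exposure network, NetworkX assumed):

    ```python
    import networkx as nx

    def threshold_dominating_set(G, weight="weight", threshold=0.0):
        """Greedy approximation of a dominating set on the subgraph of edges whose
        weight exceeds `threshold` (a simplified stand-in for the paper's T-MDS).
        """
        H = nx.Graph()
        H.add_nodes_from(G.nodes())
        H.add_edges_from((u, v) for u, v, w in G.edges(data=weight, default=0.0)
                         if w >= threshold)
        uncovered = set(H.nodes())
        dominators = set()
        while uncovered:
            # Pick the node covering the most still-uncovered nodes (itself + neighbours).
            best = max(H.nodes(),
                       key=lambda n: len(({n} | set(H.neighbors(n))) & uncovered))
            dominators.add(best)
            uncovered -= {best} | set(H.neighbors(best))
        return dominators

    # Hypothetical interbank exposure network: edge weights are loan volumes.
    G = nx.Graph()
    G.add_weighted_edges_from([("A", "B", 5.0), ("B", "C", 1.0), ("C", "D", 4.0),
                               ("D", "E", 0.5), ("E", "A", 3.0), ("B", "E", 2.5)])
    print(threshold_dominating_set(G, threshold=2.0))
    ```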

  16. Benchmarking and Self-Assessment in the Wine Industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galitsky, Christina; Radspieler, Anthony; Worrell, Ernst

    2005-12-01

    Not all industrial facilities have the staff or the opportunity to perform a detailed audit of their operations. The lack of knowledge of energy efficiency opportunities provides an important barrier to improving efficiency. Benchmarking programs in the U.S. and abroad have been shown to improve knowledge of the energy performance of industrial facilities and buildings and to fuel energy management practices. Benchmarking provides a fair way to compare the energy intensity of plants, while accounting for structural differences (e.g., the mix of products produced, climate conditions) between different facilities. In California, the winemaking industry is not only one of the economic pillars of the economy; it is also a large energy consumer, with a considerable potential for energy-efficiency improvement. Lawrence Berkeley National Laboratory and Fetzer Vineyards developed the first benchmarking tool for the California wine industry, called "BEST (Benchmarking and Energy and water Savings Tool) Winery". BEST Winery enables a winery to compare its energy efficiency to a best practice reference winery. Besides overall performance, the tool enables the user to evaluate the impact of implementing efficiency measures. The tool facilitates strategic planning of efficiency measures, based on the estimated impact of the measures, their costs and savings. The tool will raise awareness of current energy intensities and offer an efficient way to evaluate the impact of future efficiency measures.

  17. Efficient Bayesian mixed model analysis increases association power in large cohorts

    PubMed Central

    Loh, Po-Ru; Tucker, George; Bulik-Sullivan, Brendan K; Vilhjálmsson, Bjarni J; Finucane, Hilary K; Salem, Rany M; Chasman, Daniel I; Ridker, Paul M; Neale, Benjamin M; Berger, Bonnie; Patterson, Nick; Price, Alkes L

    2014-01-01

    Linear mixed models are a powerful statistical tool for identifying genetic associations and avoiding confounding. However, existing methods are computationally intractable in large cohorts, and may not optimize power. All existing methods require time cost O(MN²) (where N = #samples and M = #SNPs) and implicitly assume an infinitesimal genetic architecture in which effect sizes are normally distributed, which can limit power. Here, we present a far more efficient mixed model association method, BOLT-LMM, which requires only a small number of O(MN)-time iterations and increases power by modeling more realistic, non-infinitesimal genetic architectures via a Bayesian mixture prior on marker effect sizes. We applied BOLT-LMM to nine quantitative traits in 23,294 samples from the Women’s Genome Health Study (WGHS) and observed significant increases in power, consistent with simulations. Theory and simulations show that the boost in power increases with cohort size, making BOLT-LMM appealing for GWAS in large cohorts. PMID:25642633

  18. Recent experience in simultaneous control-structure optimization

    NASA Technical Reports Server (NTRS)

    Salama, M.; Ramaker, R.; Milman, M.

    1989-01-01

    To show the feasibility of simultaneous optimization as a design procedure, low-order problems were used in conjunction with simple control formulations. The numerical results indicate that simultaneous optimization is not only feasible, but also advantageous. Such advantages come at the expense of introducing complexities beyond those encountered in structure optimization alone, or control optimization alone. Examples include a larger design parameter space, optimization that may combine continuous and combinatoric variables, and a combined objective function that may be nonconvex. Future extensions to include large-order problems, more complex objective functions and constraints, and more sophisticated control formulations will require further research to ensure that the additional complexities do not outweigh the advantages of simultaneous optimization. Areas requiring more efficient tools than are currently available include multiobjective criteria and nonconvex optimization. Efficient techniques also need to be developed to deal with optimization over combinatoric and continuous variables, and with truncation issues for structure and control parameters of both the model space and the design space.

  19. Chance-Constrained AC Optimal Power Flow: Reformulations and Efficient Algorithms

    DOE PAGES

    Roald, Line Alnaes; Andersson, Goran

    2017-08-29

    Higher levels of renewable electricity generation increase uncertainty in power system operation. To ensure secure system operation, new tools that account for this uncertainty are required. In this paper, we adopt a chance-constrained AC optimal power flow formulation, which guarantees that generation, power flows and voltages remain within their bounds with a pre-defined probability. We then discuss different chance-constraint reformulations and solution approaches for the problem. We first discuss an analytical reformulation based on partial linearization, which enables us to obtain a tractable representation of the optimization problem. We then provide an efficient algorithm based on an iterative solution scheme which alternates between solving a deterministic AC OPF problem and assessing the impact of uncertainty. This more flexible computational framework enables not only scalable implementations, but also alternative chance-constraint reformulations. In particular, we suggest two sample-based reformulations that do not require any approximation or relaxation of the AC power flow equations.
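
    For a single Gaussian uncertainty on a line flow, the analytical reformulation mentioned above reduces to tightening the deterministic limit by an uncertainty margin. A minimal sketch, with illustrative numbers and without the linearized AC sensitivities of the full formulation:

```python
from scipy.stats import norm

# P(flow + xi <= flow_max) >= 1 - eps, with xi ~ N(0, sigma^2),
# is equivalent to the tightened deterministic constraint
#   flow <= flow_max - Phi^{-1}(1 - eps) * sigma.
def tightened_limit(flow_max, sigma, eps):
    return flow_max - norm.ppf(1.0 - eps) * sigma

# Illustrative numbers: a 100 MW line limit, 8 MW of forecast-error
# standard deviation on the flow, 5% allowed violation probability.
print(tightened_limit(100.0, 8.0, 0.05))   # about 86.8 MW of usable limit
```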

  20. A new archival infrastructure for highly-structured astronomical data

    NASA Astrophysics Data System (ADS)

    Dovgan, Erik; Knapic, Cristina; Sponza, Massimo; Smareglia, Riccardo

    2018-03-01

    With the advent of the 2020 Radio Astronomy Telescopes era, the amount and format of radio-astronomical data are becoming a massive and performance-critical challenge. Such an evolution of data models and data formats requires new data archiving techniques that allow massive and fast storage of data that can, at the same time, be efficiently processed. Useful expertise in efficient archiving has been obtained through the data archiving of the Medicina and Noto Radio Telescopes. The presented archival infrastructure, named the Radio Archive, stores and handles various formats, such as FITS, MBFITS, and VLBI's XML, including description and ancillary files. The modeling and architecture of the archive fulfill all the requirements of both data persistence and easy data discovery and exploitation. The archive already complies with the Virtual Observatory directives; therefore, future service implementations will also be VO compliant. This article presents the Radio Archive services and tools, from data acquisition to end-user data utilization.

  1. Microfluidic platform for efficient Nanodisc assembly, membrane protein incorporation, and purification.

    PubMed

    Wade, James H; Jones, Joshua D; Lenov, Ivan L; Riordan, Colleen M; Sligar, Stephen G; Bailey, Ryan C

    2017-08-22

    The characterization of integral membrane proteins presents numerous analytical challenges on account of their poor activity under non-native conditions, limited solubility in aqueous solutions, and low expression in most cell culture systems. Nanodiscs are synthetic model membrane constructs that offer many advantages for studying membrane protein function by providing a native-like phospholipid bilayer environment. The successful incorporation of membrane proteins within Nanodiscs requires experimental optimization of conditions. Standard protocols for Nanodisc formation can require large amounts of time and input material, limiting the facile screening of formation conditions. Capitalizing on the miniaturization and efficient mass transport inherent to microfluidics, we have developed a microfluidic platform for efficient Nanodisc assembly and purification, and demonstrated the ability to incorporate functional membrane proteins into the resulting Nanodiscs. In addition to working with reduced sample volumes, this platform condenses membrane protein incorporation from a multi-stage protocol requiring several hours or days into a single workflow that outputs purified Nanodiscs in less than one hour. To demonstrate the utility of this platform, we incorporated Cytochrome P450 into Nanodiscs of variable size and lipid composition, and present spectroscopic evidence for the functional active site of the membrane protein. This platform is a promising new tool for membrane protein biology and biochemistry that enables tremendous versatility for optimizing the incorporation of membrane proteins using microfluidic gradients to screen across diverse formation conditions.

  2. Reticles, write time, and the need for speed

    NASA Astrophysics Data System (ADS)

    Ackmann, Paul W.; Litt, Lloyd C.; Ning, Guo Xiang

    2014-10-01

    Historical data indicate that reticle write times are increasing node-to-node. The cost of mask sets is increasing, driven by tighter requirements and more levels. The regular introduction of new generations of mask patterning tools with improved performance is unable to fully compensate for the increased data volume and complexity required. Write time is a primary metric that drives mask fabrication speed. Design (raw data) is only the first step in the process; many interactions between mask and wafer technology, such as the OPC used, OPC efficiency for the writers, fracture engines, and the actual field size used, drive total write time. Yield, technology, and inspection rules drive the remaining raw cycle time. Yield can be even more critical for speed of delivery, as it drives re-writes and wasted time. While intrinsic process yield is important, repair capability is the reason mask delivery is still able to deliver 100% good reticles to the fab. Advanced nodes utilizing several layers of multiple patterning may require mask writer tool dedication to meet image placement specifications. This will increase the effective mask cycle time for a layer mask set and drive the need for additional mask write capability in order to deliver masks at the rate required by the wafer fab production schedules.

  3. The physiology and biomechanics of competitive swimming.

    PubMed

    Troup, J P

    1999-04-01

    Fast swimming, either in the pool, in open water swimming, or in water polo and synchronized swimming, requires maximizing the efficiencies with which the human body can move through a liquid medium. A multitude of factors can affect the ability to swim fast as well as the final outcome. Physiology and biomechanics are the present tools used by sports scientists to determine which factors are important to fast swimming and, subsequently, to determine how the swimmer may maximize these factors to improve performance.

  4. Engineering planetary lasers for interstellar communication

    NASA Technical Reports Server (NTRS)

    Sherwood, Brent; Mumma, Michael J.; Donaldson, Bruce K.

    1992-01-01

    Spacefaring skills evolved in the twenty-first century will enable missions of unprecedented complexity. One such elaborate project might be to develop tools for efficient interstellar data transfer. Informational links to other star systems would facilitate eventual human expansion beyond our solar system, as well as communication with potential extraterrestrial intelligence. This paper reports the major findings of a 600-page, 3-year, NASA-funded study examining in quantitative detail the requirements, some seemingly feasible methods, and the implications of achieving reliable extrasolar communications.

  5. Attitude estimation of earth orbiting satellites by decomposed linear recursive filters

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1975-01-01

    Attitude estimation of earth orbiting satellites (including the Large Space Telescope) subjected to environmental disturbances and noise was investigated. Modern control and estimation theory is used as a tool to design an efficient estimator for attitude estimation. Decomposed linear recursive filters for both continuous-time systems and discrete-time systems are derived. Using these accurate estimates of spacecraft attitude, a state-variable feedback controller may be designed to satisfy demanding system performance requirements.
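
    The flavor of a discrete-time linear recursive estimator can be conveyed with a generic Kalman-type predict/update step; the sketch below is not the paper's decomposed filter, and the two-state model and noise levels are purely illustrative.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a discrete-time linear recursive filter."""
    # Predict state and covariance forward one step.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the new attitude measurement z.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 2-state example (attitude angle and rate), illustrative noise levels.
F = np.array([[1.0, 0.1], [0.0, 1.0]])   # constant-rate kinematics, dt = 0.1 s
H = np.array([[1.0, 0.0]])               # only the angle is measured
Q, R = 1e-4 * np.eye(2), np.array([[1e-2]])
x, P = np.zeros(2), np.eye(2)
x, P = kalman_step(x, P, np.array([0.05]), F, H, Q, R)
print(x)
```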

  6. Structural Analysis Computer Programs for Rigid Multicomponent Pavement Structures with Discontinuities--WESLIQID and WESLAYER. Report 1. Program Development and Numerical Presentations.

    DTIC Science & Technology

    1981-05-01

    represented as a Winkler foundation. The program can treat any number of slabs connected by steel bars or other load transfer devices at the joints...dimensional finite element method. The inherent flexibility of such an approach permits the analysis of a rigid pavement with steel bars and stabilized...layers and provides an efficient tool for analyzing stress conditions at the joint. Unfortunately, such a procedure would require a tremendously

  7. Open-Source Programming for Automated Generation of Graphene Raman Spectral Maps

    NASA Astrophysics Data System (ADS)

    Vendola, P.; Blades, M.; Pierre, W.; Jedlicka, S.; Rotkin, S. V.

    Raman microscopy is a useful tool for studying the structural characteristics of graphene deposited onto substrates. However, extracting useful information from the Raman spectra requires data processing and 2D map generation. An existing home-built confocal Raman microscope was optimized for graphene samples and programmed to automatically generate Raman spectral maps across a specified area. In particular, an open source data collection scheme was generated to allow the efficient collection and analysis of the Raman spectral data for future use. NSF ECCS-1509786.
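
    The map-generation step reduces each spectrum in the scan grid to a scalar, such as the integrated intensity of a characteristic graphene band, and arranges the scalars as an image. A minimal sketch with synthetic spectra; the grid size, channel count and band window are assumptions for illustration only:

```python
import numpy as np

# Synthetic stand-in for a 20 x 20 scan: one spectrum per grid point,
# 1024 wavenumber channels spanning 1200-3000 cm^-1.
nx, ny, nchan = 20, 20, 1024
wavenumbers = np.linspace(1200, 3000, nchan)
spectra = np.random.default_rng(1).random((nx, ny, nchan))

# Reduce each spectrum to the integrated intensity of the graphene 2D band
# (window around ~2700 cm^-1; the exact limits are illustrative).
band = (wavenumbers > 2600) & (wavenumbers < 2800)
raman_map = spectra[:, :, band].sum(axis=-1)

print(raman_map.shape)   # (20, 20) image ready for plotting as a 2D map
```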

  8. Electron-Muon Ranger: Performance in the MICE muon beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, D.

    2015-12-16

    The Muon Ionization Cooling Experiment (MICE) will perform a detailed study of ionization cooling to evaluate the feasibility of the technique. To carry out this program, MICE requires an efficient particle-identification (PID) system to identify muons. The Electron-Muon Ranger (EMR) is a fully-active tracking-calorimeter that forms part of the PID system and tags muons that traverse the cooling channel without decaying. The detector is capable of identifying electrons with an efficiency of 98.6%, providing a purity for the MICE beam that exceeds 99.8%. The EMR also proved to be a powerful tool for the reconstruction of muon momenta in the range 100–280 MeV/c.

  9. An automated dose tracking system for adaptive radiation therapy.

    PubMed

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient image data were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h, and the manual work was limited to 13 min per case, with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient dose tracking system for ART in the clinical setting is presented. The software and automated processes were rigorously evaluated and validated using patient image datasets. Automation of the various procedures has improved efficiency significantly, allowing for the routine clinical application of ART for improving radiation therapy effectiveness. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Problems in characterizing barrier performance

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1988-01-01

    The barrier is a synchronization construct which is useful for separating a parallel program into parallel sections that are executed in sequence. The completion of a barrier requires cooperation among all executing processes. This requirement not only introduces the wait-for-the-slowest-process delay that is inherent in the definition of the synchronization, but also has implications for the efficient implementation and measurement of barrier performance in different systems. Types of barrier implementation and their relationship to different multiprocessor environments are described. Then the problem of measuring the performance of barrier implementations on specific machine architectures is discussed. The fact that the barrier synchronization requires the cooperation of all processes makes the problem of performance measurement similarly global. Making non-intrusive measurements of sufficient accuracy can be tricky on systems offering only rudimentary measurement tools.
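
    The wait-for-the-slowest-process semantics can be made concrete with a small counting-barrier sketch; Python threads stand in for the cooperating processes, and a production multiprocessor barrier would use the machine's own primitives rather than this illustration.

```python
import threading

class CountingBarrier:
    """All callers block in wait() until the last of `n` arrives."""
    def __init__(self, n):
        self.n, self.count, self.generation = n, 0, 0
        self.cond = threading.Condition()

    def wait(self):
        with self.cond:
            gen = self.generation
            self.count += 1
            if self.count == self.n:          # last arrival releases everyone
                self.count = 0
                self.generation += 1
                self.cond.notify_all()
            else:
                while gen == self.generation: # earlier arrivals wait here
                    self.cond.wait()

barrier = CountingBarrier(4)
threads = [threading.Thread(target=barrier.wait) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print("all four threads passed the barrier")
```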

  11. Design of a medical and laboratory equipment management program for the new standards certification achievement in Mexico.

    PubMed

    Franco-Clark, D; Pimentel-Aguilar, A B; Rodriguez-Vera, R

    2010-01-01

    Certification for healthcare institutions in Mexico is governed by 2009 standards harmonized with the Joint Commission International criteria. Modern healthcare relies on medical equipment and devices, so it has become necessary to implement guidelines for their adequate management in order to reach the highest level of quality and safety at the lowest cost. The objective of this work was to develop a Medical and Laboratory Equipment Management Program, oriented toward improving the quality, effectiveness and efficiency of the technological resources in order to meet the certification requirements. The result of this work is a self-evaluation tool that focuses the efforts of the National Institute for Respiratory Diseases on meeting the new requirements established for certification.

  12. An online network tool for quality information to answer questions about occupational safety and health: usability and applicability.

    PubMed

    Rhebergen, Martijn D F; Hulshof, Carel T J; Lenderink, Annet F; van Dijk, Frank J H

    2010-10-22

    Common information facilities do not always provide the quality information needed to answer questions on health or health-related issues, such as Occupational Safety and Health (OSH) matters. Barriers may be the accessibility, quantity and readability of information. Online Question & Answer (Q&A) network tools, which link questioners directly to experts can overcome some of these barriers. When designing and testing online tools, assessing the usability and applicability is essential. Therefore, the purpose of this study is to assess the usability and applicability of a new online Q&A network tool for answers on OSH questions. We applied a cross-sectional usability test design. Eight occupational health experts and twelve potential questioners from the working population (workers) were purposively selected to include a variety of computer- and internet-experiences. During the test, participants were first observed while executing eight tasks that entailed important features of the tool. In addition, they were interviewed. Through task observations and interviews we assessed applicability, usability (effectiveness, efficiency and satisfaction) and facilitators and barriers in use. Most features were usable, though several could be improved. Most tasks were executed effectively. Some tasks, for example searching stored questions in categories, were not executed efficiently and participants were less satisfied with the corresponding features. Participants' recommendations led to improvements. The tool was found mostly applicable for additional information, to observe new OSH trends and to improve contact between OSH experts and workers. Hosting and support by a trustworthy professional organization, effective implementation campaigns, timely answering and anonymity were seen as important use requirements. This network tool is a promising new strategy for offering company workers high quality information to answer OSH questions. Q&A network tools can be an addition to existing information facilities in the field of OSH, but also to other healthcare fields struggling with how to answer questions from people in practice with high quality information. In the near future, we will focus on the use of the tool and its effects on information and knowledge dissemination.

  13. Modeling Complex Workflow in Molecular Diagnostics

    PubMed Central

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  14. tmBioC: improving interoperability of text-mining tools with BioC.

    PubMed

    Khare, Ritu; Wei, Chih-Hsuan; Mao, Yuqing; Leaman, Robert; Lu, Zhiyong

    2014-01-01

    The lack of interoperability among biomedical text-mining tools is a major bottleneck in creating more complex applications. Despite the availability of numerous methods and techniques for various text-mining tasks, combining different tools requires substantial efforts and time owing to heterogeneity and variety in data formats. In response, BioC is a recent proposal that offers a minimalistic approach to tool interoperability by stipulating minimal changes to existing tools and applications. BioC is a family of XML formats that define how to present text documents and annotations, and also provides easy-to-use functions to read/write documents in the BioC format. In this study, we introduce our text-mining toolkit, which is designed to perform several challenging and significant tasks in the biomedical domain, and repackage the toolkit into BioC to enhance its interoperability. Our toolkit consists of six state-of-the-art tools for named-entity recognition, normalization and annotation (PubTator) of genes (GenNorm), diseases (DNorm), mutations (tmVar), species (SR4GN) and chemicals (tmChem). Although developed within the same group, each tool is designed to process input articles and output annotations in a different format. We modify these tools and enable them to read/write data in the proposed BioC format. We find that, using the BioC family of formats and functions, only minimal changes were required to build the newer versions of the tools. The resulting BioC-wrapped toolkit, which we have named tmBioC, consists of our tools in BioC, an annotated full-text corpus in BioC, and a format detection and conversion tool. Furthermore, through participation in the 2013 BioCreative IV Interoperability Track, we empirically demonstrate that the tools in tmBioC can be more efficiently integrated with each other as well as with external tools: our experimental results show that using BioC reduces the lines of code required for text-mining tool integration by more than 60%. The tmBioC toolkit is publicly available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/. Database URL: http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/Demo/tmTools/. Published by Oxford University Press 2014. This work is written by US Government employees and is in the public domain in the US.
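
    A BioC document is ordinary XML with a small, fixed nesting of collection, document, passage and annotation elements. The sketch below builds a minimal document with the Python standard library; the element and infon names follow the commonly published BioC layout, but the exact fields should be treated as an approximation rather than a normative example of the schema.

```python
import xml.etree.ElementTree as ET

# Minimal BioC-style collection: one document, one passage, one annotation.
collection = ET.Element("collection")
ET.SubElement(collection, "source").text = "PubMed"
doc = ET.SubElement(collection, "document")
ET.SubElement(doc, "id").text = "12345"            # illustrative document id

passage = ET.SubElement(doc, "passage")
ET.SubElement(passage, "offset").text = "0"
ET.SubElement(passage, "text").text = "BRCA1 mutations increase cancer risk."

ann = ET.SubElement(passage, "annotation", id="T1")
ET.SubElement(ann, "infon", key="type").text = "Gene"
ET.SubElement(ann, "location", offset="0", length="5")
ET.SubElement(ann, "text").text = "BRCA1"

print(ET.tostring(collection, encoding="unicode"))
```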

  15. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    PubMed

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .

  16. Educational software usability: Artifact or Design?

    PubMed

    Van Nuland, Sonya E; Eagleson, Roy; Rogers, Kem A

    2017-03-01

    Online educational technologies and e-learning tools are providing new opportunities for students to learn worldwide, and they continue to play an important role in anatomical sciences education. Yet, as we shift to teaching online, particularly within the anatomical sciences, it has become apparent that e-learning tool success is based on more than just user satisfaction and preliminary learning outcomes; rather, it is a multidimensional construct that should be addressed from an integrated perspective. The efficiency, effectiveness and satisfaction with which a user can navigate an e-learning tool is known as usability, and represents a construct which we propose can be used to quantitatively evaluate e-learning tool success. To assess the usability of an e-learning tool, usability testing should be employed during the design and development phases (i.e., prior to its release to users) as well as during its delivery (i.e., following its release to users). However, both the commercial educational software industry and individual academic developers in the anatomical sciences have overlooked the added value of additional usability testing. Reducing learner frustration and anxiety during e-learning tool use is essential in ensuring e-learning tool success, and will require a commitment on the part of the developers to engage in usability testing during all stages of an e-learning tool's life cycle. Anat Sci Educ 10: 190-199. © 2016 American Association of Anatomists.

  17. Understanding Building Infrastructure and Building Operation through DOE Asset Score Model: Lessons Learned from a Pilot Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Na; Goel, Supriya; Gorrissen, Willy J.

    2013-06-24

    The U.S. Department of Energy (DOE) is developing a national voluntary energy asset score system to help building owners evaluate the as-built physical characteristics (including the building envelope and the mechanical and electrical systems) and overall building energy efficiency, independent of occupancy and operational choices. The energy asset score breaks down building energy use information by simulating building performance under typical operating and occupancy conditions for a given use type. The energy asset score tool, a web-based modeling tool, facilitates the implementation of the asset score system. The tool consists of a simplified user interface built on a centralized simulation engine (EnergyPlus). It is intended to reduce the implementation cost for users and to increase modeling standardization compared with an approach that requires users to build their own energy models. A pilot project with forty-two buildings (consisting mostly of offices and schools) was conducted in 2012. This paper reports the findings. Participants were asked to collect a minimum set of building data and enter it into the asset score tool. Participants also provided their utility bills, existing ENERGY STAR scores, and previous energy audit/modeling results if available. The results from the asset score tool were compared with the building energy use data provided by the pilot participants. Three comparisons were performed. First, the actual building energy use, either from the utility bills or via ENERGY STAR Portfolio Manager, was compared with the modeled energy use, to examine how well the energy asset score represents a building's system efficiencies and how well it correlates with a building's actual energy consumption. Second, calibrated building energy models (where they existed) were used to examine any discrepancies between the asset score model and the pilot participant buildings' known energy use patterns; this comparison examined the end-use breakdowns and more detailed time series data. Third, ASHRAE 90.1 prototype buildings were also used as an industry-standard modeling approach to test the accuracy of the asset score tool. Our analysis showed that the asset score tool, which uses simplified building simulation, could provide results comparable to a more detailed energy model. A building's as-built efficiency can be reflected in the energy asset score. An analysis comparing the modeled energy use from the asset score tool with the actual energy use from the utility bills can further inform building owners about the effectiveness of their building's operation and maintenance.

  18. Design of an intelligent information system for in-flight emergency assistance

    NASA Technical Reports Server (NTRS)

    Feyock, Stefan; Karamouzis, Stamos

    1991-01-01

    The present research has as its goal the development of AI tools to help flight crews cope with in-flight malfunctions. The relevant tasks in such situations include diagnosis, prognosis, and recovery plan generation. Investigation of the information requirements of these tasks has shown that the determination of paths figures prominently: what components or systems are connected to what others, how they are connected, whether connections satisfying certain criteria exist, and a number of related queries. The formulation of such queries frequently requires capabilities of the second-order predicate calculus. An information system is described that features second-order logic capabilities and is oriented toward efficient formulation and execution of such queries.
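
    Many of the path queries listed above reduce to reachability over a graph of component connections, possibly restricted by connection type. A minimal sketch over a hypothetical component graph (the component names and connection types are invented for illustration):

```python
from collections import deque

# Hypothetical component connectivity: component -> [(neighbour, connection type)]
system = {
    "fuel_pump": [("engine_1", "fuel_line")],
    "engine_1":  [("generator_1", "shaft")],
    "generator_1": [("bus_A", "electrical")],
    "bus_A": [],
}

def path_exists(src, dst, allowed=None):
    """Is dst reachable from src using only `allowed` connection types?"""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nbr, kind in system.get(node, []):
            if nbr not in seen and (allowed is None or kind in allowed):
                seen.add(nbr)
                queue.append(nbr)
    return False

print(path_exists("fuel_pump", "bus_A"))                     # True
print(path_exists("fuel_pump", "bus_A", allowed={"shaft"}))  # False
```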

  19. Blue and green infrastructures implementation to solve stormwater management issues in a new urban development project - a modelling approach

    NASA Astrophysics Data System (ADS)

    Versini, Pierre-Antoine; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2016-04-01

    Concentrating buildings and socio-economic activities, urban areas are particularly vulnerable to hydrological risks. Changes in climate may intensify existing issues concerning stormwater management (due to impervious areas) and water supply (due to population growth). In this context, water-use efficiency and best water-management practices are key issues in an already stressed urban environment. Blue and green infrastructures are nature-based solutions that combine blue and green systems to provide multifunctional solutions and multiple benefits: increased amenity, urban heat island mitigation, biodiversity, reduced energy requirements, and more. They are particularly efficient in reducing the potential impact of new and existing developments with respect to stormwater and/or water supply issues. The Multi-Hydro distributed rainfall-runoff model is a well-suited tool for managing the impacts of such infrastructures at the urban basin scale. It is a numerical platform that makes several models interact, each of them representing a specific portion of the water cycle in an urban environment: surface runoff and infiltration depending on a land use classification, sub-surface processes, and sewer network drainage. Multi-Hydro is still being developed at the Ecole des Ponts (open access from https://hmco.enpc.fr/Tools-Training/Tools/Multi-Hydro.php) to take into account the wide complexity of urban environments. The latest advancements have made possible the representation of several blue and green infrastructures (green roof, basin, swale). Applied to a new urban development project located in the Paris region, Multi-Hydro has been used to simulate the impact of blue and green infrastructure implementation, with a particular focus on its ability to fulfil regulation rules established by local stormwater managers for connecting the parcel to the sewer network. The results show that a combination of several blue and green infrastructures, if they are widely implemented, could provide an efficient means of meeting regulation rules at the parcel scale.

  20. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  1. VarDetect: a nucleotide sequence variation exploratory tool

    PubMed Central

    Ngamphiw, Chumpol; Kulawonganunchai, Supasak; Assawamakin, Anunchai; Jenwitheesuk, Ekachai; Tongsima, Sissades

    2008-01-01

    Background Single nucleotide polymorphisms (SNPs) are the most commonly studied units of genetic variation. The discovery of such variation may help to identify causative gene mutations in monogenic diseases and SNPs associated with predisposing genes in complex diseases. Accurate detection of SNPs requires software that can correctly interpret chromatogram signals as nucleotides. Results We present VarDetect, a stand-alone nucleotide variation exploratory tool that automatically detects nucleotide variation from fluorescence-based chromatogram traces. Accurate SNP base-calling is achieved using pre-calculated peak content ratios, and is enhanced by rules which account for common sequence reading artifacts. The proposed software tool is benchmarked against four other well-known SNP discovery software tools (PolyPhred, novoSNP, Genalys and Mutation Surveyor) using fluorescence-based chromatograms from 15 human genes. These chromatograms were obtained from sequencing 16 two-pooled DNA samples, a total of 32 individual DNA samples. In this comparison of automatic SNP detection tools, VarDetect achieved the highest detection efficiency. Availability VarDetect is compatible with most major operating systems such as Microsoft Windows, Linux, and Mac OS X. The current version of VarDetect is freely available. PMID:19091032

  2. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  3. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  4. A geographic information system screening tool to tackle diffuse pollution through the use of sustainable drainage systems.

    PubMed

    Todorovic, Zorica; Breton, Neil P

    2014-01-01

    Sustainable drainage systems (SUDS) offer many benefits that traditional solutions do not. Traditional approaches are unable to offer a solution to problems of flood management and water quality. Holistic consideration of the wide range of benefits from SUDS can result in advantages such as improved flood resilience and water quality enhancement through consideration of diffuse pollution sources. Using a geographical information system (GIS) approach, diffuse pollutant sources and opportunities for SUDS are easily identified. Consideration of potential SUDS locations results in source, site and regional controls, leading to improved water quality (to meet Water Framework Directive targets). The paper will discuss two different applications of the tool, the first of which is where the pollutant of interest is known. In this case the outputs of the tool highlight and isolate the areas contributing the pollutants and suggest the adequate SUDS measures to meet the required criteria. The second application is where the tool identifies likely pollutants at a receiving location, and SUDS measures are proposed to reduce pollution with assessed efficiencies.

  5. Progress in development of coated indexable cemented carbide inserts for machining of iron based work piece materials

    NASA Astrophysics Data System (ADS)

    Czettl, C.; Pohler, M.

    2016-03-01

    Increasing demands on the material properties of iron-based workpiece materials, e.g. for the turbine industry, complicate the machining process and reduce the lifetime of the cutting tools. Therefore, improved tool solutions, adapted to the requirements of the desired application, have to be developed. In particular, the interplay of macro- and micro-geometry, substrate material, coating and post-treatment processes is crucial for the durability of modern high-performance tool solutions. Improved and novel analytical methods allow a detailed understanding of the material properties responsible for the wear behaviour of the tools. These support the knowledge-based development of tailored cutting materials for selected applications. One important factor for such a solution is the proper choice of coating material, which can be synthesized by physical or chemical vapor deposition techniques. Within this work, an overview of state-of-the-art coated carbide grades is presented and application examples are shown to demonstrate their high efficiency. Machining processes for a material range from cast iron and low-carbon steels to high-alloyed steels are covered.

  6. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared not to suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as a requirements elicitation method. An information processing model was developed through an analysis of think-aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the mental strategies clinicians employ in deciding on survivor screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  7. Analysis and Simulation of Traffic Control for Resource Management in DVB-Based Broadband Satellite Access Networks

    NASA Astrophysics Data System (ADS)

    Impemba, Ernesto; Inzerilli, Tiziano

    2003-07-01

    Integration of satellite access networks with the Internet is seen as a strategic goal to achieve in order to provide ubiquitous broadband access to Internet services in Next Generation Networks (NGNs). One of the most studied interworking aspects is the efficient management of satellite resources, i.e. bandwidth and buffer space, in order to satisfy the most demanding application requirements for delay control and bandwidth assurance. In this context, resource management in DVB-S/DVB-RCS satellite technologies, emerging technologies for broadband satellite access and transport of IP applications, is a research issue widely investigated as a means to provide efficient bi-directional communications across satellites. This is in particular one of the principal goals of the SATIP6 project, sponsored within the 5th EU Research Programme Framework, i.e. IST. In this paper we present a possible approach to efficiently exploit bandwidth, the most critical resource in a broadband satellite access network, while satisfying delay and bandwidth requirements for applications with guaranteed QoS, through a traffic control architecture to be implemented in ground terminals. The performance of this approach is assessed in terms of efficient exploitation of the uplink bandwidth and of the differentiation and minimization of queuing delays for the most demanding applications over a time-varying capacity. OPNET simulations are used as the analysis tool.

  8. Computer assisted audit techniques for UNIX (UNIX-CAATS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polk, W.T.

    1991-12-31

    Federal and DOE regulations impose specific requirements for internal controls of computer systems. These controls include adequate separation of duties and sufficient controls for access to systems and data. The DOE Inspector General's Office has the responsibility to examine internal controls, as well as the efficient use of computer system resources. As a result, DOE supported NIST development of computer assisted audit techniques to examine BSD UNIX computers (UNIX-CAATS). These systems were selected due to the increasing number of UNIX workstations in use within DOE. This paper describes the design and development of these techniques, as well as the results of testing at NIST and the first audit at a DOE site. UNIX-CAATS consists of tools which examine the security of passwords, file systems, and network access. In addition, a tool was developed to examine the efficiency of disk utilization. Test results at NIST indicated inadequate password management, as well as weak network resource controls. File system security was considered adequate. Audit results at a DOE site indicated weak password management and inefficient disk utilization. During the audit, we also found that improvements to UNIX-CAATS were needed when applied to large systems. NIST plans to enhance the techniques developed for DOE/IG in future work. This future work would leverage currently available tools, along with needed enhancements. These enhancements would enable DOE/IG to audit large systems, such as supercomputers.
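
    One representative file-system check such a toolkit performs, flagging world-writable files, can be sketched in a few lines. This is a generic illustration rather than UNIX-CAATS itself, which targeted BSD UNIX with its own tooling:

```python
import os, stat, sys

def world_writable(root):
    """Yield paths under `root` that any user may write to."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.stat(path).st_mode
            except OSError:
                continue                      # unreadable or vanished file
            if mode & stat.S_IWOTH:           # "other" write bit set
                yield path

for p in world_writable(sys.argv[1] if len(sys.argv) > 1 else "."):
    print("world-writable:", p)
```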

  10. Improved regional-scale Brazilian cropping systems' mapping based on a semi-automatic object-based clustering approach

    NASA Astrophysics Data System (ADS)

    Bellón, Beatriz; Bégué, Agnès; Lo Seen, Danny; Lebourgeois, Valentine; Evangelista, Balbino Antônio; Simões, Margareth; Demonte Ferraz, Rodrigo Peçanha

    2018-06-01

    Cropping systems' maps at fine scale over large areas provide key information for further agricultural production and environmental impact assessments, and thus represent a valuable tool for effective land-use planning. There is, therefore, a growing interest in mapping cropping systems in an operational manner over large areas, and remote sensing approaches based on vegetation index time series analysis have proven to be an efficient tool. However, supervised pixel-based approaches are commonly adopted, requiring resource-consuming field campaigns to gather training data. In this paper, we present a new object-based unsupervised classification approach tested on an annual MODIS 16-day composite Normalized Difference Vegetation Index time series and a Landsat 8 mosaic of the State of Tocantins, Brazil, for the 2014-2015 growing season. Two variants of the approach are compared: a hyperclustering approach, and a landscape-clustering approach involving a prior stratification of the study area into landscape units on which the clustering is then performed. The main cropping systems of Tocantins, characterized by the crop types and cropping patterns, were efficiently mapped with the landscape-clustering approach. Results show that stratification prior to clustering significantly improves the classification accuracies for underrepresented and sparsely distributed cropping systems. This study illustrates the potential of unsupervised classification for large-area cropping systems' mapping and contributes to the development of generic tools for supporting large-scale agricultural monitoring across regions.
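
    The unsupervised step, grouping per-object NDVI time series into candidate cropping-system classes, can be sketched with k-means. The profiles below are synthetic, and the number of clusters and the 23-date annual composite length are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in: one mean NDVI time series per landscape object
# (23 dates of an annual 16-day composite).
rng = np.random.default_rng(42)
n_objects, n_dates = 300, 23
ndvi = rng.random((n_objects, n_dates))

# Unsupervised grouping of objects into candidate cropping-system classes.
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(ndvi)
print(np.bincount(labels))   # number of objects per cluster
```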

  11. An Efficient, Rapid, and Recyclable System for CRISPR-Mediated Genome Editing in Candida albicans.

    PubMed

    Nguyen, Namkha; Quail, Morgan M F; Hernday, Aaron D

    2017-01-01

    Candida albicans is the most common fungal pathogen of humans. Historically, molecular genetic analysis of this important pathogen has been hampered by the lack of stable plasmids or meiotic cell division, limited selectable markers, and inefficient methods for generating gene knockouts. The recent development of clustered regularly interspaced short palindromic repeat(s) (CRISPR)-based tools for use with C. albicans has opened the door to more efficient genome editing; however, previously reported systems have specific limitations. We report the development of an optimized CRISPR-based genome editing system for use with C. albicans. Our system is highly efficient, does not require molecular cloning, does not leave permanent markers in the genome, and supports rapid, precise genome editing in C. albicans. We also demonstrate the utility of our system for generating two independent homozygous gene knockouts in a single transformation and present a method for generating homozygous wild-type gene addbacks at the native locus. Furthermore, each step of our protocol is compatible with high-throughput strain engineering approaches, thus opening the door to the generation of a complete C. albicans gene knockout library. IMPORTANCE Candida albicans is the major fungal pathogen of humans and is the subject of intense biomedical and discovery research. Until recently, the pace of research in this field has been hampered by the lack of efficient methods for genome editing. We report the development of a highly efficient and flexible genome editing system for use with C. albicans. This system improves upon previously published C. albicans CRISPR systems and enables rapid, precise genome editing without the use of permanent markers. This new tool kit promises to expedite the pace of research on this important fungal pathogen.

  12. On-Line Tool for the Assessment of Radiation in Space - Deep Space Mission Enhancements

    NASA Technical Reports Server (NTRS)

    Sandridge, Chris a.; Blattnig, Steve R.; Norman, Ryan B.; Slaba, Tony C.; Walker, Steve A.; Spangler, Jan L.

    2011-01-01

    The On-Line Tool for the Assessment of Radiation in Space (OLTARIS, https://oltaris.nasa.gov) is a web-based set of tools and models that allows engineers and scientists to assess the effects of space radiation on spacecraft, habitats, rovers, and spacesuits. The site is intended to be a design tool for those studying the effects of space radiation for current and future missions as well as a research tool for those developing advanced material and shielding concepts. The tools and models are built around the HZETRN radiation transport code and are primarily focused on human- and electronic-related responses. The focus of this paper is to highlight new capabilities that have been added to support deep space (outside Low Earth Orbit) missions. Specifically, the electron, proton, and heavy ion design environments for the Europa mission have been incorporated along with an efficient coupled electron-photon transport capability to enable the analysis of complicated geometries and slabs exposed to these environments. In addition, a neutron albedo lunar surface environment was also added, that will be of value for the analysis of surface habitats. These updates will be discussed in terms of their implementation and how OLTARIS can be used by instrument vendors, mission designers, and researchers to analyze their specific requirements.

  13. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    NASA Astrophysics Data System (ADS)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of data acquisition, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to make large-scale simulation and analysis work commonplace. These tools assist in everything from generation/procurement of data (HTAR/Globus) to automated publication of results to portals like the Earth Systems Grid Federation (ESGF), while executing everything in between in a scalable, task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.
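
    The task-parallel pattern described, with each MPI rank analyzing its own share of files, is simple to express with mpi4py. The file list and per-file analysis below are hypothetical placeholders:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Hypothetical list of simulation output files to analyze.
files = [f"cascade_output_{i:04d}.nc" for i in range(100)]

def analyze(path):
    # Placeholder for the real per-file analysis (statistics, extremes, ...).
    return len(path)

# Static round-robin assignment: rank r handles files r, r+size, r+2*size, ...
local_results = [analyze(f) for f in files[rank::size]]

# Gather per-rank results on rank 0 for publication/archiving.
all_results = comm.gather(local_results, root=0)
if rank == 0:
    print(sum(len(r) for r in all_results), "files processed in total")
```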

  14. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values that maximize the liquefaction yield of the plant, subject to constraints on the other parameters. The analysis results give a clear idea of the parameter values to choose before the actual plant is implemented in the field. They also give an idea of the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
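
    For a simple Linde-Hampson cycle, the cold-box energy balance gives the liquid yield directly from three enthalpies, which is essentially what the flowsheet evaluates once it converges. A minimal sketch; the enthalpy values are illustrative placeholders, not property data:

```python
def linde_liquid_yield(h_hp_in, h_lp_out, h_liquid):
    """Fraction of the compressed stream liquefied in an ideal Linde-Hampson cycle.

    Energy balance over the heat exchanger, JT valve and separator:
        h_hp_in = y * h_liquid + (1 - y) * h_lp_out
    """
    return (h_lp_out - h_hp_in) / (h_lp_out - h_liquid)

# Illustrative enthalpies in kJ/kg (placeholders, not property data):
# high-pressure gas entering the cold box, low-pressure gas returning at the
# warm end, and saturated liquid at the separator pressure.
print(f"liquid yield y = {linde_liquid_yield(430.0, 460.0, 20.0):.3f}")
```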

  15. Electric power market agent design

    NASA Astrophysics Data System (ADS)

    Oh, Hyungseon

    The electric power industry in many countries has been restructured in the hope of a more economically efficient system. In the restructured system, traditional operating and planning tools based on true marginal cost do not perform well, since the required information is strictly confidential. Developing a new tool therefore requires an understanding of offer behavior. The main objective of this study is to create a new tool for power system planning. For this purpose, this dissertation develops models for a market and market participants. A new model is developed in this work for explaining a supply-side offer curve, and several variables are introduced to characterize the curve. Demand is estimated using a neural network, and a numerical optimization process is used to determine the values of the variables that maximize the profit of the agent. The amount of data required for the optimization is chosen with the aid of nonlinear dynamics. To suggest an optimal demand-side bidding function, two optimization problems are constructed and solved for maximizing consumer satisfaction based on the properties of two different types of demand: price-based demand and must-be-served demand. Several different simulations are performed to test how an agent reacts in various situations. The offer behavior depends on locational benefit as well as the offer strategies of competitors.

  16. Benefits of Efficient Windows | Efficient Windows Collaborative

    Science.gov Websites

  17. Inspection planning development: An evolutionary approach using reliability engineering as a tool

    NASA Technical Reports Server (NTRS)

    Graf, David A.; Huang, Zhaofeng

    1994-01-01

    This paper proposes an evolutionary approach to inspection planning which introduces various reliability engineering tools into the process and assesses system trade-offs among reliability, engineering requirements, manufacturing capability and inspection cost to establish an optimal inspection plan. The examples presented in the paper illustrate some advantages and benefits of the new approach. Through the analysis, reliability and engineering impacts due to manufacturing process capability and inspection uncertainty are clearly understood; the most cost-effective and efficient inspection plan can be established and the associated risks are well controlled; some inspection reductions and relaxations are well justified; and design feedback and changes may be initiated from the analysis conclusions to further enhance reliability and reduce cost. The approach is particularly promising as global competition and customer expectations for quality improvement rapidly increase.

  18. Comparison of clinical knowledge bases for summarization of electronic health records.

    PubMed

    McCoy, Allison B; Sittig, Dean F; Wright, Adam

    2013-01-01

    Automated summarization tools that create condition-specific displays may improve clinician efficiency. These tools require new kinds of knowledge that are difficult to obtain. We compared five problem-medication pair knowledge bases generated using four previously described knowledge base development approaches. The number of pairs in the resulting mapped knowledge bases varied widely due to differing mapping techniques from the source terminologies, ranging from 2,873 to 63,977,738 pairs. The number of overlapping pairs across knowledge bases was low, with one knowledge base having half of its pairs overlapping with another knowledge base, and most having less than a third overlapping. Further research is necessary to better evaluate the knowledge bases independently in additional settings, and to identify methods to integrate the knowledge bases.
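
    As an illustration of the kind of comparison reported here, the sketch below computes the pairwise overlap between knowledge bases represented as sets of (problem, medication) tuples. The knowledge base names and pairs are invented for illustration and are not the knowledge bases evaluated in the study.

      from itertools import combinations

      # Toy problem-medication pair knowledge bases (illustrative only).
      kbs = {
          "kb_a": {("hypertension", "lisinopril"), ("diabetes", "metformin")},
          "kb_b": {("hypertension", "lisinopril"), ("asthma", "albuterol")},
          "kb_c": {("diabetes", "metformin"), ("asthma", "albuterol"),
                   ("hypertension", "amlodipine")},
      }

      # For every pair of knowledge bases, report the number of shared
      # problem-medication pairs and the fraction of the smaller KB they cover.
      for (name1, kb1), (name2, kb2) in combinations(kbs.items(), 2):
          shared = kb1 & kb2
          frac = len(shared) / min(len(kb1), len(kb2))
          print(f"{name1} vs {name2}: {len(shared)} shared pairs "
                f"({frac:.0%} of the smaller knowledge base)")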

  19. Materials Genome Initiative

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    The Materials Genome Initiative (MGI) project element is a cross-Center effort that is focused on the integration of computational tools to simulate manufacturing processes and materials behavior. These computational simulations will be utilized to gain understanding of processes and materials behavior to accelerate process development and certification, to more efficiently integrate new materials in existing NASA projects, and to lead to the design of new materials for improved performance. This NASA effort looks to collaborate with efforts at other government agencies and universities working under the national MGI. MGI plans to develop integrated computational/experimental/processing methodologies for accelerating discovery and insertion of materials to satisfy NASA's unique mission demands. The challenges include validated design tools that incorporate materials properties, processes, and design requirements; and materials process control to rapidly mature emerging manufacturing methods and develop certified manufacturing processes.

  20. New endoscopic and cytologic tools for cancer surveillance in the digestive tract

    PubMed Central

    Brentnall, Teresa A.; Dominitz, Jason A.

    2009-01-01

    Synopsis Cancer surveillance is an increasing part of everyday practice in gastrointestinal endoscopy due to the identification of high risk groups from genetic and biomarker testing, genealogic and epidemiologic studies, and the increasing number of cancer survivors. An efficient surveillance program requires a cost-effective means for image-guided cancer detection and biopsy. A laser-based tethered-capsule endoscope with enhanced spectral imaging is introduced for unsedated surveillance of the lower esophagus. An ultrathin version of this same endoscope technology provides a 1.2-mm guidewire with imaging capability and cannula-style tools are proposed for image-guided biopsy. Advanced 3D cell visualization techniques are described for increasing the sensitivity of early cancer diagnosis from hematoxylin-stained cells sampled from the pancreatic and biliary ducts. PMID:19423026

  1. A new model for programming software in body sensor networks.

    PubMed

    de A Barbosa, Talles M G; Sene, Iwens G; da Rocha, Adson F; de O Nascimento, Francisco A A; Carvalho, Joao L A; Carvalho, Hervaldo S

    2007-01-01

    A Body Sensor Network (BSN) must be designed to work autonomously. On the other hand, BSNs need mechanisms that allow changes in their behavior in order to become a clinically useful tool. The purpose of this paper is to present a new programming model that will be useful for programming BSN sensor nodes. This model is based on an intelligent intermediate-level compiler. The main purpose of the proposed compiler is to increase the efficiency of system use and to increase the lifetime of the application, considering its requirements, hardware possibilities and specialist knowledge. With this model, it is possible to maintain the autonomous operation capability of the BSN and still offer tools that allow users with little grasp of programming techniques to program these systems.

  2. Microplastic Exposure Assessment in Aquatic Environments: Learning from Similarities and Differences to Engineered Nanoparticles.

    PubMed

    Hüffer, Thorsten; Praetorius, Antonia; Wagner, Stephan; von der Kammer, Frank; Hofmann, Thilo

    2017-03-07

    Microplastics (MPs) have been identified as contaminants of emerging concern in aquatic environments and research into their behavior and fate has been increasing sharply in recent years. Nevertheless, significant gaps remain in our understanding of several crucial aspects of MP exposure and risk assessment, including the quantification of emissions, dominant fate processes, the types of analytical tools required for characterization and monitoring, and adequate laboratory protocols for analysis and hazard testing. This Feature aims at identifying transferable knowledge and experience from engineered nanoparticle (ENP) exposure assessment. This is achieved by comparing ENPs and MPs based on their similarities as particulate contaminants, while critically discussing specific differences. We also highlight the most pressing research priorities to support an efficient development of tools and methods for MP environmental risk assessment.

  3. [Tools for laparoscopic skill development - available trainers and simulators].

    PubMed

    Jaksa, László; Haidegger, Tamás; Galambos, Péter; Kiss, Rita

    2017-10-01

    The laparoscopic minimally invasive surgical technique is widely employed on a global scale. However, the efficient and ethical teaching of this technique requires equipment for surgical simulation. These educational devices are present on the market in the form of box trainers and virtual reality simulators, or some combination of those. In this article, we present a systematic overview of commercially available surgical simulators describing the most important features of each product. Our overview elaborates on box trainers and virtual reality simulators, and also touches on surgical robotics simulators, together with operating room workflow simulators, for the sake of completeness. Apart from presenting educational tools, we evaluated the literature of laparoscopic surgical education and simulation, to provide a complete picture of the unfolding trends in this field. Orv Hetil. 2017; 158(40): 1570-1576.

  4. Image Navigation and Registration Performance Assessment Evaluation Tools for GOES-R ABI and GLM

    NASA Technical Reports Server (NTRS)

    Houchin, Scott; Porter, Brian; Graybill, Justin; Slingerland, Philip

    2017-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24 hour evaluation period. This paper describes the software design and implementation of IPATS and provides preliminary test results.
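
    The abstract does not state which registration algorithm IPATS uses; one common technique for fast sub-image registration is FFT-based phase correlation, sketched below as an illustration only. The function name, tile sizes and test data are assumptions for the example, not part of IPATS.

      import numpy as np

      def estimate_shift(reference, target):
          """Estimate the integer (row, col) shift that maps `reference` onto
          `target` using phase correlation (FFT-based)."""
          f_ref = np.fft.fft2(reference)
          f_tgt = np.fft.fft2(target)
          cross_power = f_tgt * np.conj(f_ref)
          cross_power /= np.abs(cross_power) + 1e-12   # keep phase only
          correlation = np.fft.ifft2(cross_power).real
          peak = np.unravel_index(np.argmax(correlation), correlation.shape)
          # Wrap shifts larger than half the tile size to negative values.
          shifts = [p if p <= s // 2 else p - s
                    for p, s in zip(peak, correlation.shape)]
          return tuple(shifts)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          ref = rng.random((64, 64))
          shifted = np.roll(ref, shift=(3, -5), axis=(0, 1))  # known offset
          print("estimated (row, col) shift:", estimate_shift(ref, shifted))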

  5. [Managing a health research institute: towards research excellence through continuous improvement].

    PubMed

    Olmedo, Carmen; Buño, Ismael; Plá, Rosa; Lomba, Irene; Bardinet, Thierry; Bañares, Rafael

    2015-01-01

    Health research institutes are a strategic commitment and are considered the ideal environment in which to develop excellence in translational research. Achieving quality research requires not only a powerful scientific and research structure but also the quality and integrity of the management systems that support it. The essential instruments in our institution were solid strategic planning integrated into and consistent with the quality management system, systematic evaluation through periodic indicators, measurement of key user satisfaction and internal audits, and implementation of an innovative information management tool. The implemented management tools have provided a strategic thrust to our institute while ensuring a level of quality and efficiency in the development and management of research that allows progress towards excellence in biomedical research. Copyright © 2015 SESPAS. Published by Elsevier Espana. All rights reserved.

  6. Tool Efficiency Analysis model research in SEMI industry

    NASA Astrophysics Data System (ADS)

    Lei, Ma; Nana, Zhang; Zhongqiu, Zhang

    2018-06-01

    One of the key goals in the SEMI industry is to improve equipment throughput and maximize equipment production efficiency. This paper builds on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states, and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully; it produced the parameter values used to measure equipment performance, along with suggestions for improvement.
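
    A minimal sketch of the finite-state-machine idea behind such a model is shown below. The state names and allowed transitions loosely follow SEMI E10-style equipment state categories but are assumptions for illustration, not the rules defined in the paper.

      from datetime import datetime, timedelta

      # Illustrative tool states and legal transitions (assumed, E10-like).
      ALLOWED_TRANSITIONS = {
          "STANDBY":          {"PRODUCTIVE", "SCHEDULED_DOWN", "UNSCHEDULED_DOWN"},
          "PRODUCTIVE":       {"STANDBY", "UNSCHEDULED_DOWN"},
          "SCHEDULED_DOWN":   {"STANDBY"},
          "UNSCHEDULED_DOWN": {"STANDBY"},
      }

      def accumulate_state_times(events):
          """Walk a time-ordered list of (timestamp, new_state) events,
          validate each transition, and return total time spent per state."""
          totals = {state: timedelta() for state in ALLOWED_TRANSITIONS}
          t_prev, state_prev = events[0]
          for t, state in events[1:]:
              if state not in ALLOWED_TRANSITIONS[state_prev]:
                  raise ValueError(f"illegal transition {state_prev} -> {state}")
              totals[state_prev] += t - t_prev
              t_prev, state_prev = t, state
          return totals

      if __name__ == "__main__":
          t0 = datetime(2018, 6, 1, 8, 0)
          log = [(t0, "STANDBY"),
                 (t0 + timedelta(hours=1), "PRODUCTIVE"),
                 (t0 + timedelta(hours=7), "UNSCHEDULED_DOWN"),
                 (t0 + timedelta(hours=8), "STANDBY"),
                 (t0 + timedelta(hours=9), "PRODUCTIVE"),
                 (t0 + timedelta(hours=12), "STANDBY")]
          totals = accumulate_state_times(log)
          tracked = sum(totals.values(), timedelta())
          print("utilization =", totals["PRODUCTIVE"] / tracked)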

  7. Patient care transitions from the emergency department to the medicine ward: evaluation of a standardized electronic signout tool.

    PubMed

    Gonzalo, Jed D; Yang, Julius J; Stuckey, Heather L; Fischer, Christopher M; Sanchez, Leon D; Herzig, Shoshana J

    2014-08-01

    To evaluate the impact of a new electronic handoff tool for emergency department to medicine ward patient transfers over a 1-year period. Prospective mixed-methods analysis of data submitted by medicine residents following admitting shifts before and after eSignout implementation. University-based, tertiary-care hospital. Internal medicine resident physicians admitting patients from the emergency department. An electronic handoff tool (eSignout) utilizing automated paging communication and responsibility acceptance without mandatory verbal communication between emergency department and medicine ward providers. (i) Incidence of reported near misses/adverse events, (ii) communication of key clinical information and quality of verbal communication and (iii) characterization of near misses/adverse events. Seventy-eight of 80 surveys (98%) and 1058 of 1388 surveys (76%) were completed before and after eSignout implementation. Compared with pre-intervention, residents in the post-intervention period reported similar number of shifts with a near miss/adverse event (10.3 vs. 7.8%; P = 0.27), similar communication of key clinical information, and improved verbal signout quality, when it occurred. Compared with the former process requiring mandatory verbal communication, 93% believed the eSignout was more efficient and 61% preferred the eSignout. Patient safety issues related to perceived sufficiency/accuracy of diagnosis, treatment or disposition, and information quality. The eSignout was perceived as more efficient and preferred over the mandatory verbal signout process. Rates of reported adverse events were similar before and after the intervention. Our experience suggests electronic platforms with optional verbal communication can be used to standardize and improve the perceived efficiency of patient handoffs. © The Author 2014. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  8. The potential for using canopy spectral reflectance as an indirect selection tool for yield improvement in winter wheat

    NASA Astrophysics Data System (ADS)

    Prasad, Bishwajit

    Scope and methods of study. Complementing breeding efforts by deploying alternative methods of identifying higher-yielding genotypes in a wheat breeding program is important for obtaining greater genetic gains. Spectral reflectance indices (SRI) are one of many indirect selection tools that have been reported to be associated with different physiological processes of wheat. A total of five experiments (a set of 25 released cultivars from winter wheat breeding programs of the U.S. Great Plains and four populations of randomly derived recombinant inbred lines with 25 entries in each population) were conducted over two years under Great Plains rainfed winter wheat environments at Oklahoma State University research farms. Grain yield was measured in each experiment and biomass was measured in three experiments at three growth stages (booting, heading, and grainfilling). Canopy spectral reflectance was measured at three growth stages and eleven SRI were calculated. Correlations (phenotypic and genetic) between grain yield and SRI and between biomass and SRI, broad-sense heritability of the SRI and yield, response to selection and correlated response, relative selection efficiency of the SRI, and efficiency in selecting the higher-yielding genotypes by the SRI were assessed. Findings and conclusions. The genetic correlation coefficients revealed that the water-based near-infrared indices (WI and NWI) were strongly associated with grain yield and biomass production. The regression analysis detected a linear relationship between the water-based indices and grain yield and biomass. The two newly developed indices (NWI-3 and NWI-4) gave higher broad-sense heritability than grain yield, higher direct response to selection compared to grain yield, correlated response equal to or higher than the direct response for grain yield, relative selection efficiency greater than one, and higher efficiency in selecting higher-yielding genotypes. Based on the overall genetic analysis required to establish any trait as an efficient indirect selection tool, the water-based SRI (especially NWI-3 and NWI-4) have the potential to complement classical breeding efforts for selecting genotypes with higher yield potential in a winter wheat breeding program.
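
    For readers unfamiliar with these indices, the sketch below computes water-based normalized indices from canopy reflectance at selected near-infrared wavelengths and correlates them with grain yield. The band pairs follow commonly cited formulations of NWI-1 through NWI-4 (e.g., (R970 - R900)/(R970 + R900) for NWI-1) and should be checked against the original papers; the reflectance spectra and yield values are invented for illustration only.

      import numpy as np

      # Commonly cited band pairs (nm) for the water-based normalized indices;
      # these definitions are assumptions for illustration.
      NWI_BANDS = {"NWI-1": (970, 900), "NWI-2": (970, 850),
                   "NWI-3": (970, 880), "NWI-4": (970, 920)}

      def normalized_index(reflectance, band_a, band_b):
          """(R_a - R_b) / (R_a + R_b) for one plot; `reflectance` maps
          wavelength (nm) -> canopy reflectance."""
          r_a, r_b = reflectance[band_a], reflectance[band_b]
          return (r_a - r_b) / (r_a + r_b)

      if __name__ == "__main__":
          # Invented canopy spectra and grain yields (t/ha) for five plots.
          plots = [
              ({850: 0.52, 880: 0.51, 900: 0.50, 920: 0.49, 970: 0.42}, 3.1),
              ({850: 0.55, 880: 0.54, 900: 0.53, 920: 0.52, 970: 0.43}, 3.8),
              ({850: 0.58, 880: 0.57, 900: 0.56, 920: 0.55, 970: 0.44}, 4.2),
              ({850: 0.60, 880: 0.59, 900: 0.58, 920: 0.57, 970: 0.44}, 4.6),
              ({850: 0.63, 880: 0.62, 900: 0.61, 920: 0.60, 970: 0.45}, 5.0),
          ]
          yields = np.array([y for _, y in plots])
          for name, (a, b) in NWI_BANDS.items():
              index = np.array([normalized_index(r, a, b) for r, _ in plots])
              r = np.corrcoef(index, yields)[0, 1]
              print(f"{name}: phenotypic correlation with yield r = {r:.2f}")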

  9. Langley Ground Facilities and Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Ambur, Damodar R.; Kegelman, Jerome T.; Kilgore, William A.

    2010-01-01

    A strategic approach for retaining and more efficiently operating the essential Langley Ground Testing Facilities in the 21st Century is presented. This effort takes advantage of the previously completed and ongoing studies at the Agency and National levels. This integrated approach takes into consideration the overall decline in the test business base within the nation and the reduced utilization of each of the Langley facilities with capabilities to test in the subsonic, transonic, supersonic, and hypersonic speed regimes. The strategy accounts for the capability needs to meet the Agency's programmatic requirements and strategic goals and to execute test activities in the most efficient and flexible facility operating structure. The structure currently being implemented at Langley offers agility to right-size our capability and capacity from a national perspective, to accommodate the dynamic nature of the testing needs, and to address the influence of existing and emerging analytical tools for design. The paradigm for testing in the retained facilities is to efficiently and reliably provide more accurate and high-quality test results at an affordable cost, to support design information needs for flight regimes where the computational capability is not adequate, and to verify and validate existing and emerging computational tools. Each of the above goals is planned to be achieved while keeping in mind the increasing small-industry customer base engaged in developing unpiloted aerial vehicles and commercial space transportation systems.

  10. 3Drefine: an interactive web server for efficient protein structure refinement.

    PubMed

    Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin

    2016-07-08

    3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen bonding network combined with atomic-level energy minimization of the optimized model using composite physics- and knowledge-based force fields for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through a text or file input submission, email notification, and provided example submissions, and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet, which is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Quality and Efficiency Improvement Tools for Every Radiologist.

    PubMed

    Kudla, Alexei U; Brook, Olga R

    2018-06-01

    In an era of value-based medicine, data-driven quality improvement is more important than ever to ensure safe and efficient imaging services. Familiarity with high-value tools enables all radiologists to successfully engage in quality and efficiency improvement. In this article, we review the model for improvement, strategies for measurement, and common practical tools with real-life examples that include Run chart, Control chart (Shewhart chart), Fishbone (Cause-and-Effect or Ishikawa) diagram, Pareto chart, 5 Whys, and Root Cause Analysis. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
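
    As a concrete example of one of these tools, the sketch below computes the center line and control limits for an individuals (Shewhart-type) control chart of a made-up weekly report-turnaround metric. The three-sigma limits use the standard moving-range estimate of process variation; all numbers are illustrative rather than taken from the article.

      import statistics

      def individuals_chart_limits(values):
          """Center line and 3-sigma limits for an individuals (I) chart,
          using the average moving range to estimate process sigma."""
          center = statistics.fmean(values)
          moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
          mr_bar = statistics.fmean(moving_ranges)
          sigma = mr_bar / 1.128          # d2 constant for subgroup size 2
          return center, center - 3 * sigma, center + 3 * sigma

      if __name__ == "__main__":
          # Invented weekly average report-turnaround times (hours).
          turnaround = [5.2, 4.8, 5.5, 5.1, 4.9, 5.3, 5.0, 6.9, 5.2, 4.7]
          cl, lcl, ucl = individuals_chart_limits(turnaround)
          print(f"CL = {cl:.2f} h, LCL = {lcl:.2f} h, UCL = {ucl:.2f} h")
          for week, value in enumerate(turnaround, start=1):
              flag = "  <-- investigate" if not (lcl <= value <= ucl) else ""
              print(f"week {week:2d}: {value:.1f}{flag}")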

  12. SAVANT: Solar Array Verification and Analysis Tool Demonstrated

    NASA Technical Reports Server (NTRS)

    Chock, Ricaurte

    2000-01-01

    The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.

  13. Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding

    NASA Astrophysics Data System (ADS)

    Güpner, Michael; Patschger, Andreas; Bliedtner, Jens

    Conventionally manufactured tools are often constructed entirely of a high-alloyed, expensive tool steel. An alternative way to manufacture tools is the combination of a cost-efficient, mild steel and a functional coating in the interaction zone of the tool. Thermal processing methods, like laser metal deposition, are always characterized by thermal distortion. The resistance against thermal distortion decreases with the reduction of the material thickness. As a consequence, special process management is necessary for the laser-based coating of thin parts or tools. The experimental approach in the present paper is to keep the energy and the mass per unit length constant by varying the laser power, the feed rate and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits and evaluate the process efficiency. Ways to optimize dilution, angular distortion and clad height are presented.
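
    The constraint described here is simple to state quantitatively: energy per unit length is laser power divided by feed rate, and mass per unit length is powder mass flow divided by feed rate. The sketch below, with invented parameter values, shows how power and powder flow must scale proportionally with feed rate to hold both quantities constant.

      # Keep energy per unit length (J/mm) and mass per unit length (g/mm)
      # constant while the feed rate varies, by scaling laser power and
      # powder mass flow proportionally. All values are illustrative only.

      def derived_parameters(laser_power_w, feed_rate_mm_s, powder_flow_g_min):
          energy_per_length = laser_power_w / feed_rate_mm_s             # J/mm
          mass_per_length = (powder_flow_g_min / 60.0) / feed_rate_mm_s  # g/mm
          return energy_per_length, mass_per_length

      if __name__ == "__main__":
          base_power, base_feed, base_flow = 600.0, 10.0, 6.0
          for scale in (0.5, 1.0, 1.5, 2.0):
              power = base_power * scale      # W
              feed = base_feed * scale        # mm/s
              flow = base_flow * scale        # g/min
              e_l, m_l = derived_parameters(power, feed, flow)
              print(f"P={power:6.1f} W  v={feed:4.1f} mm/s  mdot={flow:4.1f} g/min"
                    f"  -> E/L={e_l:.1f} J/mm, m/L={m_l:.3f} g/mm")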

  14. Identifying and Tracing User Needs

    NASA Astrophysics Data System (ADS)

    To, C.; Tauer, E.

    2017-12-01

    Providing adequate tools to the user community hinges on reaching the specific goals and needs behind the intended application of the tool. While the approach of leveraging user-supplied inputs and use cases to identify those goals is not new, there frequently remains the challenge of tracing those use cases through to implementation in an efficient and manageable fashion. Processes can become overcomplicated very quickly, and additionally, explicitly mapping progress towards the achievement of the user demands can become overwhelming when hundreds of use-cases are at play. This presentation will discuss a demonstrated use-case approach that has achieved an initial success with a tool re-design and deployment, the means to apply use cases in the generation of a roadmap for future releases over time, and the ability to include and adjust to new user requirements and suggestions with minimal disruption to the traceability. It is hoped that the findings and lessons learned will help make use case employment easier for others seeking to create user-targeted capabilities.

  15. Advanced Scientific Computing Research Exascale Requirements Review. An Office of Science review sponsored by Advanced Scientific Computing Research, September 27-29, 2016, Rockville, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almgren, Ann; DeMar, Phil; Vetter, Jeffrey

    The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy's (DOE's) Office of Advanced Scientific Computing and Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.

  16. Distribution and Validation of CERES Irradiance Global Data Products Via Web Based Tools

    NASA Technical Reports Server (NTRS)

    Rutan, David; Mitrescu, Cristian; Doelling, David; Kato, Seiji

    2016-01-01

    The CERES SYN1deg product provides climate-quality, 3-hourly, globally gridded and temporally complete maps of top-of-atmosphere, in-atmosphere, and surface fluxes. This product requires efficient release to the public and validation to maintain quality assurance. The CERES team developed web tools for the distribution of both the global gridded products and the grid boxes that contain long-term validation sites maintaining high-quality flux observations at the Earth's surface. These are found at: http://ceres.larc.nasa.gov/order_data.php. In this poster we explore the various tools available to users to subset, download, and validate the SYN1deg and Surface-EBAF products against surface observations. We also analyze differences found in long-term records from well-maintained land surface sites such as the ARM central facility and from high-quality buoy radiometers, which due to their isolated nature cannot be maintained in a manner similar to their land-based counterparts.

  17. Challenges and methodology for indexing the computerized patient record.

    PubMed

    Ehrler, Frédéric; Ruch, Patrick; Geissbuhler, Antoine; Lovis, Christian

    2007-01-01

    Patient records contain the most crucial documents for managing the treatment and healthcare of patients in the hospital. Retrieving information from these records in an easy, quick and safe way helps care providers save time and find important facts about their patients' health. This paper presents the scalability issues induced by the indexing and retrieval of the information contained in patient records. For this study, EasyIR, an information retrieval tool that performs full-text queries and retrieves the related documents, was used. An evaluation of the performance reveals that the indexing process suffers from overhead as a consequence of the particular structure of the patient records. Most IR tools are designed to manage very large numbers of documents in a single index, whereas our hypothesis imposes one index per record, which usually implies few documents. As the number of modifications and creations of patient records in a day is significant, using a specialized and efficient indexing tool is required.

  18. Genome engineering and plant breeding: impact on trait discovery and development.

    PubMed

    Nogué, Fabien; Mara, Kostlend; Collonnier, Cécile; Casacuberta, Josep M

    2016-07-01

    New tools for the precise modification of crop genes are now available for the engineering of new ideotypes. A future challenge in this emerging field of genome engineering is to develop efficient methods for allele mining. Genome engineering tools are now available in plants, including major crops, to modify a given gene in a predictable manner. These new techniques have a tremendous potential for a spectacular acceleration of the plant breeding process. Here, we discuss how genetic diversity has always been the raw material for breeders and how they have always taken advantage of the best available science to use, and when possible increase, this genetic diversity. We present why the advent of these new techniques gives breeders extremely powerful tools for crop breeding, but also why this will require breeders and researchers to characterize the genes underlying this genetic diversity more precisely. Tackling these challenges should permit the engineering of optimized allele assortments in an unprecedented and controlled way.

  19. CEOS visualization environment (COVE) tool for intercalibration of satellite instruments

    USGS Publications Warehouse

    Kessler, P.D.; Killough, B.D.; Gowda, S.; Williams, B.R.; Chander, G.; Qu, Min

    2013-01-01

    Increasingly, data from multiple instruments are used to gain a more complete understanding of land surface processes at a variety of scales. Intercalibration, comparison, and coordination of satellite instrument coverage areas is a critical effort of international and domestic space agencies and organizations. The Committee on Earth Observation Satellites Visualization Environment (COVE) is a suite of browser-based applications that leverage Google Earth to display past, present, and future satellite instrument coverage areas and coincident calibration opportunities. This forecasting and ground coverage analysis and visualization capability greatly benefits the remote sensing calibration community in preparation for multisatellite ground calibration campaigns or individual satellite calibration studies. COVE has been developed for use by a broad international community to improve the efficiency and efficacy of such calibration planning efforts, whether those efforts require past, present, or future predictions. This paper provides a brief overview of the COVE tool, its validation, accuracies, and limitations with emphasis on the applicability of this visualization tool for supporting ground field campaigns and intercalibration of satellite instruments.

  20. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  1. GEsture: an online hand-drawing tool for gene expression pattern search.

    PubMed

    Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning

    2018-01-01

    Gene expression profiling data provide useful information for the investigation of biological functions and processes. However, identifying a specific expression pattern from extensive time series gene expression data is not an easy task. Clustering, a popular method, is often used to group genes with similar expression; however, genes with a 'desirable' or 'user-defined' pattern cannot be efficiently detected by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw a curve using a mouse instead of inputting abstract parameters of clustering methods. GEsture takes a gene expression curve as input and explores genes showing similar, opposite and time-delayed expression patterns in time series datasets. We present three examples that illustrate the capacity of GEsture for gene hunting while following users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps and correlation networks) to display the search results. The outputs may provide useful information for researchers to understand the targets, functions and biological processes of the genes involved.
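
    The abstract does not state which similarity measure GEsture uses; the sketch below illustrates one plausible approach, scoring similar, opposite and time-delayed matches between a drawn query curve and gene expression time series with plain and lagged Pearson correlation. The gene names and data are invented for illustration.

      import numpy as np

      def pearson(a, b):
          return float(np.corrcoef(a, b)[0, 1])

      def best_lagged_correlation(query, series, max_lag=3):
          """Strongest correlation when `series` is shifted by 1..max_lag steps."""
          best = (0, 0.0)
          for lag in range(1, max_lag + 1):
              r = pearson(query[:-lag], series[lag:])
              if abs(r) > abs(best[1]):
                  best = (lag, r)
          return best

      if __name__ == "__main__":
          t = np.arange(8, dtype=float)
          query = np.sin(t / 2.0)                      # the "hand-drawn" curve
          rng = np.random.default_rng(1)
          genes = {
              "gene_similar":  np.sin(t / 2.0) + 0.05 * rng.normal(size=8),
              "gene_opposite": -np.sin(t / 2.0),
              "gene_delayed":  np.sin((t - 2.0) / 2.0),  # lags the query by 2 steps
          }
          for name, series in genes.items():
              r = pearson(query, series)
              lag, r_lag = best_lagged_correlation(query, series)
              print(f"{name}: r = {r:+.2f}, best lagged r = {r_lag:+.2f} at lag {lag}")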

  2. Medication Reconciliation: Work Domain Ontology, prototype development, and a predictive model.

    PubMed

    Markowitz, Eliz; Bernstam, Elmer V; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System's and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load.
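
    For context, a Keystroke-Level Model estimate is simply the sum of standard operator times over the sequence of physical and mental operations a task requires. The sketch below uses the commonly cited Card, Moran and Newell operator times, and the example operator sequence is invented rather than taken from the study's MR tasks.

      # Commonly cited Keystroke-Level Model operator times (seconds); these
      # are the classic Card, Moran & Newell estimates, not values measured
      # in the study.
      KLM_TIMES = {
          "K": 0.28,   # keystroke or key press (average skilled typist)
          "P": 1.10,   # point with mouse to a target
          "B": 0.10,   # press or release a mouse button
          "H": 0.40,   # home hands between keyboard and mouse
          "M": 1.35,   # mental preparation
      }

      def klm_task_time(operators):
          """Total predicted task time for a sequence of KLM operators."""
          return sum(KLM_TIMES[op] for op in operators)

      if __name__ == "__main__":
          # Hypothetical "accept one reconciled medication" interaction:
          # think, point to the row, click, think, point to 'accept', click.
          sequence = ["M", "P", "B", "B", "M", "P", "B", "B"]
          print(f"predicted time: {klm_task_time(sequence):.2f} s")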

  3. Medication Reconciliation: Work Domain Ontology, Prototype Development, and a Predictive Model

    PubMed Central

    Markowitz, Eliz; Bernstam, Elmer V.; Herskovic, Jorge; Zhang, Jiajie; Shneiderman, Ben; Plaisant, Catherine; Johnson, Todd R.

    2011-01-01

    Medication errors can result from administration inaccuracies at any point of care and are a major cause for concern. To develop a successful Medication Reconciliation (MR) tool, we believe it necessary to build a Work Domain Ontology (WDO) for the MR process. A WDO defines the explicit, abstract, implementation-independent description of the task by separating the task from work context, application technology, and cognitive architecture. We developed a prototype based upon the WDO and designed to adhere to standard principles of interface design. The prototype was compared to Legacy Health System’s and Pre-Admission Medication List Builder MR tools via a Keystroke-Level Model analysis for three MR tasks. The analysis found the prototype requires the fewest mental operations, completes tasks in the fewest steps, and completes tasks in the least amount of time. Accordingly, we believe that developing a MR tool, based upon the WDO and user interface guidelines, improves user efficiency and reduces cognitive load. PMID:22195146

  4. siGnum: graphical user interface for EMG signal analysis.

    PubMed

    Kaur, Manvinder; Mathur, Shilpi; Bhatia, Dinesh; Verma, Suresh

    2015-01-01

    Electromyography (EMG) signals, which represent the electrical activity of muscles, can be used for various clinical and biomedical applications. These are complicated and highly varying signals that depend on the anatomical location and physiological properties of the muscles. EMG signals acquired from the muscles require advanced methods for detection, decomposition and processing. This paper proposes siGnum, a novel graphical user interface (GUI) developed in MATLAB, that applies efficient and effective techniques to process raw EMG signals and decompose them in a simpler manner. It can be used independently of the MATLAB software by employing a deploy tool. This would enable researchers to gain a good understanding of EMG signals and their analysis procedures, which can be utilized for more powerful, flexible and efficient applications in the near future.
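
    The abstract does not detail siGnum's processing chain; a common first step in EMG analysis, shown below purely as an illustration, is to remove the DC offset, full-wave rectify the signal, and extract a linear envelope with a moving-average filter. The sampling rate, window length and synthetic signal are assumptions for the example.

      import numpy as np

      def linear_envelope(emg, fs, window_ms=100):
          """DC removal + full-wave rectification + moving-average smoothing."""
          centered = emg - np.mean(emg)                 # remove DC offset
          rectified = np.abs(centered)                  # full-wave rectification
          n = max(1, int(fs * window_ms / 1000.0))      # samples per window
          kernel = np.ones(n) / n
          return np.convolve(rectified, kernel, mode="same")

      if __name__ == "__main__":
          fs = 1000.0                                   # sampling rate (Hz)
          t = np.arange(0, 2.0, 1.0 / fs)
          # Synthetic EMG: noise whose amplitude rises during a "contraction".
          rng = np.random.default_rng(0)
          amplitude = np.where((t > 0.5) & (t < 1.5), 1.0, 0.2)
          emg = amplitude * rng.normal(size=t.size)
          envelope = linear_envelope(emg, fs)
          print(f"rest envelope ~{envelope[:400].mean():.3f}, "
                f"contraction envelope ~{envelope[600:1400].mean():.3f}")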

  5. The ChIP-Seq tools and web server: a resource for analyzing ChIP-seq and other types of genomic data.

    PubMed

    Ambrosini, Giovanna; Dreos, René; Kumar, Sunil; Bucher, Philipp

    2016-11-18

    ChIP-seq and related high-throughput chromatin profiling assays generate ever increasing volumes of highly valuable biological data. To make sense of it, biologists need versatile, efficient and user-friendly tools for access, visualization and integrative analysis of such data. Here we present the ChIP-Seq command line tools and web server, implementing basic algorithms for ChIP-seq data analysis starting with a read alignment file. The tools are optimized for memory efficiency and speed, thus allowing the processing of large data volumes on inexpensive hardware. The web interface provides access to a large database of public data. The ChIP-Seq tools have a modular and interoperable design in that the output from one application can serve as input to another one. Complex and innovative tasks can thus be achieved by running several tools in a cascade. The various ChIP-Seq command line tools and web services either complement or compare favorably to related bioinformatics resources in terms of computational efficiency, ease of access to public data and interoperability with other web-based tools. The ChIP-Seq server is accessible at http://ccg.vital-it.ch/chipseq/.

  6. Economics of Agroforestry

    Treesearch

    D. Evan Mercer; Frederick W. Cubbage; Gregory E. Frey

    2014-01-01

    This chapter provides principles, literature and a case study about the economics of agroforestry. We examine necessary conditions for achieving efficiency in agroforestry system design and economic analysis tools for assessing efficiency and adoptability of agroforestry. The tools presented here (capital budgeting, linear programming, production frontier analysis...

  7. Advanced order management in ERM systems: the tic-tac-toe algorithm

    NASA Astrophysics Data System (ADS)

    Badell, Mariana; Fernandez, Elena; Puigjaner, Luis

    2000-10-01

    The concept behind improved enterprise resource planning (ERP) systems is the overall integration of the whole enterprise functionality into the management systems through financial links. Converting current software into real management decision tools requires crucial changes in the current approach to ERP systems. This evolution must be able to incorporate technological achievements both properly and in time. The exploitation phase of plants needs an open web-based environment for collaborative business engineering with on-line schedulers. Today's short lifecycles of products and processes require sharp and finely tuned management actions that must be guided by scheduling tools. Additionally, such actions must be able to keep track of money movements related to supply chain events. Thus, the necessary outputs require financial-production integration at the scheduling level, as proposed in the new approach of enterprise management systems (ERM). Within this framework, the economic analysis of the due date policy and its optimization become essential to dynamically manage realistic and optimal delivery dates with a price-time trade-off during marketing activities. In this work we propose a scheduling tool with a web-based interface conducted by autonomous agents, which operates when precise economic information relative to plant and business actions and their effects is provided. It aims to attain a better arrangement of the marketing and production events in order to face the bid/bargain process during e-commerce. Additionally, management systems require real-time execution and an efficient transaction-oriented approach capable of dynamically adopting realistic and optimal actions to support marketing management. To this end, the TicTacToe algorithm provides sequence optimization with acceptable tolerances in realistic time.

  8. Applications of Satellite Data to Support Improvements in Irrigation and Groundwater Management in California

    NASA Technical Reports Server (NTRS)

    Melton, Forrest S.

    2017-01-01

    In agricultural regions around the world, threats to water supplies from drought and groundwater depletion are driving increased demand for tools to advance agricultural water use efficiency and support sustainable groundwater management. Satellite mapping of evapotranspiration (ET) from irrigated agricultural lands can provide agricultural producers and water resource managers with information that can be used to both optimize ag water use and improve estimates of groundwater withdrawals for irrigation. We describe the development of two remote sensing-based tools for ET mapping in California, including important lessons in terms of system design, partnership development, and transition to operations. For irrigation management, the integration of satellite data and surface sensor networks to provide timely delivery of information on crop water requirements can make irrigation scheduling more practical, convenient, and accurate. Developed through a partnership between NASA and the CA Department of Water Resources, the Satellite Irrigation Management Support (SIMS) framework integrates satellite data with information from agricultural weather networks to map crop canopy development and crop water requirements at the scale of individual fields. Information is distributed to agricultural producers and water managers via a web-based interface and web data services. SIMS also provides an API that facilitates integration with other irrigation decision support tools, such as CropManage and IrriQuest. Field trials using these integrated tools have shown that they can be used to sustain yields while improving water use efficiency and nutrient management. For sustainable groundwater management, the combination of satellite-derived estimates of ET and data on surface water deliveries for irrigation can increase the accuracy of estimates of groundwater pumping. We are developing an OpenET platform to facilitate access to ET data from multiple models and accelerate operational use of ET data in support of a range of water management applications, including implementation of the Sustainable Groundwater Management Act in CA. By providing a shared basis for decision making, we anticipate that the OpenET platform will accelerate implementation of solutions for sustainable groundwater management.

  9. Applications of Satellite Data to Support Improvements in Irrigation and Groundwater Management in California

    NASA Astrophysics Data System (ADS)

    Melton, F. S.; Huntington, J. L.; Johnson, L.; Guzman, A.; Morton, C.; Zaragoza, I.; Dexter, J.; Rosevelt, C.; Michaelis, A.; Nemani, R. R.; Cahn, M.; Temesgen, B.; Trezza, R.; Frame, K.; Eching, S.; Grimm, R.; Hall, M.

    2017-12-01

    In agricultural regions around the world, threats to water supplies from drought and groundwater depletion are driving increased demand for tools to advance agricultural water use efficiency and support sustainable groundwater management. Satellite mapping of evapotranspiration (ET) from irrigated agricultural lands can provide agricultural producers and water resource managers with information that can be used to both optimize ag water use and improve estimates of groundwater withdrawals for irrigation. We describe the development of two remote sensing-based tools for ET mapping in California, including important lessons in terms of system design, partnership development, and transition to operations. For irrigation management, the integration of satellite data and surface sensor networks to provide timely delivery of information on crop water requirements can make irrigation scheduling more practical, convenient, and accurate. Developed through a partnership between NASA and the CA Department of Water Resources, the Satellite Irrigation Management Support (SIMS) framework integrates satellite data with information from agricultural weather networks to map crop canopy development and crop water requirements at the scale of individual fields. Information is distributed to agricultural producers and water managers via a web-based interface and web data services. SIMS also provides an API that facilitates integration with other irrigation decision support tools, such as CropManage and IrriQuest. Field trials using these integrated tools have shown that they can be used to sustain yields while improving water use efficiency and nutrient management. For sustainable groundwater management, the combination of satellite-derived estimates of ET and data on surface water deliveries for irrigation can increase the accuracy of estimates of groundwater pumping. We are developing an OpenET platform to facilitate access to ET data from multiple models and accelerate operational use of ET data in support of a range of water management applications, including implementation of the Sustainable Groundwater Management Act in CA. By providing a shared basis for decision making, we anticipate that the OpenET platform will accelerate implementation of solutions for sustainable groundwater management.

  10. CRISPR-Cpf1: A New Tool for Plant Genome Editing.

    PubMed

    Zaidi, Syed Shan-E-Ali; Mahfouz, Magdy M; Mansoor, Shahid

    2017-07-01

    Clustered regularly interspaced palindromic repeats (CRISPR)-CRISPR-associated proteins (CRISPR-Cas), a groundbreaking genome-engineering tool, has facilitated targeted trait improvement in plants. Recently, CRISPR-CRISPR from Prevotella and Francisella 1 (Cpf1) has emerged as a new tool for efficient genome editing, including DNA-free editing in plants, with higher efficiency, specificity, and potentially wider applications than CRISPR-Cas9. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Transforming Functional Requirements from UML into BPEL to Efficiently Develop SOA-Based Systems

    NASA Astrophysics Data System (ADS)

    Vemulapalli, Anisha; Subramanian, Nary

    The intended behavior of any system such as services, tasks or functions can be captured by functional requirements of the system. As our dependence on online services has grown steadily, the web applications are being developed employing the SOA. BPEL4WS provides a means for expressing functional requirements of an SOA-based system by providing constructs to capture business goals and objectives for the system. In this paper we propose an approach for transforming user-centered requirements captured using UML into a corresponding BPEL specification, where the business processes are captured by means of use-cases from which UML sequence diagrams and activity diagrams are extracted. Subsequently these UML models are mapped to BPEL specifications that capture the essence of the initial business requirements to develop the SOA-based system by employing CASE tools. A student housing system is used as a case study to illustrate this approach and the system is validated using NetBeans.

  12. A Transparent Translation from Legacy System Model into Common Information Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Simpson, Jeffrey; Zhang, Yingchen

    Advances in the smart grid are forcing utilities towards better monitoring, control and analysis of distribution systems, and require extensive cyber-based intelligent systems and applications to realize various functionalities. The ability of systems, or components within systems, to interact and exchange services or information with each other is the key to the success of smart grid technologies, and it requires an efficient information exchange and data sharing infrastructure. The Common Information Model (CIM) is a standard that allows different applications to exchange information about an electrical system, and it has become a widely accepted solution for information exchange among different platforms and applications. However, most existing legacy systems are not developed using CIM, but using their own languages. Integrating such legacy systems is a challenge for utilities, and the appropriate utilization of the integrated legacy systems is even more intricate. Thus, this paper has developed an approach and an open-source tool to translate legacy system models into CIM format. The developed tool is tested on a commercial distribution management system and simulation results have proved its effectiveness.

  13. Calibration of 3D ALE finite element model from experiments on friction stir welding of lap joints

    NASA Astrophysics Data System (ADS)

    Fourment, Lionel; Gastebois, Sabrina; Dubourg, Laurent

    2016-10-01

    In order to support the design of such a complex process like Friction Stir Welding (FSW) for the aeronautic industry, numerical simulation software requires (1) developing an efficient and accurate Finite Element (F.E.) formulation that allows predicting welding defects, (2) properly modeling the thermo-mechanical complexity of the FSW process and (3) calibrating the F.E. model from accurate measurements from FSW experiments. This work uses a parallel ALE formulation developed in the Forge® F.E. code to model the different possible defects (flashes and worm holes), while pin and shoulder threads are modeled by a new friction law at the tool / material interface. FSW experiments require using a complex tool with scroll on shoulder, which is instrumented for providing sensitive thermal data close to the joint. Calibration of unknown material thermal coefficients, constitutive equations parameters and friction model from measured forces, torques and temperatures is carried out using two F.E. models, Eulerian and ALE, to reach a satisfactory agreement assessed by the proper sensitivity of the simulation to process parameters.

  14. A Collaborative Web-Based Approach to Planning Research, Integration, and Testing Using a Wiki

    NASA Technical Reports Server (NTRS)

    Delaney, Michael M.; Koshimoto, Edwin T.; Noble, Deleena; Duggan, Christopher

    2010-01-01

    The National Aeronautics and Space Administration Integrated Vehicle Health Management program touches on many different research areas while striving to enable the automated detection, diagnosis, prognosis, and mitigation of adverse events at the aircraft and system level. At the system level, the research focus is on the evaluation of multidisciplinary integrated methods, tools, and technologies for achieving the program goal. The participating program members form a diverse group of government, industry, and academic researchers. The program team developed the Research and Test Integration Plan in order to track significant test and evaluation activities, which are important for understanding, demonstrating, and communicating the overall project state and project direction. The Plan is a living document, which allows the project team the flexibility to construct conceptual test scenarios and to track project resources. The Plan also incorporates several desirable feature requirements for Plan users and maintainers. A wiki has proven to be the most efficient and effective means of implementing the feature requirements for the Plan. The wiki has proven very valuable as a research project management tool, and there are plans to expand its scope.

  15. Design and implementation of visualization methods for the CHANGES Spatial Decision Support System

    NASA Astrophysics Data System (ADS)

    Cristal, Irina; van Westen, Cees; Bakker, Wim; Greiving, Stefan

    2014-05-01

    The CHANGES Spatial Decision Support System (SDSS) is a web-based system aimed at risk assessment and the evaluation of optimal risk reduction alternatives at the local level, serving as a decision support tool in long-term natural risk management. The SDSS uses multidimensional information, integrating thematic, spatial, temporal and documentary data. The role of visualization in this context becomes of vital importance for efficiently representing each dimension. This multidimensional aspect of the risk information required by the system, combined with the diversity of the end-users, imposes the use of sophisticated visualization methods and tools. The key goal of the present work is to exploit the large amount of data efficiently in relation to the needs of the end-user, utilizing proper visualization techniques. Three main tasks have been accomplished for this purpose: categorization of the end-users, definition of the system's modules and definition of the data. The graphical representation of the data and the visualization tools were designed to be relevant to the data type and the purpose of the analysis. Depending on the end-user category, each user should have access to different modules of the system and thus to the proper visualization environment. The technologies used for the development of the visualization component combine the latest and most innovative open source JavaScript frameworks, such as OpenLayers 2.13.1, ExtJS 4 and GeoExt 2. Moreover, the model-view-controller (MVC) pattern is used in order to ensure flexibility of the system at the implementation level. Using the above technologies, the visualization techniques implemented so far offer interactive map navigation, querying and comparison tools. The map comparison tools are of great importance within the SDSS and include the following: a swiping tool for comparison of different data at the same location; raster subtraction for comparison of the same phenomenon varying in time; linked views for comparison of data from different locations; and a time slider tool for monitoring changes in spatio-temporal data. All these techniques are part of the interactive interface of the system and make use of spatial and spatio-temporal data. Further significant aspects of the visualization component include conventional cartographic techniques and visualization of non-spatial data. The main expectation from the present work is to offer efficient visualization of risk-related data in order to facilitate the decision making process, which is the final purpose of the CHANGES SDSS. This work is part of the "CHANGES" project, funded by the European Community's 7th Framework Programme.

  16. Tools & Resources | Efficient Windows Collaborative

    Science.gov Websites

    Resources include the Window Selection Tool Mobile App, for selecting new windows, and LBNL's RESFEN, which is used for calculating the heating and cooling energy use associated with window choices.

  17. Performance Analysis of the NAS Y-MP Workload

    NASA Technical Reports Server (NTRS)

    Bergeron, Robert J.; Kutler, Paul (Technical Monitor)

    1997-01-01

    This paper describes the performance characteristics of the computational workloads on the NAS Cray Y-MP machines, a Y-MP 832 and later a Y-MP 8128. Hardware measurements indicated that the Y-MP workload performance matured over time, ultimately sustaining an average throughput of 0.8 GFLOPS and a vector operation fraction of 87%. The measurements also revealed an operation rate exceeding 1 per clock period, a well-balanced architecture featuring strong utilization of vector functional units, and an efficient memory organization. Introduction of the larger-memory 8128 increased throughput by allowing more efficient utilization of the CPUs. Throughput also depended on the metering of the batch queues; low-idle Saturday workloads required a buffer of small jobs to prevent memory starvation of the CPU. UNICOS required about 7% of total CPU time to service the 832 workloads; this overhead decreased to 5% for the 8128 workloads. While most of the system time went to servicing I/O requests, efficient scheduling prevented excessive idle time due to I/O wait. System measurements disclosed no obvious bottlenecks in the response of the machine and UNICOS to the workloads. In most cases, Cray-provided software tools were quite sufficient for measuring the performance of both the machine and the operating system.

  18. Efficient temporal and interlayer parameter prediction for weighted prediction in scalable high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Tsang, Sik-Ho; Chan, Yui-Lam; Siu, Wan-Chi

    2017-01-01

    Weighted prediction (WP) is an efficient video coding tool, available since the establishment of the H.264/AVC video coding standard, for compensating temporal illumination change in motion estimation and compensation. WP parameters, including a multiplicative weight and an additive offset for each reference frame, must be estimated and transmitted to the decoder in the slice header. These parameters add extra bits to the coded video bitstream. High efficiency video coding (HEVC) provides WP parameter prediction to reduce this overhead, so WP parameter prediction is crucial to research and applications related to WP. Prior work has improved WP parameter prediction through implicit prediction of image characteristics and derivation of parameters. By exploiting both temporal and interlayer redundancies, we propose three WP parameter prediction algorithms, enhanced implicit WP parameter, enhanced direct WP parameter derivation, and interlayer WP parameter, to further improve the coding efficiency of HEVC. Results show that our proposed algorithms can achieve up to 5.83% and 5.23% bitrate reduction compared to the conventional scalable HEVC in the base layer for SNR scalability and 2× spatial scalability, respectively.
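
    To make the role of the WP parameters concrete, the sketch below applies a multiplicative weight and an additive offset to a reference block in a simplified integer form; the weight, offset, shift and clipping values are illustrative only and are not taken from the HEVC specification or from the proposed algorithms.

```python
import numpy as np

def weighted_prediction(ref_block, weight, offset, shift=6, bit_depth=8):
    """Simplified explicit weighted prediction:
    pred = ((ref * weight + rounding) >> shift) + offset, clipped to the
    sample range. Values are illustrative, not the exact HEVC formula."""
    rounding = 1 << (shift - 1)
    pred = ((ref_block.astype(np.int32) * weight + rounding) >> shift) + offset
    return np.clip(pred, 0, (1 << bit_depth) - 1).astype(np.uint8)

# A reference block that is uniformly darker than the current frame
# (e.g. during a fade): weight ~1.25x (80/64) plus a small positive offset.
ref = np.full((4, 4), 100, dtype=np.uint8)
print(weighted_prediction(ref, weight=80, offset=4))   # ~129 per sample
```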

  19. Fundamental Aeronautics Program: Overview of Project Work in Supersonic Cruise Efficiency

    NASA Technical Reports Server (NTRS)

    Castner, Raymond

    2011-01-01

    The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2011) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

  20. Fundamental Aeronautics Program: Overview of Propulsion Work in the Supersonic Cruise Efficiency Technical Challenge

    NASA Technical Reports Server (NTRS)

    Castner, Ray

    2012-01-01

    The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2012) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

  1. Application of linear regression analysis in accuracy assessment of rolling force calculations

    NASA Astrophysics Data System (ADS)

    Poliak, E. I.; Shim, M. K.; Kim, G. S.; Choo, W. Y.

    1998-10-01

    Efficient operation of the computational models employed in process control systems requires periodic assessment of the accuracy of their predictions. Linear regression is proposed as a tool that allows systematic and random prediction errors to be separated from those related to measurements. A quantitative characteristic of the model's predictive ability is introduced in addition to standard statistical tests for model adequacy. Rolling force calculations are considered as an example application. However, the outlined approach can be used to assess the performance of any computational model.
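
    As a hedged illustration of the general idea (not the authors' exact formulation), the sketch below regresses measured rolling forces on model predictions: a slope far from one or an intercept far from zero indicates a systematic prediction error, while the residual scatter reflects the random component. All numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: model-predicted rolling forces (kN) and measurements
# containing a systematic bias (gain 1.05, offset 30 kN) plus random noise.
predicted = rng.uniform(800.0, 1500.0, size=50)
measured = 1.05 * predicted + 30.0 + rng.normal(0.0, 20.0, size=50)

# Ordinary least squares fit: measured = a * predicted + b
a, b = np.polyfit(predicted, measured, deg=1)
residuals = measured - (a * predicted + b)

print(f"slope  a = {a:.3f}  (systematic gain error if far from 1)")
print(f"offset b = {b:.1f} kN (systematic bias if far from 0)")
print(f"residual std = {residuals.std(ddof=2):.1f} kN (random error)")
```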

  2. The Make 2D-DB II package: conversion of federated two-dimensional gel electrophoresis databases into a relational format and interconnection of distributed databases.

    PubMed

    Mostaguir, Khaled; Hoogland, Christine; Binz, Pierre-Alain; Appel, Ron D

    2003-08-01

    The Make 2D-DB tool has been previously developed to help build federated two-dimensional gel electrophoresis (2-DE) databases on one's own web site. The purpose of our work is to extend the strength of the first package and to build a more efficient environment. Such an environment should be able to fulfill the different needs and requirements arising from both the growing use of 2-DE techniques and the increasing amount of distributed experimental data.

  3. Rapid Geometry Creation for Computer-Aided Engineering Parametric Analyses: A Case Study Using ComGeom2 for Launch Abort System Design

    NASA Technical Reports Server (NTRS)

    Hawke, Veronica; Gage, Peter; Manning, Ted

    2007-01-01

    ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.

  4. Effective communication and teamwork promotes patient safety.

    PubMed

    Gluyas, Heather

    2015-08-05

    Teamwork requires co-operation, co-ordination and communication between members of a team to achieve desired outcomes. In industries with a high degree of risk, such as health care, effective teamwork has been shown to achieve team goals successfully and efficiently, with fewer errors. This article introduces behaviours that support communication, co-operation and co-ordination in teams. The central role of communication in enabling co-operation and co-ordination is explored. A human factors perspective is used to examine tools to improve communication and identify barriers to effective team communication in health care.

  5. Microneedles: quick and easy delivery methods of vaccines

    PubMed Central

    2017-01-01

    Vaccination is the most efficient method for infectious disease prevention. Parenteral injections such as intramuscular, intradermal, and subcutaneous injections have several advantages in vaccine delivery, but they also have many drawbacks. Thus, the development of a new vaccine delivery system has long been required. Recently, microneedles have been attracting attention as new vaccination tools. The microneedle is a highly effective transdermal vaccine delivery method due to its mechanism of action, painlessness, and ease of use. Here, we summarize the characteristics of microneedles and their possibilities as a new vaccine delivery route. PMID:28775980

  6. Quantum tomography of near-unitary processes in high-dimensional quantum systems

    NASA Astrophysics Data System (ADS)

    Lysne, Nathan; Sosa Martinez, Hector; Jessen, Poul; Baldwin, Charles; Kalev, Amir; Deutsch, Ivan

    2016-05-01

    Quantum Tomography (QT) is often considered the ideal tool for experimental debugging of quantum devices, capable of delivering complete information about quantum states (QST) or processes (QPT). In practice, the protocols used for QT are resource intensive and scale poorly with system size. In this situation, a well behaved model system with access to large state spaces (qudits) can serve as a useful platform for examining the tradeoffs between resource cost and accuracy inherent in QT. In past years we have developed one such experimental testbed, consisting of the electron-nuclear spins in the electronic ground state of individual Cs atoms. Our available toolkit includes high fidelity state preparation, complete unitary control, arbitrary orthogonal measurements, and accurate and efficient QST in Hilbert space dimensions up to d = 16. Using these tools, we have recently completed a comprehensive study of QPT in 4, 7 and 16 dimensions. Our results show that QPT of near-unitary processes is quite feasible if one chooses optimal input states and efficient QST on the outputs. We further show that for unitary processes in high dimensional spaces, one can use informationally incomplete QPT to achieve high-fidelity process reconstruction (90% in d = 16) with greatly reduced resource requirements.

  7. Taxi Time Prediction at Charlotte Airport Using Fast-Time Simulation and Machine Learning Techniques

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2016-01-01

    Accurate taxi time prediction is required for enabling efficient runway scheduling that can increase runway throughput and reduce taxi times and fuel consumption on the airport surface. Currently NASA and American Airlines are jointly developing a decision-support tool called Spot and Runway Departure Advisor (SARDA) that assists airport ramp controllers in making gate pushback decisions and improves the overall efficiency of airport surface traffic. In this presentation, we propose to use Linear Optimized Sequencing (LINOS), a discrete-event fast-time simulation tool, to predict taxi times and provide the estimates to the runway scheduler in real-time airport operations. To assess its prediction accuracy, we also introduce a data-driven analytical method using machine learning techniques. These two taxi time prediction methods are evaluated with actual taxi time data obtained from the SARDA human-in-the-loop (HITL) simulation for Charlotte Douglas International Airport (CLT) using various performance metrics. Based on the taxi time prediction results, we also discuss how the prediction accuracy is affected by the operational complexity at this airport and how the fast-time simulation model can be improved before implementing it with an airport scheduling algorithm in a real-time environment.
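
    The data-driven part of the study applies machine-learning regression to surface-traffic data. The sketch below is a minimal, hypothetical example of such a model; the features, the random-forest choice and the synthetic data are assumptions for illustration and do not reflect the presentation's actual feature set or algorithm.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 500

# Hypothetical per-flight features: taxi distance (m), number of aircraft
# already taxiing, and number of runway crossings on the route.
X = np.column_stack([
    rng.uniform(1000, 5000, n),      # taxi distance
    rng.integers(0, 20, n),          # surface traffic count
    rng.integers(0, 3, n),           # runway crossings
])
# Synthetic "actual" taxi times (s) with congestion effects and noise.
y = 0.12 * X[:, 0] + 25 * X[:, 1] + 60 * X[:, 2] + rng.normal(0, 30, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("MAE (s):", round(mean_absolute_error(y_test, model.predict(X_test)), 1))
```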

  8. GlycCompSoft: Software for Automated Comparison of Low Molecular Weight Heparins Using Top-Down LC/MS Data

    PubMed Central

    Li, Lingyun; Zhang, Fuming; Hu, Min; Ren, Fuji; Chi, Lianli; Linhardt, Robert J.

    2016-01-01

    Low molecular weight heparins are complex polycomponent drugs that have recently become amenable to top-down analysis using liquid chromatography-mass spectrometry. Even using open source deconvolution software, DeconTools, and automatic structural assignment software, GlycReSoft, the comparison of two or more low molecular weight heparins is extremely time-consuming, taking about a week for an expert analyst, and provides no guarantee of accuracy. Efficient data processing tools are required to improve analysis. This study uses Microsoft Excel™ Visual Basic for Applications to extend Excel's standard functionality with macro functions and specific mathematical modules for mass spectrometric data processing. The resulting program, GlycCompSoft, enables the comparison of top-down analytical glycomics data on two or more low molecular weight heparins with a low error rate and good time efficiency in the automatic processing of large data sets. The experimental results based on three lots of Lovenox®, Clexane® and three generic enoxaparin samples show that the run time of GlycCompSoft decreases from 11 to 2 seconds when the data processed decreases from 18000 to 1500 rows. PMID:27942011

  9. Minimalist Design of Allosterically Regulated Protein Catalysts.

    PubMed

    Makhlynets, O V; Korendovych, I V

    2016-01-01

    Nature facilitates chemical transformations with exceptional selectivity and efficiency. Despite tremendous progress in understanding and predicting protein function, the overall problem of designing a protein catalyst for a given chemical transformation is far from solved. Over the years, many design techniques with various degrees of complexity and rational input have been developed. A minimalist approach to protein design, which focuses on the bare minimum requirements to achieve activity, presents several important advantages. By focusing on basic physicochemical properties and the strategic placement of only a few highly active residues, one can feasibly evaluate a very large variety of possible catalysts in silico. In more general terms, the minimalist approach looks for the mere possibility of catalysis rather than trying to identify the most active catalyst possible. Even very basic designs that utilize a single residue introduced into nonenzymatic proteins or peptide bundles are surprisingly active. Because of the inherent simplicity of the minimalist approach, computational tools greatly enhance its efficiency. No complex calculations need to be set up, and even a beginner can master this technique in a very short time. Here, we present a step-by-step protocol for minimalist design of functional proteins using basic, easily available, and free computational tools. © 2016 Elsevier Inc. All rights reserved.

  10. Mathematical modeling of unicellular microalgae and cyanobacteria metabolism for biofuel production.

    PubMed

    Baroukh, Caroline; Muñoz-Tamayo, Rafael; Bernard, Olivier; Steyer, Jean-Philippe

    2015-06-01

    The conversion of microalgae lipids and cyanobacteria carbohydrates into biofuels appears to be a promising source of renewable energy. This requires a thorough understanding of their carbon metabolism, supported by mathematical models, in order to optimize biofuel production. However, unlike heterotrophic microorganisms that use the same substrate as their source of energy and carbon, photoautotrophic microorganisms require light for energy and CO2 as a carbon source. Furthermore, they are subjected to permanently fluctuating light environments due to outdoor cultivation or to mixing, which induces a flashing effect. Although modeling these nonstandard organisms is a major challenge for which classical tools are often inadequate, this step remains a prerequisite for the efficient optimization of outdoor biofuel production at an industrial scale. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Modelling excitonic-energy transfer in light-harvesting complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kramer, Tobias; Kreisbeck, Christoph

    The theoretical and experimental study of energy transfer in photosynthesis has revealed an interesting transport regime, which lies at the borderline between classical transport dynamics and quantum-mechanical interference effects. Dissipation is caused by the coupling of electronic degrees of freedom to vibrational modes and leads to a directional energy transfer from the antenna complex to the target reaction center. The dissipative driving is robust and does not rely on fine-tuning of specific vibrational modes. For the parameter regime encountered in biological systems, new theoretical tools are required to directly compare theoretical results with experimental spectroscopy data. The calculations require the use of massively parallel graphics processing units (GPUs) for efficient and exact computations.

  12. Classroom evaluation of the Arlyn Arm robotic workstation.

    PubMed

    Eberhardt, S P; Osborne, J; Rahman, T

    2000-01-01

    High school and junior high school students with neuromuscular weakness and other disorders of the arms evaluated a recently commercialized robotic workstation, the Arlyn Arm, to carry out art projects and science experiments. These tasks were designed for independent execution with the workstation using standard or custom-designed tools. Each task was divided into subtasks, and the execution time of each subtask was determined as a measure of efficiency. Special attention was given to the causes of required experimenter intervention. While subjects easily accomplished some subtasks, others required considerable intervention. Most of these interventions could be avoided by further customizing accessories. It is concluded that the Arlyn Arm workstation could be of considerable benefit in a classroom setting to persons with severe neuromuscular disorders.

  13. Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prado, T. L.; Galuzio, P. P.; Lopes, S. R.

    Efficient diagnostics of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, which is the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to evidence breast lesions as well as the best standard image processing methods available, but with better control over spurious fragments in the image.
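
    Recurrence-based quantifiers start from a recurrence matrix: two states are recurrent when their distance falls below a threshold. The sketch below builds such a matrix for image patches and computes the recurrence rate; it is a generic illustration of the concept, not the authors' spatial recurrence quantification method, and the image and threshold are synthetic.

```python
import numpy as np

def recurrence_matrix(vectors, eps):
    """R[i, j] = 1 if the Euclidean distance between state vectors i and j
    is below the threshold eps (a generic recurrence definition)."""
    d = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    return (d < eps).astype(int)

rng = np.random.default_rng(2)
image = rng.random((32, 32))   # stand-in for a mammographic region of interest

# Treat each non-overlapping 4x4 patch as a state vector.
patches = np.array([image[i:i + 4, j:j + 4].ravel()
                    for i in range(0, 32, 4)
                    for j in range(0, 32, 4)])

R = recurrence_matrix(patches, eps=0.8)
recurrence_rate = R.sum() / R.size   # fraction of recurrent patch pairs
print("Recurrence rate:", round(float(recurrence_rate), 3))
```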

  14. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work to be presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data and to make the decisions necessary to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies that would need to be designed, prototyped and evaluated.

  15. Picometer-resolution dual-comb spectroscopy with a free-running fiber laser.

    PubMed

    Zhao, Xin; Hu, Guoqing; Zhao, Bofeng; Li, Cui; Pan, Yingling; Liu, Ya; Yasui, Takeshi; Zheng, Zheng

    2016-09-19

    Dual-comb spectroscopy holds promise as a real-time, high-resolution spectroscopy tool. However, in its conventional schemes, the stringent requirement on the coherence between the two lasers calls for sophisticated control systems. By replacing control electronics with an all-optical dual-comb lasing scheme, a simplified dual-comb spectroscopy scheme is demonstrated using one dual-wavelength, passively mode-locked fiber laser. Pulses with an intracavity-dispersion-determined repetition-frequency difference are shown to have good mutual coherence and stability. The capability to resolve the comb teeth and a picometer-wide optical spectral resolution are demonstrated using a simple data acquisition system. Energy-efficient, free-running fiber lasers with a small comb-tooth spacing could enable low-cost dual-comb systems.

  16. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

    Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require a large memory space. This project applies an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order derivatives. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.
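
    ADIFOR performs source-to-source automatic differentiation of Fortran code. For readers unfamiliar with the idea, the self-contained sketch below shows the core mechanism of forward-mode automatic differentiation using dual numbers in Python; it is a conceptual illustration only and does not reflect ADIFOR's actual implementation.

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 = 0: carrying b alongside a
    propagates exact first derivatives through arithmetic (forward mode)."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

def sin(x):
    # Chain rule for the sine of a dual number.
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

# d/dx [x*sin(x) + x] at x = 1.2, seeded with derivative 1.
x = Dual(1.2, 1.0)
y = x * sin(x) + x
print(y.value, y.deriv)   # deriv = sin(1.2) + 1.2*cos(1.2) + 1
```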

  17. Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft

    NASA Technical Reports Server (NTRS)

    Stambolian, Damon B.; Schlierf, Roland; Miller, Darcy; Posada, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderon, Gena; Barth, Tim

    2011-01-01

    This slide presentation reviews the use of human factors and timeline analysis to achieve a more efficient and effective processing flow. The solution involved developing a written timeline of events that included each activity within each functional flow block. Each activity had computer animation videos and pictures of the people and hardware involved. The Human Factors Engineering Analysis Tool (HFEAT) was improved by modifying it to include the timeline of events. The HFEAT was used to define the human factors requirements, and design solutions were developed for these requirements. An example of a functional flow block diagram is shown, and a view from one of the animations (i.e., the short stack pallet) is shown and explained.

  18. Applying CBR to machine tool product configuration design oriented to customer requirements

    NASA Astrophysics Data System (ADS)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2017-01-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer to maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and result on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides a detailed instruction for the product configuration design oriented to customer requirements.

  19. Design Change Model for Effective Scheduling Change Propagation Paths

    NASA Astrophysics Data System (ADS)

    Zhang, Hai-Zhu; Ding, Guo-Fu; Li, Rong; Qin, Sheng-Feng; Yan, Kai-Yin

    2017-09-01

    Changes in requirements may increase product development project cost and lead time; therefore, it is important to understand how requirement changes propagate in the design of complex product systems and to be able to select the best options to guide design. Currently, most approaches to design change do not take the multi-disciplinary coupling relationships and the number of parameters into account in an integrated way. A new design change model is presented to systematically analyze and search change propagation paths. Firstly, a PDS-Behavior-Structure-based design change model is established to describe requirement changes causing design change propagation in the behavior and structure domains. Secondly, a multi-disciplinary oriented behavior matrix is utilized to support change propagation analysis of complex product systems, and the interaction relationships of the matrix elements are used to obtain an initial set of change paths. Finally, a rough set-based propagation space reducing tool is developed to assist in narrowing change propagation paths by computing the importance of the design change parameters. The proposed new design change model and its associated tools have been demonstrated on the scheduling of change propagation paths for a high speed train's bogie to show their feasibility and effectiveness. This model not only supports quick response to diversified market requirements, but also helps satisfy customer requirements and reduce product development lead time. The proposed new design change model can be applied in a wide range of engineering systems design with improved efficiency.

  20. Development of a problem solving evaluation instrument; untangling of specific problem solving assets

    NASA Astrophysics Data System (ADS)

    Adams, Wendy Kristine

    The purpose of my research was to produce a problem solving evaluation tool for physics. To do this it was necessary to gain a thorough understanding of how students solve problems. Although physics educators highly value problem solving and have put extensive effort into understanding successful problem solving, there is currently no efficient way to evaluate problem solving skill. Attempts have been made in the past; however, knowledge of the principles required to solve the subject problem is so critical that it completely overshadows any other skills students may use when solving a problem. The work presented here is unique because the evaluation tool removes the requirement that the student already have a grasp of physics concepts. It is also unique because I selected a wide range of people and a wide range of tasks for evaluation. This is an important design feature that helps make things emerge more clearly. This dissertation includes an extensive literature review of problem solving in physics, math, education and cognitive science, as well as descriptions of studies involving student use of interactive computer simulations, the design and validation of a beliefs-about-physics survey, and finally the design of the problem solving evaluation tool. I have successfully developed and validated a problem solving evaluation tool that identifies 44 separate assets (skills) necessary for solving problems. Rigorous validation studies, including work with an independent interviewer, show that the assets identified by this content-free evaluation tool are the same assets that students use to solve problems in mechanics and quantum mechanics. Understanding this set of component assets will help teachers and researchers address problem solving within the classroom.

  1. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    PubMed

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
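
    The key output of the runtime model is an ordering of genome-comparison jobs so that cluster nodes stay busy and little paid time is wasted. The sketch below is a simplified, hypothetical version of that scheduling idea, a longest-predicted-job-first greedy assignment to a fixed number of workers; the runtime estimates are invented and the heuristic is not necessarily the one used in the paper.

```python
import heapq
import random

random.seed(0)

# Hypothetical predicted runtimes (hours) for genome-to-genome comparisons;
# in the study such estimates come from a model of genome size and complexity.
predicted_runtimes = [random.uniform(0.1, 6.0) for _ in range(200)]

def lpt_schedule(jobs, n_workers):
    """Longest-predicted-job-first: always hand the next-largest job to the
    currently least-loaded worker (a classic greedy packing heuristic)."""
    loads = [0.0] * n_workers
    heapq.heapify(loads)
    for runtime in sorted(jobs, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + runtime)
    return loads

loads = lpt_schedule(predicted_runtimes, n_workers=20)
busy, paid = sum(loads), max(loads) * len(loads)
print(f"makespan: {max(loads):.1f} h, paid-but-idle fraction: {1 - busy / paid:.1%}")
```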

  2. Re-engineering the Federal planning process: A total Federal planning strategy, integrating NEPA with modern management tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eccleston, C.H.

    1997-09-05

    The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential in achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially this approach has widespread implications in the way federal planning is approached.

  3. Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.

    PubMed

    Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin

    2016-02-15

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software and databases for target-, suspect and non-target screening as well as unknown identification are discussed together with analytical and toxicological confirmation approaches. A better understanding of optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. A modified indirect mathematical model for evaluation of ethanol production efficiency in industrial-scale continuous fermentation processes.

    PubMed

    Canseco Grellet, M A; Castagnaro, A; Dantur, K I; De Boeck, G; Ahmed, P M; Cárdenas, G J; Welin, B; Ruiz, R M

    2016-10-01

    To calculate fermentation efficiency in a continuous ethanol production process, we aimed to develop a robust mathematical method based on the analysis of metabolic by-product formation. This method is in contrast to the traditional way of calculating ethanol fermentation efficiency, where the ratio between the ethanol produced and the sugar consumed is expressed as a percentage of the theoretical conversion yield. Comparison between the two methods, at industrial scale and in sensitivity studies, showed that the indirect method was more robust and gave slightly higher fermentation efficiency values, although fermentation efficiency of the industrial process was found to be low (~75%). The traditional calculation method is simpler than the indirect method as it only requires a few chemical determinations in samples collected. However, a minor error in any measured parameter will have an important impact on the calculated efficiency. In contrast, the indirect method of calculation requires a greater number of determinations but is much more robust since an error in any parameter will only have a minor effect on the fermentation efficiency value. The application of the indirect calculation methodology in order to evaluate the real situation of the process and to reach an optimum fermentation yield for an industrial-scale ethanol production is recommended. Once a high fermentation yield has been reached the traditional method should be used to maintain the control of the process. Upon detection of lower yields in an optimized process the indirect method should be employed as it permits a more accurate diagnosis of causes of yield losses in order to correct the problem rapidly. The low fermentation efficiency obtained in this study shows an urgent need for industrial process optimization where the indirect calculation methodology will be an important tool to determine process losses. © 2016 The Society for Applied Microbiology.
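
    For context, the traditional (direct) calculation mentioned above divides the ethanol produced by the theoretical ethanol obtainable from the sugar consumed, about 0.511 g of ethanol per gram of hexose. The sketch below implements only this direct calculation with invented plant figures; the indirect, by-product-based method of the paper involves additional balances that are not reproduced here.

```python
THEORETICAL_YIELD = 0.511  # g ethanol per g hexose (Gay-Lussac stoichiometry)

def direct_fermentation_efficiency(ethanol_produced_kg, sugar_consumed_kg):
    """Traditional (direct) efficiency: actual ethanol over the theoretical
    ethanol obtainable from the sugar consumed, as a percentage."""
    theoretical_ethanol = sugar_consumed_kg * THEORETICAL_YIELD
    return 100.0 * ethanol_produced_kg / theoretical_ethanol

# Invented daily figures for an industrial continuous fermenter.
print(round(direct_fermentation_efficiency(ethanol_produced_kg=38_500,
                                           sugar_consumed_kg=100_000), 1))
# -> about 75.3 %, i.e. the order of magnitude reported in the abstract
```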

  5. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  6. An efficient reliability algorithm for locating design point using the combination of importance sampling concepts and response surface method

    NASA Astrophysics Data System (ADS)

    Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin

    2017-06-01

    Monte Carlo simulation (MCS) is a useful tool for computing the probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm, employing the combination of importance sampling, as a class of MCS, and RSM, is proposed. In the proposed algorithm, the analysis starts with importance sampling concepts, using a presented two-step updating rule for the design point. This part finishes after a small number of samples are generated. Then RSM starts to work using Bucher's experimental design, with the last design point and a presented effective length as the center point and radius of Bucher's approach, respectively. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the presented rules are shown.
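
    To make the terms concrete, the sketch below estimates a probability of failure with crude Monte Carlo sampling for a simple two-variable limit-state function. It illustrates why plain MCS needs many samples, which motivates the proposed importance-sampling/RSM combination; it is not the proposed algorithm itself, and the limit-state function is invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def limit_state(x1, x2):
    """Illustrative limit state g(x) = resistance - load; g < 0 means failure."""
    return 3.0 + x1 - 0.2 * x2**2

# Standard-normal random variables, crude Monte Carlo simulation.
n = 1_000_000
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
failures = limit_state(x1, x2) < 0.0

pf = failures.mean()
cov = np.sqrt((1 - pf) / (pf * n))   # coefficient of variation of the estimate
print(f"Pf ≈ {pf:.2e}, estimator c.o.v. ≈ {cov:.2f}")
```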

  7. Advances in biotechnology and genomics of switchgrass

    PubMed Central

    2013-01-01

    Switchgrass (Panicum virgatum L.) is a C4 perennial warm season grass indigenous to the North American tallgrass prairie. A number of its natural and agronomic traits, including adaptation to a wide geographical distribution, low nutrient requirements and production costs, high water use efficiency, high biomass potential, ease of harvesting, and potential for carbon storage, make it an attractive dedicated biomass crop for biofuel production. We believe that genetic improvements using biotechnology will be important to realize the potential of the biomass and biofuel-related uses of switchgrass. Tissue culture techniques aimed at rapid propagation of switchgrass and genetic transformation protocols have been developed. Rapid progress in genome sequencing and bioinformatics has provided efficient strategies to identify, tag, clone and manipulate many economically-important genes, including those related to higher biomass, saccharification efficiency, and lignin biosynthesis. Application of the best genetic tools should render improved switchgrass that will be more economically and environmentally sustainable as a lignocellulosic bioenergy feedstock. PMID:23663491

  8. PMO Delivery System Using Bubble Liposomes and Ultrasound Exposure for Duchenne Muscular Dystrophy Treatment.

    PubMed

    Negishi, Yoichi; Ishii, Yuko; Nirasawa, Kei; Sasaki, Eri; Endo-Takahashi, Yoko; Suzuki, Ryo; Maruyama, Kazuo

    2018-01-01

    Duchenne muscular dystrophy (DMD) is a genetic disorder characterized by progressive muscle degeneration, caused by nonsense or frameshift mutations in the dystrophin (DMD) gene. Antisense oligonucleotides can be used to induce specific exon skipping; recently, a phosphorodiamidate morpholino oligomer (PMO) has been approved for clinical use in DMD. However, an efficient PMO delivery strategy is required to improve the therapeutic efficacy in DMD patients. We previously developed polyethylene glycol (PEG)-modified liposomes containing ultrasound contrast gas, "Bubble liposomes" (BLs), and found that the combination of BLs with ultrasound exposure is a useful gene delivery tool. Here, we describe an efficient PMO delivery strategy using the combination of BLs and ultrasound exposure to treat muscles in a DMD mouse model (mdx). This ultrasound-mediated BL technique can increase the PMO-mediated exon-skipping efficiency, leading to significantly increased dystrophin expression. Thus, the combination of BLs and ultrasound exposure may be a feasible PMO delivery method to improve therapeutic efficacy and reduce the PMO dosage for DMD treatment.

  9. Using RFID Positioning Technology to Construct an Automatic Rehabilitation Scheduling Mechanism.

    PubMed

    Wang, Ching-Sheng; Hung, Lun-Ping; Yen, Neil Y

    2016-01-01

    Accurately and efficiently identifying the location of patients during the course of rehabilitation is an important issue, and wireless transmission technology can achieve this goal. Tracking technologies such as RFID (radio frequency identification) can support process improvement and improve the efficiency of rehabilitation. Few published models or methods address the positioning problem and apply this technology in the rehabilitation center. We propose a mechanism that enhances the accuracy of positioning technology, provides information about turns and obstacles on the path, and offers user-centered, location-aware services to enhance quality of care in the rehabilitation environment. This paper outlines the requirements and the role of RFID in assisting the rehabilitation environment. A prototype RFID hospital support tool is established. It is designed to provide assistance for monitoring rehabilitation patients. It can simultaneously calculate the rehabilitant's location and the duration of treatment, and automatically record the rehabilitation course of the rehabilitant, so as to improve the management efficiency of the rehabilitation program.

  10. Multi-objective Optimization of a Solar Humidification Dehumidification Desalination Unit

    NASA Astrophysics Data System (ADS)

    Rafigh, M.; Mirzaeian, M.; Najafi, B.; Rinaldi, F.; Marchesi, R.

    2017-11-01

    In the present paper, a humidification-dehumidification desalination unit integrated with a solar system is considered. In the first step, a mathematical model of the whole plant is presented. Next, taking into account the logical constraints, the performance of the system is optimized. On one hand, it is desirable to have higher energetic efficiency; on the other hand, higher efficiency results in an increase in the required area of each subsystem, which consequently leads to an increase in the total cost of the plant. In the present work, the optimum solution is achieved when the specific energy of the solar heater and the areas of the humidifier and dehumidifier are minimized. Because the considered objective functions are in conflict, conventional optimization methods are not applicable. Hence, multi-objective optimization using a genetic algorithm, an efficient tool for dealing with problems with conflicting objectives, has been utilized, and a set of optimal solutions called the Pareto front, each of which is a trade-off between the mentioned objectives, is generated.
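
    The optimization returns a Pareto front: solutions for which no other candidate is at least as good in both objectives and better in one. As a small generic illustration (not the paper's genetic algorithm or its actual objective functions), the sketch below filters a set of invented candidate designs down to the non-dominated ones when both the specific solar-heater energy and the total exchanger area are to be minimized.

```python
import random

random.seed(4)

# Hypothetical candidate designs: (specific energy [kWh/m^3], total area [m^2]).
candidates = [(random.uniform(50, 120), random.uniform(200, 900))
              for _ in range(200)]

def pareto_front(points):
    """Keep points not dominated by any other point (both objectives minimized)."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return sorted(front)

front = pareto_front(candidates)
print(f"{len(front)} non-dominated designs out of {len(candidates)}")
```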

  11. STOCHSIMGPU: parallel stochastic simulation for the Systems Biology Toolbox 2 for MATLAB.

    PubMed

    Klingbeil, Guido; Erban, Radek; Giles, Mike; Maini, Philip K

    2011-04-15

    The importance of stochasticity in biological systems is becoming increasingly recognized and the computational cost of biologically realistic stochastic simulations urgently requires development of efficient software. We present a new software tool STOCHSIMGPU that exploits graphics processing units (GPUs) for parallel stochastic simulations of biological/chemical reaction systems and show that significant gains in efficiency can be made. It is integrated into MATLAB and works with the Systems Biology Toolbox 2 (SBTOOLBOX2) for MATLAB. The GPU-based parallel implementation of the Gillespie stochastic simulation algorithm (SSA), the logarithmic direct method (LDM) and the next reaction method (NRM) is approximately 85 times faster than the sequential implementation of the NRM on a central processing unit (CPU). Using our software does not require any changes to the user's models, since it acts as a direct replacement of the stochastic simulation software of the SBTOOLBOX2. The software is open source under the GPL v3 and available at http://www.maths.ox.ac.uk/cmb/STOCHSIMGPU. The web site also contains supplementary information. klingbeil@maths.ox.ac.uk Supplementary data are available at Bioinformatics online.
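
    For readers unfamiliar with the algorithm being parallelized, the sketch below is a minimal pure-Python direct-method Gillespie SSA for a single reversible reaction. It shows the kind of serial realization that STOCHSIMGPU runs many of in parallel on the GPU; it is a generic illustration, not code from the tool.

```python
import math
import random

random.seed(5)

# Reversible isomerization A <-> B with rate constants k1 (A->B), k2 (B->A).
k1, k2 = 1.0, 0.5
state = {"A": 100, "B": 0}
t, t_end = 0.0, 10.0

while t < t_end:
    # Propensities of the two reactions in the current state.
    a1 = k1 * state["A"]
    a2 = k2 * state["B"]
    a0 = a1 + a2
    if a0 == 0.0:
        break
    # Direct method: exponential waiting time, then pick a reaction
    # with probability proportional to its propensity.
    t += -math.log(random.random()) / a0
    if random.random() * a0 < a1:
        state["A"] -= 1
        state["B"] += 1
    else:
        state["A"] += 1
        state["B"] -= 1

print(t, state)   # at stationarity A:B approaches k2:k1, i.e. roughly 33:67
```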

  12. A review of AirQ Models and their applications for forecasting the air pollution health outcomes.

    PubMed

    Oliveri Conti, Gea; Heibati, Behzad; Kloog, Itai; Fiore, Maria; Ferrante, Margherita

    2017-03-01

    Even though clean air is considered a basic requirement for the maintenance of human health, air pollution continues to pose a significant health threat in developed and developing countries alike. Monitoring and modeling of classic and emerging pollutants is vital to our knowledge of health outcomes in exposed subjects and to our ability to predict them. The ability to anticipate and manage changes in atmospheric pollutant concentrations relies on an accurate representation of the chemical state of the atmosphere. The task of providing the best possible analysis of air pollution thus requires efficient computational tools enabling efficient integration of observational data into models. A number of air quality models have been developed and play an important role in air quality management. Even though a large number of air quality models have been discussed or applied, their heterogeneity makes it difficult to select one approach above the others. This paper provides a brief review of air quality models with respect to several aspects, such as the prediction of health effects.

  13. [Services portfolio of a department of endocrinology and clinical nutrition].

    PubMed

    Vicente Delgado, Almudena; Gómez Enterría, Pilar; Tinahones Madueño, Francisco

    2011-03-01

    Endocrinology and Clinical Nutrition are branches of Medicine that deal with the study of physiology of body glands and hormones and their disorders, intermediate metabolism of nutrients, enteral and parenteral nutrition, promotion of health by prevention of diet-related diseases, and appropriate use of the diagnostic, therapeutic, and preventive tools related to these disciplines. Development of Endocrinology and Clinical Nutrition support services requires accurate definition and management of a number of complex resources, both human and material, as well as adequate planning of the care provided. It is therefore essential to know the services portfolio of an ideal Department of Endocrinology and Clinical Nutrition because this is a useful, valid and necessary tool to optimize the available resources, to increase efficiency, and to improve the quality of care. Copyright © 2010 SEEN. Published by Elsevier Espana. All rights reserved.

  14. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  15. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  16. Residence as a Diagnostic and Therapeutic Area - A Smart Home Approach.

    PubMed

    Mielke, Corinna; Voss, Thorsten; Haux, Reinhold

    2017-01-01

    The "research apartment Halberstadtstraße" (HSS) in Braunschweig, Germany, is the attempt to realize a personal living environment as a room for diagnostics and therapy with the support of health-enabling and ambient assistive technologies (HEAAT). As a research tool, the HSS will enable the efficient implementation of new HEAAT and help in evaluating these under controlled real-life conditions. This new research tool will therefore be the missing link between artificial laboratory and complete real-life conditions. For a defined period, selected subjects can live in the HSS and experience the benefit of such a "Smart Home". The academic support in a real-life controlled living-environment enables continuous monitoring of behavior patterns and habits of healthy and ill persons, evaluation of new HEAAT, and conduction of requirements analysis and acceptance studies.

  17. AeroADL: applying the integration of the Suomi-NPP science algorithms with the Algorithm Development Library to the calibration and validation task

    NASA Astrophysics Data System (ADS)

    Houchin, J. S.

    2014-09-01

    A common problem for the off-line validation of calibration algorithms and algorithm coefficients is being able to run science data through the exact same software used for on-line calibration of that data. The Joint Polar Satellite System (JPSS) program solved part of this problem by making the Algorithm Development Library (ADL) available, which allows the operational algorithm code to be compiled and run on a desktop Linux workstation using flat file input and output. However, this solved only part of the problem, as the toolkit and methods to initiate the processing of data through the algorithms were geared specifically toward the algorithm developer, not the calibration analyst. In algorithm development mode, a limited number of sets of test data are staged for the algorithm once and then run through the algorithm over and over as the software is developed and debugged. In calibration analyst mode, we continually run new data sets through the algorithm, which requires significant effort to stage each of those data sets without additional tools. AeroADL solves this second problem by providing a set of scripts that wrap the ADL tools, providing efficient means to stage and process an input data set, to override static calibration coefficient look-up tables (LUTs) with experimental versions of those tables, and to manage a library containing multiple versions of each of the static LUT files in such a way that the correct set of LUTs required for each algorithm is automatically provided to the algorithm without analyst effort. Using AeroADL, The Aerospace Corporation's analyst team has demonstrated the ability to quickly and efficiently perform analysis tasks for both the VIIRS and OMPS sensors with minimal training on the software tools.

  18. Automatic detection and analysis of cell motility in phase-contrast time-lapse images using a combination of maximally stable extremal regions and Kalman filter approaches.

    PubMed

    Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L

    2014-01-01

    Phase-contrast illumination is simple and is the most commonly used microscopic method for observing nonstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single cell motility in large cell populations. However, the challenge is to find a sophisticated method that is sufficiently accurate to generate reliable results, robust enough to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and also computationally light enough for efficient analysis of large numbers of cells and image frames. To develop better automatic tools for the analysis of low-magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method that is based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. When compared to commonly used segmentation approaches, MSER required negligible preoptimization steps, thus dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multiobject tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis, resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as of the characteristics of collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and nonoptimal imaging conditions, and due to their relatively light computational requirements they should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
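
    OpenCV ships a generic MSER detector, which makes it easy to experiment with the kind of region extraction described above. The sketch below is a generic usage example on a synthetic image, not the authors' pipeline; the detector parameters (delta, minimum and maximum region area) are arbitrary illustration values.

```python
import cv2
import numpy as np

# Synthetic phase-contrast-like frame: dim background with a few bright blobs.
img = np.full((200, 200), 40, dtype=np.uint8)
for cx, cy in [(50, 60), (120, 80), (150, 160)]:
    cv2.circle(img, (cx, cy), 12, 180, thickness=-1)
img = cv2.GaussianBlur(img, (7, 7), 0)

# MSER detector; positional arguments are delta, min_area, max_area.
mser = cv2.MSER_create(5, 30, 2000)
regions, bboxes = mser.detectRegions(img)

print(f"{len(regions)} stable regions found")
for x, y, w, h in bboxes:
    print("bbox:", x, y, w, h)   # region centroids would feed a Kalman tracker
```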

  19. Extending the BEAGLE library to a multi-FPGA platform.

    PubMed

    Jin, Zheming; Bakos, Jason D

    2013-01-19

    Maximum Likelihood (ML)-based phylogenetic inference using Felsenstein's pruning algorithm is a standard method for estimating the evolutionary relationships amongst a set of species based on DNA sequence data, and is used in popular applications such as RAxML, PHYLIP, GARLI, BEAST, and MrBayes. The Phylogenetic Likelihood Function (PLF) and its associated scaling and normalization steps comprise the computational kernel for these tools. These computations are data-intensive but contain fine-grain parallelism that can be exploited by coprocessor architectures such as FPGAs and GPUs. A general-purpose API called BEAGLE has recently been developed that includes optimized implementations of Felsenstein's pruning algorithm for various data-parallel architectures. In this paper, we extend the BEAGLE API to a multiple Field Programmable Gate Array (FPGA)-based platform called the Convey HC-1. The core calculation of our implementation, which includes both the phylogenetic likelihood function (PLF) and the tree likelihood calculation, has an arithmetic intensity of 130 floating-point operations per 64 bytes of I/O, or 2.03 ops/byte. Its performance can thus be calculated as a function of the host platform's peak memory bandwidth and the implementation's memory efficiency, as 2.03 × peak bandwidth × memory efficiency. Our FPGA-based platform has a peak bandwidth of 76.8 GB/s and our implementation achieves a memory efficiency of approximately 50%, which gives an average throughput of 78 Gflops. This represents a ~40X speedup when compared with BEAGLE's CPU implementation on a dual Xeon 5520 and a 3X speedup versus BEAGLE's GPU implementation on a Tesla T10 GPU for very large data sizes. The power consumption is 92 W, yielding a power efficiency of 1.7 Gflops per Watt. The use of data-parallel architectures to achieve high performance for likelihood-based phylogenetic inference requires high memory bandwidth and a design methodology that emphasizes high memory efficiency. To achieve this objective, we integrated 32 pipelined processing elements (PEs) across four FPGAs. For the design of each PE, we developed a specialized synthesis tool to generate a floating-point pipeline with resource and throughput constraints to match the target platform. We have found that using low-latency floating-point operators can significantly reduce FPGA area and still meet timing requirements on the target platform. We found that this design methodology can achieve performance that exceeds that of a GPU-based coprocessor.
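
    The throughput figure quoted above follows directly from the stated bandwidth-bound model; the short calculation below is a consistency check only, reproducing the arithmetic with the numbers given in the abstract.

        # Consistency check of the bandwidth-bound performance model quoted above:
        # throughput = arithmetic intensity x peak bandwidth x memory efficiency.
        flops_per_byte = 130 / 64.0      # ~2.03 ops per byte of I/O
        peak_bw_gb_per_s = 76.8          # Convey HC-1 peak memory bandwidth
        memory_efficiency = 0.50         # fraction of peak bandwidth actually achieved

        throughput_gflops = flops_per_byte * peak_bw_gb_per_s * memory_efficiency
        print(f"predicted throughput: {throughput_gflops:.0f} Gflops")  # ~78 Gflops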

  20. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond.

    PubMed

    Bible, Paul W; Kanno, Yuka; Wei, Lai; Brooks, Stephen R; O'Shea, John J; Morasso, Maria I; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards exploratory research based on co-localization analysis. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST's functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST's general-purpose features and then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research, with a case study in enhancer analysis. To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post-peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work.
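
    The core operation PAPST automates, comparing sets of genomic coordinate intervals, can be pictured with the toy Python sketch below; this is not PAPST's algorithm or code, and the peak coordinates are invented.

        # Illustrative sketch (not PAPST's implementation) of the operation it
        # automates: finding overlaps between two sets of genomic intervals.
        from collections import defaultdict

        def overlaps(a, b):
            """True if half-open intervals (chrom, start, end) overlap."""
            return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

        def co_localization(peaks_a, peaks_b):
            """Return the peaks in A that overlap at least one peak in B."""
            by_chrom = defaultdict(list)
            for p in peaks_b:
                by_chrom[p[0]].append(p)
            return [p for p in peaks_a
                    if any(overlaps(p, q) for q in by_chrom[p[0]])]

        # Toy example with made-up coordinates:
        tf_peaks = [("chr1", 100, 200), ("chr1", 500, 650), ("chr2", 30, 80)]
        h3k27ac  = [("chr1", 180, 400), ("chr2", 1000, 1200)]
        print(co_localization(tf_peaks, h3k27ac))   # [('chr1', 100, 200)]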

  1. PAPST, a User Friendly and Powerful Java Platform for ChIP-Seq Peak Co-Localization Analysis and Beyond

    PubMed Central

    Bible, Paul W.; Kanno, Yuka; Wei, Lai; Brooks, Stephen R.; O’Shea, John J.; Morasso, Maria I.; Loganantharaj, Rasiah; Sun, Hong-Wei

    2015-01-01

    Comparative co-localization analysis of transcription factors (TFs) and epigenetic marks (EMs) in specific biological contexts is one of the most critical areas of ChIP-Seq data analysis beyond peak calling. Yet there is a significant lack of user-friendly and powerful tools geared towards exploratory research based on co-localization analysis. Most tools currently used for co-localization analysis are command line only and require extensive installation procedures and Linux expertise. Online tools partially address the usability issues of command line tools, but slow response times and few customization features make them unsuitable for rapid data-driven interactive exploratory research. We have developed PAPST: Peak Assignment and Profile Search Tool, a user-friendly yet powerful platform with a unique design, which integrates both gene-centric and peak-centric co-localization analysis into a single package. Most of PAPST’s functions can be completed in less than five seconds, allowing quick cycles of data-driven hypothesis generation and testing. With PAPST, a researcher with or without computational expertise can perform sophisticated co-localization pattern analysis of multiple TFs and EMs, either against all known genes or a set of genomic regions obtained from public repositories or prior analysis. PAPST is a versatile, efficient, and customizable tool for genome-wide data-driven exploratory research. Creatively used, PAPST can be quickly applied to any genomic data analysis that involves a comparison of two or more sets of genomic coordinate intervals, making it a powerful tool for a wide range of exploratory genomic research. We first present PAPST’s general-purpose features and then apply it to several public ChIP-Seq data sets to demonstrate its rapid execution and potential for cutting-edge research, with a case study in enhancer analysis. To our knowledge, PAPST is the first software of its kind to provide efficient and sophisticated post-peak-calling ChIP-Seq data analysis as an easy-to-use interactive application. PAPST is available at https://github.com/paulbible/papst and is a public domain work. PMID:25970601

  2. Modeling of threonine requirement in fast-growing chickens, depending on age, sex, protein deposition, and dietary threonine efficiency.

    PubMed

    Samadi; Liebert, F

    2006-11-01

    In addition to dose-response studies, modeling of N utilization, depending on intake of the first limiting amino acid in the diet, is one of the tools for assessing amino acid requirements in growing animals. Based on a verified nonlinear N-utilization model and following the principles of the diet dilution technique, N-balance experiments were conducted to estimate the Thr requirement of fast-growing chickens (genotype Cobb), depending on age, sex, CP deposition, and efficiency of dietary Thr utilization. Different predictions of feed intake were made to derive the optimal Thr concentration in the feed. The results are based on N-balance experiments with a total of 144 male and 144 female growing chickens within 4 age periods (I: 10 to 25 d; II: 30 to 45 d; III: 50 to 65 d; IV: 70 to 85 d), using diets with graded protein supply (6.6, 13, 19.6, 25.1, 31.8, and 37.6% CP in DM) from high-protein soybean meal with a constant amino acid ratio and Thr as the first limiting amino acid (3.87 g of Thr/100 g of CP; dietary Lys:Thr = 1:0.54). The observed optimal Thr concentration (% of feed) was influenced by age, sex, level of CP deposition, dietary efficiency of Thr utilization, and predicted feed intake. For male chickens, assuming an average CP deposition (60% of the potential) and average efficiency of Thr utilization, 0.78% (10 to 25 d), 0.73% (30 to 45 d), 0.65% (50 to 65 d), and 0.55% (70 to 85 d) total dietary Thr were observed as the optimal total Thr concentrations in the diet (corresponding to 60, 135, 160, and 180 g of daily feed intake, respectively). Data are discussed in the context of the main factors of influence, such as age, sex, level of daily CP deposition, efficiency of dietary Thr utilization, and predicted feed intake.
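
    The abstract does not reproduce the underlying equations, so the sketch below shows only a generic exponential-saturation form of the kind of nonlinear N-utilization model referred to above; the functional form, parameter names, and numbers are illustrative assumptions, not the authors' parameterization.

        # Illustrative only: a generic saturation model in which N deposition
        # approaches a genetic maximum as intake of the limiting amino acid rises.
        # All coefficients below are made up.
        import numpy as np

        def n_deposition(n_intake, nd_max, b):
            """Daily N deposition approaching the maximum nd_max with intake."""
            return nd_max * (1.0 - np.exp(-b * n_intake))

        def required_intake(target_nd, nd_max, b):
            """Invert the model: intake needed to hit a target N deposition."""
            return -np.log(1.0 - target_nd / nd_max) / b

        # Toy numbers: aim for 60% of the deposition potential.
        nd_max, b = 3.0, 0.8          # hypothetical units
        print(required_intake(0.6 * nd_max, nd_max, b))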

  3. Efficiently modeling neural networks on massively parallel computers

    NASA Technical Reports Server (NTRS)

    Farber, Robert M.

    1993-01-01

    Neural networks are a very useful tool for analyzing and modeling complex real world systems. Applying neural network simulations to real world problems generally involves large amounts of data and massive amounts of computation. To efficiently handle the computational requirements of large problems, we have implemented at Los Alamos a highly efficient neural network compiler for serial computers, vector computers, vector parallel computers, and fine grain SIMD computers such as the CM-2 connection machine. This paper describes the mapping used by the compiler to implement feed-forward backpropagation neural networks for a SIMD (Single Instruction Multiple Data) architecture parallel computer. Thinking Machines Corporation has benchmarked our code at 1.3 billion interconnects per second (approximately 3 gigaflops) on a 64,000 processor CM-2 connection machine (Singer 1990). This mapping is applicable to other SIMD computers and can be implemented on MIMD computers such as the CM-5 connection machine. Our mapping has virtually no communications overhead, with the exception of the communications required for a global summation across the processors (which has sub-linear runtime growth, on the order of O(log(number of processors))). We can efficiently model very large neural networks which have many neurons and interconnects, and our mapping can extend to arbitrarily large networks (within memory limitations) by merging the memory space of separate processors with fast adjacent-processor communications. This paper considers only the simulation of feed-forward neural networks, although the method is extendable to recurrent networks.
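
    A conceptual NumPy sketch of the data-parallel mapping described above is given below: each virtual processor holds a slice of the training batch, gradients are computed locally, and the only cross-processor communication is a single global summation. It illustrates the idea only and is not the CM-2 compiler's code.

        import numpy as np

        rng = np.random.default_rng(0)
        W = rng.normal(size=(4, 3))            # weights of a tiny one-layer network
        X = rng.normal(size=(8, 4))            # batch of 8 examples
        Y = rng.normal(size=(8, 3))            # targets

        def local_gradient(W, x_slice, y_slice):
            """Forward and backward pass on one processor's slice of the batch."""
            pred = np.tanh(x_slice @ W)
            err = pred - y_slice
            return x_slice.T @ (err * (1.0 - pred ** 2))   # dL/dW for a squared loss

        # Split the batch across 4 virtual processors, compute gradients independently,
        # then combine them with a single global sum (the O(log P) reduction step).
        partials = [local_gradient(W, xs, ys)
                    for xs, ys in zip(np.array_split(X, 4), np.array_split(Y, 4))]
        W -= 0.01 * np.sum(partials, axis=0)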

  4. The ISO 50001 Impact Estimator Tool (IET 50001 V1.1.4) - User Guide and Introduction to the ISO 50001 Impacts Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Therkelsen, Peter L.; Rao, Prakash; Aghajanzadeh, Arian

    ISO 50001-Energy management systems – Requirements with guidance for use, is an internationally developed standard that provides organizations with a flexible framework for implementing an energy management system (EnMS) with the goal of continual energy performance improvement. The ISO 50001 standard was first published in 2011 and has since seen growth in the number of certificates issued around the world, primarily in the industrial (agriculture, manufacturing, and mining) and service (commercial) sectors. Policy makers in many regions and countries are looking to or are already using ISO 50001 as a basis for energy efficiency, carbon reduction, and other energy performance improvement schemes. The Impact Estimator Tool 50001 (IET 50001 Tool) is a computational model developed to assist researchers and policy makers determine the potential impact of ISO 50001 implementation in the industrial and service (commercial) sectors for a given region or country. The IET 50001 Tool is based upon a methodology initially developed by the Lawrence Berkeley National Laboratory that has been improved upon and vetted by a group of international researchers. By using a commonly accepted and transparent methodology, users of the IET 50001 Tool can easily and clearly communicate the potential impact of ISO 50001 for a region or country.

  5. MOSAIC--A Modular Approach to Data Management in Epidemiological Studies.

    PubMed

    Bialke, M; Bahls, T; Havemann, C; Piegsa, J; Weitmann, K; Wegner, T; Hoffmann, W

    2015-01-01

    In the context of an increasing number of multi-centric studies providing data from different sites and sources, the necessity for central data management (CDM) becomes undeniable. This is exacerbated by a multiplicity of featured data types, formats and interfaces. In relation to methodological medical research, the definition of central data management needs to be broadened beyond the simple storage and archiving of research data. This paper highlights typical requirements of CDM for cohort studies and registries and illustrates how orientation for CDM can be provided by addressing selected data management challenges. Therefore, in the first part of this paper, a short review summarises technical, organisational and legal challenges for CDM in cohort studies and registries. A deduced set of typical requirements of CDM in epidemiological research follows. In the second part the MOSAIC project is introduced (a modular systematic approach to implement CDM). The modular nature of MOSAIC contributes to managing both technical and organisational challenges efficiently by providing practical tools. A short presentation of a first set of tools, aimed at selected CDM requirements in cohort studies and registries, comprises a template for comprehensive documentation of data protection measures and an interactive reference portal for gaining insights and sharing experiences, supplemented by modular software tools for generation and management of generic pseudonyms, for participant management and for sophisticated consent management. Altogether, work within MOSAIC addresses existing challenges in epidemiological research in the context of CDM and facilitates the standardized collection of data with pre-programmed modules and provided document templates. The necessary effort for in-house programming is reduced, which accelerates the start of data collection.
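
    As a small illustration of one of the modules mentioned above, the sketch below shows generic pseudonym generation with a check character and an in-memory registry; it is not the MOSAIC software itself, and all names are invented.

        import secrets
        import string

        ALPHABET = string.ascii_uppercase + string.digits

        def new_pseudonym(length=8):
            """Random pseudonym plus a simple mod-36 check character to catch
            transcription errors."""
            body = "".join(secrets.choice(ALPHABET) for _ in range(length))
            check = ALPHABET[sum(ALPHABET.index(c) for c in body) % len(ALPHABET)]
            return body + check

        class PseudonymRegistry:
            """In-memory stand-in for a pseudonymization service: identifying data
            stays on one side, research data is keyed only by the pseudonym."""
            def __init__(self):
                self._by_identity = {}

            def pseudonym_for(self, identity_key):
                if identity_key not in self._by_identity:
                    self._by_identity[identity_key] = new_pseudonym()
                return self._by_identity[identity_key]

        registry = PseudonymRegistry()
        print(registry.pseudonym_for("participant-0001"))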

  6. A web platform for integrated surface water - groundwater modeling and data management

    NASA Astrophysics Data System (ADS)

    Fatkhutdinov, Aybulat; Stefan, Catalin; Junghanns, Ralf

    2016-04-01

    Model-based decision support systems are considered to be reliable and time-efficient tools for resource management in various hydrology-related fields. However, searching for and acquiring the required data, preparing the data sets for simulations, as well as post-processing, visualizing and publishing the simulation results often require significantly more work and time than performing the modeling itself. The purpose of the developed software is to combine data storage facilities, data processing instruments and modeling tools in a single platform, which can potentially reduce the time required for performing simulations and hence for decision making. The system is developed within the INOWAS (Innovative Web Based Decision Support System for Water Sustainability under a Changing Climate) project. The platform integrates spatially distributed catchment-scale rainfall-runoff, infiltration and groundwater flow models with data storage, processing and visualization tools. The concept is implemented in the form of a web-GIS application and is built from free and open-source components, including the PostgreSQL database management system, the Python programming language for modeling purposes, MapServer for visualization and publishing of the data, OpenLayers for building the user interface, and others. The configuration of the system allows data input, storage, pre- and post-processing and visualization to be performed in a single uninterrupted workflow. In addition, realization of the decision support system in the form of a web service provides an opportunity to easily retrieve and share data sets as well as simulation results over the internet, which gives significant advantages for collaborative work on projects and can significantly increase the usability of the decision support system.

  7. Generalizable items of quantitative and qualitative cornerstones for personnel requirement of physicians in anesthesia

    PubMed Central

    Weiss, Manfred; Rossaint, Rolf; Iber, Thomas

    2017-01-01

    Anesthesiologists perform a broad spectrum of tasks. However, in many countries, there is no legal basis for personnel staffing of physicians in anesthesia. Also, the German diagnosis-related groups system for reimbursement does not deliver such a basis. Thus, in 2006 a new calculation basis for the personnel requirement, which included an Excel calculation sheet, was introduced by the German Board of Anesthesiologists (BDA) and the German Society of Anesthesiology and Intensive Care Medicine (DGAI), and updated in 2009 and 2015. Oriented primarily to organizational needs, in 2015 BDA/DGAI defined quantitative and qualitative cornerstones for the personnel requirement of physicians in anesthesia, especially reflecting recent laws governing physicians' working conditions and competence in the field of anesthesia, as well as the demands of strengthened legal rights of patients, patient care and safety. We present a workload-oriented model, integrating core working hours, shift work or standby duty, quality of care, efficiency of processes, legal, educational, controlling, local, organizational and economic aspects for calculating personnel demands. Auxiliary tables enable physicians to calculate personnel demands due to differing employee workload, non-patient-oriented tasks and reimbursement of full-time equivalents due to parental leave, prohibition of employment, or long-term illness. After 10 years of experience with the first calculation tool, we report the generalizable key aspects and items of a necessary calculation tool which may help physicians to justify realistic workload-oriented personnel staffing demands in anesthesia. The modular, flexible nature of such a calculation tool should allow adaptation to the respective legal and organizational demands of different countries. PMID:28529910
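
    To make the kind of workload-oriented calculation concrete, the sketch below runs through illustrative full-time-equivalent arithmetic with invented figures; it is not the BDA/DGAI Excel tool, and its parameters are assumptions only.

        # Illustrative arithmetic only (the figures are invented): a workload-oriented
        # full-time-equivalent (FTE) calculation including non-patient-oriented tasks
        # and reimbursement for long-term absences.
        def required_ftes(anesthesia_hours_per_year,
                          nonpatient_fraction=0.20,        # teaching, QM, administration
                          net_hours_per_fte=1540.0,        # after leave, illness, training
                          absence_reimbursement_fte=0.5):  # parental leave, long-term illness
            patient_bound_ftes = anesthesia_hours_per_year / net_hours_per_fte
            total = patient_bound_ftes / (1.0 - nonpatient_fraction)
            return total + absence_reimbursement_fte

        # Example: 24,000 anesthesia-bound hours per year in the operating rooms.
        print(f"{required_ftes(24000):.1f} physician FTEs")   # about 20.0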

  8. Development of an Efficient Genome Editing Tool in Bacillus licheniformis Using CRISPR-Cas9 Nickase.

    PubMed

    Li, Kaifeng; Cai, Dongbo; Wang, Zhangqian; He, Zhili; Chen, Shouwen

    2018-03-15

    Bacillus strains are important industrial bacteria that can produce various biochemical products. However, low transformation efficiencies and a lack of effective genome editing tools have hindered their widespread application. Recently, clustered regularly interspaced short palindromic repeat (CRISPR)-Cas9 techniques have been utilized in many organisms as genome editing tools because of their high efficiency and easy manipulation. In this study, an efficient genome editing method was developed for Bacillus licheniformis using a CRISPR-Cas9 nickase integrated into the genome of B. licheniformis DW2 with overexpression driven by the P43 promoter. The yvmC gene was deleted using the CRISPR-Cas9n technique with homology arms of 1.0 kb as a representative example, and an efficiency of 100% was achieved. In addition, two genes were simultaneously disrupted with an efficiency of 11.6%, and the large DNA fragment bacABC (42.7 kb) was deleted with an efficiency of 79.0%. Furthermore, the heterologous reporter gene aprN, which codes for nattokinase in Bacillus subtilis, was inserted into the chromosome of B. licheniformis with an efficiency of 76.5%. The activity of nattokinase in the DWc9nΔ7/pP43SNT-SsacC strain reached 59.7 fibrinolytic units (FU)/ml, which was 25.7% higher than that of DWc9n/pP43SNT-SsacC. Finally, the engineered strain DWc9nΔ7 (Δepr ΔwprA Δmpr ΔaprE Δvpr ΔbprA ΔbacABC), with multiple disrupted genes, was constructed using the CRISPR-Cas9n technique. Taken together, we have developed an efficient genome editing tool based on CRISPR-Cas9n in B. licheniformis. This tool could be applied to strain improvement for future research. IMPORTANCE As important industrial bacteria, Bacillus strains have attracted significant attention due to their production of biological products. However, genetic manipulation of these bacteria is difficult. The CRISPR-Cas9 system has been applied to genome editing in some bacteria, and CRISPR-Cas9n was proven to be an efficient and precise tool in previous reports. The significance of our research is the development of an efficient, more precise, and systematic genome editing method for single-gene deletion, multiple-gene disruption, large DNA fragment deletion, and single-gene integration in Bacillus licheniformis via Cas9 nickase. We also applied this method to the genetic engineering of the host strain for protein expression. Copyright © 2018 American Society for Microbiology.

  9. Trial Promoter: A Web-Based Tool for Boosting the Promotion of Clinical Research Through Social Media.

    PubMed

    Reuter, Katja; Ukpolo, Francis; Ward, Edward; Wilson, Melissa L; Angyan, Praveen

    2016-06-29

    Scarce information about clinical research, in particular clinical trials, is among the top reasons why potential participants do not take part in clinical studies. Without volunteers, on the other hand, clinical research and the development of novel approaches to preventing, diagnosing, and treating disease are impossible. Promising digital options such as social media have the potential to work alongside traditional methods to boost the promotion of clinical research. However, investigators and research institutions are challenged to leverage these innovations while saving time and resources. The objective of this study was to develop and test the efficiency of a Web-based tool that automates the generation and distribution of user-friendly social media messages about clinical trials. Trial Promoter is developed in Ruby on Rails, HTML, cascading style sheets (CSS), and JavaScript. In order to test the tool and the correctness of the generated messages, clinical trials (n=46) were randomized into social media messages and distributed via the microblogging social media platform Twitter and the social network Facebook. The percent correct was calculated to determine the probability with which Trial Promoter generates accurate messages. During a 10-week testing phase, Trial Promoter automatically generated and published 525 user-friendly social media messages on Twitter and Facebook. On average, Trial Promoter correctly used the message templates and substituted the message parameters (text, URLs, and disease hashtags) 97.7% of the time (1563/1600). Trial Promoter may serve as a promising tool to render clinical trial promotion more efficient while requiring limited resources. It supports the distribution of any research or other types of content. The Trial Promoter code and installation instructions are freely available online.
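
    The message-generation step can be pictured with the short sketch below. It is written in Python for illustration (the actual tool is a Ruby on Rails application), and the template, field names, and trial record are invented.

        # Conceptual sketch: filling a message template with trial-specific parameters
        # (text, URL, disease hashtag) before posting to a social platform.
        TEMPLATE = "{condition_hashtag} study now enrolling: {title_short} {url}"

        def build_message(trial, template=TEMPLATE, limit=280):
            """Substitute the parameters, respecting a platform length limit."""
            message = template.format(
                condition_hashtag=trial["condition_hashtag"],
                title_short=trial["title_short"],
                url=trial["url"],
            )
            if len(message) > limit:
                raise ValueError("message exceeds platform limit")
            return message

        trial = {  # made-up example record
            "condition_hashtag": "#lungcancer",
            "title_short": "Immunotherapy combination trial",
            "url": "https://example.org/NCT00000000",
        }
        print(build_message(trial))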

  10. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides spatiotemporal computational models and advanced geospatial visualization tools that deal with other domains involving spatial properties. We tested the performance of the platform with a taxi trajectory analysis. Results suggest that GISpark achieves excellent runtime performance in spatiotemporal big data applications.
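
    A minimal PySpark sketch of the kind of trajectory aggregation mentioned in the evaluation is shown below; it is not GISpark code, and the input path, column names, and grid size are assumptions.

        # Bin taxi GPS points into a coarse grid and count points per cell.
        from pyspark.sql import SparkSession
        from pyspark.sql.functions import col, floor

        spark = SparkSession.builder.appName("taxi-grid-demo").getOrCreate()

        points = spark.read.csv("hdfs:///data/taxi_points.csv",
                                header=True, inferSchema=True)  # lon, lat columns assumed

        cell_deg = 0.01   # roughly 1 km grid cells at mid latitudes
        grid_counts = (points
                       .withColumn("gx", floor(col("lon") / cell_deg))
                       .withColumn("gy", floor(col("lat") / cell_deg))
                       .groupBy("gx", "gy")
                       .count()
                       .orderBy(col("count").desc()))

        grid_counts.show(10)   # ten busiest cells
        spark.stop()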

  11. Fast-NPS-A Markov Chain Monte Carlo-based analysis tool to obtain structural information from single-molecule FRET measurements

    NASA Astrophysics Data System (ADS)

    Eilert, Tobias; Beckers, Maximilian; Drechsler, Florian; Michaelis, Jens

    2017-10-01

    The analysis tool and software package Fast-NPS can be used to analyse smFRET data to obtain quantitative structural information about macromolecules in their natural environment. In the algorithm a Bayesian model gives rise to a multivariate probability distribution describing the uncertainty of the structure determination. Since Fast-NPS aims to be an easy-to-use general-purpose analysis tool for a large variety of smFRET networks, we established an MCMC-based sampling engine that approximates the target distribution and requires no parameter specification by the user at all. For efficient local exploration we automatically adapt the multivariate proposal kernel according to the shape of the target distribution. In order to handle multimodality, the sampler is equipped with a parallel tempering scheme that is fully adaptive with respect to temperature spacing and number of chains. Since the molecular surrounding of a dye molecule affects its spatial mobility and thus the smFRET efficiency, we introduce dye models which can be selected for every dye molecule individually. These models allow the user to represent the smFRET network in great detail, leading to an increased localisation precision. Finally, a tool to validate the chosen model combination is provided. Programme files doi: http://dx.doi.org/10.17632/7ztzj63r68.1. Licencing provisions: Apache-2.0. Programming language: GUI in MATLAB (The MathWorks); core sampling engine in C++. Nature of problem: sampling of highly diverse multivariate probability distributions in order to solve for macromolecular structures from smFRET data. Solution method: MCMC algorithm with fully adaptive proposal kernel and parallel tempering scheme.
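
    The sampling scheme described above can be illustrated with the toy Python sketch below: random-walk Metropolis chains on a temperature ladder with occasional replica-exchange swaps. The Fast-NPS engine is written in C++ and adapts both the proposal kernel and the ladder; the fixed settings here are simplifying assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def log_target(x):
            """Deliberately bimodal 1-D target to motivate tempering."""
            return np.logaddexp(-0.5 * ((x - 3.0) / 0.5) ** 2,
                                -0.5 * ((x + 3.0) / 0.5) ** 2)

        temps = np.array([1.0, 2.0, 4.0, 8.0])     # fixed ladder; Fast-NPS adapts this
        states = rng.normal(size=temps.size)
        samples = []

        for step in range(20000):
            # Within-chain Metropolis updates (proposal scale grows with temperature).
            for i, T in enumerate(temps):
                prop = states[i] + rng.normal(scale=0.5 * np.sqrt(T))
                if np.log(rng.random()) < (log_target(prop) - log_target(states[i])) / T:
                    states[i] = prop
            # Replica-exchange swap between a random adjacent pair of temperatures.
            i = rng.integers(temps.size - 1)
            a = (1 / temps[i] - 1 / temps[i + 1]) * (log_target(states[i + 1]) - log_target(states[i]))
            if np.log(rng.random()) < a:
                states[i], states[i + 1] = states[i + 1], states[i]
            samples.append(states[0])              # keep only the cold (T = 1) chain

        # Fraction of cold-chain samples in the right-hand mode; the swaps are what
        # let the chain visit both modes of the bimodal target.
        print(np.mean(np.array(samples) > 0))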

  12. Trial Promoter: A Web-Based Tool for Boosting the Promotion of Clinical Research Through Social Media

    PubMed Central

    Ukpolo, Francis; Ward, Edward; Wilson, Melissa L

    2016-01-01

    Background Scarce information about clinical research, in particular clinical trials, is among the top reasons why potential participants do not take part in clinical studies. Without volunteers, on the other hand, clinical research and the development of novel approaches to preventing, diagnosing, and treating disease are impossible. Promising digital options such as social media have the potential to work alongside traditional methods to boost the promotion of clinical research. However, investigators and research institutions are challenged to leverage these innovations while saving time and resources. Objective To develop and test the efficiency of a Web-based tool that automates the generation and distribution of user-friendly social media messages about clinical trials. Methods Trial Promoter is developed in Ruby on Rails, HTML, cascading style sheet (CSS), and JavaScript. In order to test the tool and the correctness of the generated messages, clinical trials (n=46) were randomized into social media messages and distributed via the microblogging social media platform Twitter and the social network Facebook. The percent correct was calculated to determine the probability with which Trial Promoter generates accurate messages. Results During a 10-week testing phase, Trial Promoter automatically generated and published 525 user-friendly social media messages on Twitter and Facebook. On average, Trial Promoter correctly used the message templates and substituted the message parameters (text, URLs, and disease hashtags) 97.7% of the time (1563/1600). Conclusions Trial Promoter may serve as a promising tool to render clinical trial promotion more efficient while requiring limited resources. It supports the distribution of any research or other types of content. The Trial Promoter code and installation instructions are freely available online. PMID:27357424

  13. A review of Integrated Vehicle Health Management tools for legacy platforms: Challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Esperon-Miguez, Manuel; John, Philip; Jennions, Ian K.

    2013-01-01

    Integrated Vehicle Health Management (IVHM) comprises a set of tools, technologies and techniques for automated detection, diagnosis and prognosis of faults in order to support platforms more efficiently. Specific challenges are faced when IVHM tools are to be retrofitted into legacy vehicles, since major modifications are much more challenging than with platforms whose design can still be modified. The topics covered in this review include the state of the art of IVHM tools and how their characteristics match the requirements of legacy aircraft, a summary of problems faced in the past trying to retrofit IVHM tools, both from a technical and an organisational perspective, and the current level of implementation of IVHM in industry. Although the technology has not reached the level necessary to implement IVHM to its full potential on every kind of component, significant progress has been achieved on rotating equipment, structures and electronics. Attempts to retrofit some of these tools in the past faced both technical difficulties and opposition from some stakeholders, the latter being responsible for the failure of technically sound projects on more than one occasion. Nevertheless, despite these difficulties, products and services based on IVHM technology have started to be offered by the manufacturers and, what is more important, demanded by the operators, providing guidance on what the industry would demand from IVHM on legacy aircraft.

  14. [Optimization of end-tool parameters based on robot hand-eye calibration].

    PubMed

    Zhang, Lilong; Cao, Tong; Liu, Da

    2017-04-01

    A new one-time registration method was developed in this research for hand-eye calibration of a surgical robot to simplify the operation process and reduce the preparation time. A new and practical method is also introduced to optimize the end-tool parameters of the surgical robot, based on an analysis of the error sources in this registration method. In the one-time registration method, a marker on the end-tool of the robot was first recognized by a fixed binocular camera, and the orientation and position of the marker were then calculated based on the joint parameters of the robot. The relationship between the camera coordinate system and the robot base coordinate system could then be established to complete the hand-eye calibration. Because of manufacturing and assembly errors of the robot end-tool, an error equation was established with the transformation matrix between the robot end coordinate system and the robot end-tool coordinate system as the variable. Numerical optimization was employed to optimize the end-tool parameters of the robot. The experimental results showed that the one-time registration method could significantly improve the efficiency of robot hand-eye calibration compared with existing methods. The parameter optimization method could significantly improve the absolute positioning accuracy of the one-time registration method. The absolute positioning accuracy of the one-time registration method can meet the requirements of clinical surgery.
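
    The optimization step can be illustrated with the Python sketch below, which refines a constant end-tool offset against synthetic marker measurements using least squares; it is not the paper's exact error equation, and all numbers are invented.

        import numpy as np
        from scipy.optimize import least_squares

        # Synthetic data: N flange poses (rotation R_i, translation t_i) and the
        # corresponding marker positions p_i observed in the robot base frame.
        rng = np.random.default_rng(42)
        true_offset = np.array([0.01, -0.02, 0.15])      # metres, made up

        def random_rotation():
            q = rng.normal(size=4); q /= np.linalg.norm(q)
            w, x, y, z = q
            return np.array([[1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
                             [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
                             [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)]])

        poses = [(random_rotation(), rng.uniform(-0.5, 0.5, 3)) for _ in range(20)]
        measured = [R @ true_offset + t + rng.normal(scale=1e-4, size=3) for R, t in poses]

        def residuals(d):
            """Mismatch between predicted tool-tip positions (flange pose applied to
            the candidate offset d) and the camera measurements."""
            return np.concatenate([R @ d + t - p for (R, t), p in zip(poses, measured)])

        fit = least_squares(residuals, x0=np.zeros(3))
        print(fit.x)     # recovers the tool offset to within the measurement noise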

  15. Requirements for clinical information modelling tools.

    PubMed

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; should a certified list of tools one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. This list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Commercial Building Energy Asset Score

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software (Asset Scoring Tool) is designed to help building owners and managers gain insight into the as-built efficiency of their buildings. It is a web tool where users can enter their building information and obtain an asset score report. The asset score report consists of modeled building energy use (by end use and by fuel type), building systems (envelope, lighting, heating, cooling, service hot water) evaluations, and recommended energy efficiency measures. The intended users are building owners and operators who have limited knowledge of building energy efficiency. The scoring tool collects minimal building data (~20 data entries) from users and builds a full-scale energy model using the inference functionalities from the Facility Energy Decision System (FEDS). The scoring tool runs real-time building energy simulation using EnergyPlus and performs life-cycle cost analysis using FEDS. An API is also under development to allow third-party applications to exchange data with the web service of the scoring tool.
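
    The sketch below illustrates how a third-party application might push building data to a scoring web service once the API is available; the endpoint URL, field names, and token are entirely hypothetical and are not the tool's documented interface.

        import requests

        BUILDING = {               # roughly the kind of minimal entries described above
            "name": "Example Office",
            "gross_floor_area_m2": 4600,
            "year_built": 1998,
            "use_type": "office",
            "hvac_system": "packaged_vav_with_reheat",
            "wall_construction": "mass",
            "window_to_wall_ratio": 0.35,
        }

        response = requests.post(
            "https://scoring.example.org/api/v1/buildings",   # hypothetical endpoint
            json=BUILDING,
            headers={"Authorization": "Bearer <token>"},
            timeout=30,
        )
        response.raise_for_status()
        print(response.json())     # e.g. modeled end-use breakdown and score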

  17. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high-fidelity geometry-based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing studies and weight optimization. The high-quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications to the structural design of a conventional aircraft and a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high-fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  18. NASA's Planetary Data System: Support for the Delivery of Derived Data Sets at the Atmospheres Node

    NASA Astrophysics Data System (ADS)

    Chanover, Nancy J.; Beebe, Reta; Neakrase, Lynn; Huber, Lyle; Rees, Shannon; Hornung, Danae

    2015-11-01

    NASA’s Planetary Data System (PDS) is charged with archiving electronic data products from NASA planetary missions that are sponsored by NASA’s Science Mission Directorate. This archive, currently organized by science disciplines, uses standards for describing and storing data that are designed to enable future scientists who are unfamiliar with the original experiments to analyze the data, and to do this using a variety of computer platforms, with no additional support. These standards address the data structure, description contents, and media design. The new requirement in the NASA ROSES-2015 Research Announcement to include a Data Management Plan will result in an increase in the number of derived data sets that are being delivered to the PDS. These data sets may come from the Planetary Data Archiving, Restoration and Tools (PDART) program, other Data Analysis Programs (DAPs), or be volunteered by individuals who are publishing the results of their analysis. In response to this increase, the PDS Atmospheres Node is developing a set of guidelines and user tools to make the process of archiving these derived data products more efficient. Here we provide a description of Atmospheres Node resources, including a letter of support for the proposal stage, a communication schedule for the planned archive effort, product label samples and templates in Extensible Markup Language (XML), documentation templates, and the validation tools necessary for producing PDS4-compliant derived data bundles efficiently and accurately.
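
    The sketch below generates a deliberately simplified XML product label with the Python standard library to illustrate the kind of template the node provides; it is not a schema-valid PDS4 label, and the identifiers and file names are invented.

        import xml.etree.ElementTree as ET

        def make_label(product_id, title, file_name, creation_date):
            """Build a small, structurally simplified product label."""
            root = ET.Element("Product_Observational")      # element set abbreviated
            ident = ET.SubElement(root, "Identification_Area")
            ET.SubElement(ident, "logical_identifier").text = product_id
            ET.SubElement(ident, "title").text = title
            file_area = ET.SubElement(root, "File_Area_Observational")
            f = ET.SubElement(file_area, "File")
            ET.SubElement(f, "file_name").text = file_name
            ET.SubElement(f, "creation_date_time").text = creation_date
            return ET.ElementTree(root)

        label = make_label("urn:nasa:pds:example_bundle:data:profile_001",
                           "Derived temperature profile", "profile_001.csv",
                           "2015-07-01T00:00:00Z")
        label.write("profile_001.xml", encoding="utf-8", xml_declaration=True)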

  19. PGASO: A synthetic biology tool for engineering a cellulolytic yeast

    PubMed Central

    2012-01-01

    Background To achieve economical cellulosic ethanol production, a host that can perform both cellulosic saccharification and ethanol fermentation is desirable. However, engineering a non-cellulolytic yeast to be such a host requires synthetic biology techniques to transform multiple enzyme genes into its genome. Results A technique, named Promoter-based Gene Assembly and Simultaneous Overexpression (PGASO), that employs overlapping oligonucleotides for recombinatorial assembly of gene cassettes with individual promoters, was developed. PGASO was applied to engineer Kluyveromyces marxianus KY3, which is a thermo- and toxin-tolerant yeast. We obtained a recombinant strain, called KR5, that is capable of simultaneously expressing an exoglucanase and an endoglucanase (both from Trichoderma reesei), a beta-glucosidase (from a cow rumen fungus), a neomycin phosphotransferase, and a green fluorescent protein. High transformation efficiency and accuracy were achieved, as ~63% of the transformants were confirmed to be correct. KR5 can utilize beta-glycan, cellobiose or CMC as the sole carbon source for growth and can directly convert cellobiose and beta-glycan to ethanol. Conclusions This study provides the first example of multi-gene assembly in a single step in a yeast species other than Saccharomyces cerevisiae. We successfully engineered a yeast host with a five-gene cassette assembly and the new host is capable of co-expressing three types of cellulase genes. Our study shows that PGASO is an efficient tool for simultaneous expression of multiple enzymes in the kefir yeast KY3 and that KY3 can serve as a host for developing synthetic biology tools. PMID:22839502

  20. The JPL functional requirements tool

    NASA Technical Reports Server (NTRS)

    Giffin, Geoff; Skinner, Judith; Stoller, Richard

    1987-01-01

    Planetary spacecraft are complex vehicles which are built according to many thousands of requirements. Problems encountered in documenting and maintaining these requirements led to the current attempt to reduce or eliminate these problems with a computer-automated database, the Functional Requirements Tool. The tool, developed at JPL and in use on several JPL projects, is described. The organization and functionality of the Tool, together with an explanation of the database inputs, their relationships, and their use, are presented. Methods of interfacing with external documents, representation of tables and figures, and methods of approval and change processing are discussed. The options available for disseminating information from the Tool are identified. The implementation of the Requirements Tool is outlined, and its operation is summarized. The conclusions drawn from this work are that the Requirements Tool represents a useful addition to the system engineer's tool kit, that it is not currently available elsewhere, and that a clear development path exists to expand the capabilities of the Tool to serve larger and more complex projects.
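
    As a conceptual illustration of what such a requirements database holds (the JPL tool itself is not publicly documented in code), the sketch below builds a miniature requirements table with traceability links using sqlite3; the requirement IDs and text are invented.

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
        CREATE TABLE requirement (
            req_id  TEXT PRIMARY KEY,
            text    TEXT NOT NULL,
            status  TEXT DEFAULT 'draft'        -- draft / approved / superseded
        );
        CREATE TABLE trace (
            parent_id TEXT REFERENCES requirement(req_id),
            child_id  TEXT REFERENCES requirement(req_id)
        );
        """)
        db.executemany("INSERT INTO requirement (req_id, text) VALUES (?, ?)", [
            ("SYS-001", "The spacecraft shall downlink science data at 115.2 kbps."),
            ("TEL-014", "The telecom subsystem shall support a 115.2 kbps downlink rate."),
        ])
        db.execute("INSERT INTO trace VALUES ('SYS-001', 'TEL-014')")

        # Report every child requirement derived from a system-level requirement.
        for row in db.execute("""
                SELECT r.req_id, r.text FROM trace t
                JOIN requirement r ON r.req_id = t.child_id
                WHERE t.parent_id = 'SYS-001'"""):
            print(row)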
