Sample records for high performance tools

  1. Comparison of Performance Predictions for New Low-Thrust Trajectory Tools

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Kos, Larry; Hopkins, Randall; Crane, Tracie

    2006-01-01

    Several low-thrust trajectory optimization tools have been developed over the last 3½ years by the Low Thrust Trajectory Tools development team. This toolset includes both low-to-medium fidelity and high fidelity tools, which allow the analyst to quickly research a wide mission trade space and perform advanced mission design. These tools were tested using a set of reference trajectories that exercised each tool's unique capabilities. This paper compares the performance predictions of the various tools against several of the reference trajectories. The intent is to verify agreement between the high fidelity tools and to quantify the performance prediction differences between tools of different fidelity levels.

  2. Debugging and Performance Analysis Software Tools for Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Learn about debugging and performance analysis software tools available for use with the Peregrine system, including Allinea.

  3. Laser Cladding of CPM Tool Steels on Hardened H13 Hot-Work Steel for Low-Cost High-Performance Automotive Tooling

    NASA Astrophysics Data System (ADS)

    Chen, J.; Xue, L.

    2012-06-01

    This paper summarizes our research on laser cladding of high-vanadium CPM® tool steels (3V, 9V, and 15V) onto the surfaces of low-cost hardened H13 hot-work tool steel to substantially enhance resistance against abrasive wear. The results provide great potential for fabricating high-performance automotive tooling (including molds and dies) at affordable cost. The microstructure and hardness development of the laser-clad tool steels so obtained are presented as well.

  4. Integration of tools for the Design and Assessment of High-Performance, Highly Reliable Computing Systems (DAHPHRS), phase 1

    NASA Technical Reports Server (NTRS)

    Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.

    1992-01-01

    Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.

  5. A Tool for Verification and Validation of Neural Network Based Adaptive Controllers for High Assurance Systems

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Schumann, Johann

    2004-01-01

    High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as for monitoring the network's performance during operation. The tool has been implemented in Simulink and simulation results on an F-15 aircraft are presented.
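
    The abstract does not spell out how the error bar is computed, so the sketch below is only a generic illustration of the idea of a confidence interval around a network's output, estimated here from the spread of a small bootstrap ensemble; the data, model size, and names are hypothetical and not from the paper.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)   # synthetic data, not flight data

    # Train a small bootstrap ensemble; the spread of its predictions gives an
    # approximate error bar around the mean output at a new input.
    models = []
    for seed in range(10):
        idx = rng.integers(0, len(X), len(X))               # bootstrap resample
        m = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=seed)
        m.fit(X[idx], y[idx])
        models.append(m)

    x_new = np.array([[0.25]])
    preds = np.array([m.predict(x_new)[0] for m in models])
    mean, std = preds.mean(), preds.std()
    print(f"prediction = {mean:.3f} +/- {1.96 * std:.3f} (approximate 95% interval)")
    ```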

  6. Optics assembly for high power laser tools

    DOEpatents

    Fraze, Jason D.; Faircloth, Brian O.; Zediker, Mark S.

    2016-06-07

    There is provided a high power laser rotational optical assembly for use with, or in high power laser tools for performing high power laser operations. In particular, the optical assembly finds applications in performing high power laser operations on, and in, remote and difficult to access locations. The optical assembly has rotational seals and bearing configurations to avoid contamination of the laser beam path and optics.

  7. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  8. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  9. Nanopore sequencing technology and tools for genome assembly: computational analysis of the current state, bottlenecks and future directions.

    PubMed

    Senol Cali, Damla; Kim, Jeremie S; Ghose, Saugata; Alkan, Can; Mutlu, Onur

    2018-04-02

    Nanopore sequencing technology has the potential to render other sequencing technologies obsolete with its ability to generate long reads and provide portability. However, high error rates of the technology pose a challenge while generating accurate genome assemblies. The tools used for nanopore sequence analysis are of critical importance, as they should overcome the high error rates of the technology. Our goal in this work is to comprehensively analyze current publicly available tools for nanopore sequence analysis to understand their advantages, disadvantages and performance bottlenecks. It is important to understand where the current tools do not perform well to develop better tools. To this end, we (1) analyze the multiple steps and the associated tools in the genome assembly pipeline using nanopore sequence data, and (2) provide guidelines for determining the appropriate tools for each step. Based on our analyses, we make four key observations: (1) the choice of the tool for basecalling plays a critical role in overcoming the high error rates of nanopore sequencing technology. (2) Read-to-read overlap finding tools, GraphMap and Minimap, perform similarly in terms of accuracy. However, Minimap has a lower memory usage, and it is faster than GraphMap. (3) There is a trade-off between accuracy and performance when deciding on the appropriate tool for the assembly step. The fast but less accurate assembler Miniasm can be used for quick initial assembly, and further polishing can be applied on top of it to increase the accuracy, which leads to faster overall assembly. (4) The state-of-the-art polishing tool, Racon, generates high-quality consensus sequences while providing a significant speedup over another polishing tool, Nanopolish. We analyze various combinations of different tools and expose the trade-offs between accuracy, performance, memory usage and scalability. We conclude that our observations can guide researchers and practitioners in making conscious and effective choices for each step of the genome assembly pipeline using nanopore sequence data. Also, with the help of bottlenecks we have found, developers can improve the current tools or build new ones that are both accurate and fast, to overcome the high error rates of the nanopore sequencing technology.
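
    As a concrete illustration of the pipeline discussed above (read overlap finding, fast assembly with Miniasm, then polishing), the sketch below chains the commonly documented minimap2, miniasm, and racon command-line interfaces from Python. The file names are placeholders, and the exact tools, versions, and flags evaluated in the paper (Minimap, GraphMap, Nanopolish, etc.) may differ.

    ```python
    import subprocess

    READS = "reads.fastq"  # basecalled nanopore reads (placeholder file name)

    def run(cmd, out_path):
        """Run a command and capture its stdout to a file."""
        with open(out_path, "w") as out:
            subprocess.run(cmd, stdout=out, check=True)

    # 1) All-vs-all read overlaps with minimap2 (ava-ont preset for nanopore reads).
    run(["minimap2", "-x", "ava-ont", READS, READS], "overlaps.paf")

    # 2) Fast (but less accurate) assembly with miniasm.
    run(["miniasm", "-f", READS, "overlaps.paf"], "assembly.gfa")

    # Pull the contig sequences out of the GFA so they can be polished.
    with open("assembly.gfa") as gfa, open("assembly.fasta", "w") as fa:
        for line in gfa:
            if line.startswith("S"):
                _, name, seq = line.rstrip("\n").split("\t")[:3]
                fa.write(f">{name}\n{seq}\n")

    # 3) Map reads back to the draft and polish with racon to raise accuracy.
    run(["minimap2", "-x", "map-ont", "assembly.fasta", READS], "mappings.paf")
    run(["racon", READS, "mappings.paf", "assembly.fasta"], "polished.fasta")
    ```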

  10. Software Tools on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    NREL has a variety of software tools on the Peregrine system, including debuggers and performance analysis tools for understanding the behavior of MPI applications, Intel VTune, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for visualization and analytics.

  11. Moving Large Data Sets Over High-Performance Long Distance Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodson, Stephen W; Poole, Stephen W; Ruwart, Thomas

    2011-04-01

    In this project we look at the performance characteristics of three tools used to move large data sets over dedicated long distance networking infrastructure. Although performance studies of wide area networks have been a frequent topic of interest, performance analyses have tended to focus on network latency characteristics and peak throughput using network traffic generators. In this study we instead perform an end-to-end long distance networking analysis that includes reading large data sets from a source file system and committing large data sets to a destination file system. An evaluation of end-to-end data movement is also an evaluation of the system configurations employed and the tools used to move the data. For this paper, we have built several storage platforms and connected them with a high performance long distance network configuration. We use these systems to analyze the capabilities of three data movement tools: BBcp, GridFTP, and XDD. Our studies demonstrate that existing data movement tools do not provide efficient performance levels or exercise the storage devices in their highest performance modes. We describe the device information required to achieve high levels of I/O performance and discuss how this data is applicable in use cases beyond data movement performance.
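
    A minimal sketch of the kind of end-to-end measurement described above (reading from a source file system, committing to a destination file system, and reporting effective throughput) is shown below; the chunk size, paths, and use of fsync are illustrative assumptions, not details from the study.

    ```python
    import os
    import time

    def end_to_end_throughput(src, dst, chunk_mb=64):
        """Copy src to dst in fixed-size chunks and return effective MB/s,
        including the time needed to commit the data to the destination file system."""
        chunk = chunk_mb * 1024 * 1024
        start = time.monotonic()
        with open(src, "rb") as fin, open(dst, "wb") as fout:
            while True:
                buf = fin.read(chunk)
                if not buf:
                    break
                fout.write(buf)
            fout.flush()
            os.fsync(fout.fileno())        # commit to the destination file system
        elapsed = time.monotonic() - start
        return os.path.getsize(src) / (1024 * 1024) / elapsed

    # Example call with placeholder paths:
    # print(end_to_end_throughput("/source_fs/dataset.bin", "/dest_fs/dataset.bin"))
    ```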

  12. FTAPE: A fault injection tool to measure fault tolerance

    NASA Technical Reports Server (NTRS)

    Tsai, Timothy K.; Iyer, Ravishankar K.

    1995-01-01

    The paper introduces FTAPE (Fault Tolerance And Performance Evaluator), a tool that can be used to compare fault-tolerant computers. The tool combines system-wide fault injection with a controllable workload. A workload generator is used to create high stress conditions for the machine. Faults are injected based on this workload activity in order to ensure a high level of fault propagation. The errors/fault ratio and performance degradation are presented as measures of fault tolerance.
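
    The two reported measures reduce to simple ratios; the sketch below illustrates them with hypothetical counts and timings, not data from the paper.

    ```python
    # Hypothetical fault-injection campaign results (illustrative numbers only).
    faults_injected = 500          # faults injected while the stress workload ran
    errors_observed = 410          # faults that propagated into observable errors
    baseline_runtime_s = 120.0     # workload runtime without fault injection
    faulty_runtime_s = 138.0       # workload runtime with fault injection active

    error_fault_ratio = errors_observed / faults_injected
    performance_degradation = (faulty_runtime_s - baseline_runtime_s) / baseline_runtime_s

    print(f"errors/fault ratio      : {error_fault_ratio:.2f}")
    print(f"performance degradation : {performance_degradation:.1%}")
    ```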

  13. Validation of a Low-Thrust Mission Design Tool Using Operational Navigation Software

    NASA Technical Reports Server (NTRS)

    Englander, Jacob A.; Knittel, Jeremy M.; Williams, Ken; Stanbridge, Dale; Ellison, Donald H.

    2017-01-01

    Design of flight trajectories for missions employing solar electric propulsion requires a suitably high-fidelity design tool. In this work, the Evolutionary Mission Trajectory Generator (EMTG) is presented as a medium-high fidelity design tool that is suitable for mission proposals. EMTG is validated against the high-heritage deep-space navigation tool MIRAGE, demonstrating both the accuracy of EMTG's model and an operational mission design and navigation procedure using both tools. The validation is performed using a benchmark mission to the Jupiter Trojans.

  14. High performance cutting using micro-textured tools and low pressure jet coolant

    NASA Astrophysics Data System (ADS)

    Obikawa, Toshiyuki; Nakatsukasa, Ryuta; Hayashi, Mamoru; Ohno, Tatsumi

    2018-05-01

    Tool inserts with different kinds of microtexture on the flank face were fabricated by laser irradiation to promote heat transfer from the tool face to the coolant. In addition to the micro-textured tools, jet coolant was applied to the tool tip from the side of the flank face, but under low-pressure conditions, to make the Reynolds number of the coolant as high as possible in the wedge-shaped zone between the tool flank and the machined surface. First, the effect of jet coolant on the flank wear evolution was investigated using a tool without microtexture. The jet coolant showed an excellent improvement of tool life in machining stainless steel SUS304 at higher cutting speeds. It was found that both the flow rate and the velocity of the jet coolant were indispensable to high performance cutting. Next, the effect of microtexture on the flank wear evolution was investigated using jet coolant. Three types of micro grooves extended tool life considerably compared to the tool without microtexture. It was found that the depth of the groove was one of the important parameters affecting tool life extension. As a result, tool life was extended by more than 100% using the micro-textured tools and jet coolant compared to machining using flood coolant and a tool without microtexture.

  15. Development of a Brief Pre-Implementation Screening Tool to Identify Teachers Who Are at Risk for Not Implementing Intervention Curriculum and High-Implementing Teachers

    ERIC Educational Resources Information Center

    Wang, Bo; Stanton, Bonita; Lunn, Sonja; Patel, Pooja; Koci, Veronica; Deveaux, Lynette

    2017-01-01

    Few questionnaires have been developed to screen for potentially poor implementers of school-based interventions. This study combines teacher characteristics, perceptions, and teaching/training experiences to develop a short screening tool that can identify potential "low-performing" or "high-performing" teachers…

  16. Examining Students' Use of Online Annotation Tools in Support of Argumentative Reading

    ERIC Educational Resources Information Center

    Lu, Jingyan; Deng, Liping

    2013-01-01

    This study examined how students in a Hong Kong high school used Diigo, an online annotation tool, to support their argumentative reading activities. Two year 10 classes, a high-performance class (HPC) and an ordinary-performance class (OPC), highlighted passages of text and wrote and attached sticky notes to them to clarify argumentation…

  17. Trace Replay and Network Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acun, Bilge; Jain, Nikhil; Bhatele, Abhinav

    2015-03-23

    TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.

  18. Trace Replay and Network Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, Nikhil; Bhatele, Abhinav; Acun, Bilge

    TraceR is a trace replay tool built upon the ROSS-based CODES simulation framework. TraceR can be used for predicting network performance and understanding network behavior by simulating messaging in High Performance Computing applications on interconnection networks.

  19. HEPDOOP: High-Energy Physics Analysis using Hadoop

    NASA Astrophysics Data System (ADS)

    Bhimji, W.; Bristow, T.; Washbrook, A.

    2014-06-01

    We perform an LHC data analysis workflow using tools and data formats that are commonly used in the "Big Data" community outside High Energy Physics (HEP). These include Apache Avro for serialisation to binary files, Pig and Hadoop for mass data processing and Python Scikit-Learn for multi-variate analysis. Comparison is made with the same analysis performed with current HEP tools in ROOT.
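
    As a rough illustration of what a "multi-variate analysis with Python Scikit-Learn" step can look like (not the authors' actual analysis), the sketch below trains a boosted-decision-tree classifier on synthetic per-event features and reports a ROC AUC; all features, labels, and parameters are hypothetical.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 5000
    # Hypothetical per-event features (e.g., kinematic variables); labels mark signal vs. background.
    X = rng.normal(size=(n, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = GradientBoostingClassifier().fit(X_tr, y_tr)
    print("ROC AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
    ```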

  20. A corpus of full-text journal articles is a robust evaluation tool for revealing differences in performance of biomedical natural language processing tools

    PubMed Central

    2012-01-01

    Background We introduce the linguistic annotation of a corpus of 97 full-text biomedical publications, known as the Colorado Richly Annotated Full Text (CRAFT) corpus. We further assess the performance of existing tools for performing sentence splitting, tokenization, syntactic parsing, and named entity recognition on this corpus. Results Many biomedical natural language processing systems demonstrated large differences between their previously published results and their performance on the CRAFT corpus when tested with the publicly available models or rule sets. Trainable systems differed widely with respect to their ability to build high-performing models based on this data. Conclusions The finding that some systems were able to train high-performing models based on this corpus is additional evidence, beyond high inter-annotator agreement, that the quality of the CRAFT corpus is high. The overall poor performance of various systems indicates that considerable work needs to be done to enable natural language processing systems to work well when the input is full-text journal articles. The CRAFT corpus provides a valuable resource to the biomedical natural language processing community for evaluation and training of new models for biomedical full text publications. PMID:22901054

  1. Software on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Development Tools: view the list of tools for build automation, version control, and high-level or specialized scripting. Toolchains: learn about the available toolchains to build applications from source code.

  2. Evacuation performance evaluation tool.

    PubMed

    Farra, Sharon; Miller, Elaine T; Gneuhs, Matthew; Timm, Nathan; Li, Gengxin; Simon, Ashley; Brady, Whittney

    2016-01-01

    Hospitals conduct evacuation exercises to improve performance during emergency events. An essential aspect in this process is the creation of reliable and valid evaluation tools. The objective of this article is to describe the development and implications of a disaster evacuation performance tool that measures one portion of the very complex process of evacuation. Through the application of the Delphi technique and DeVellis's framework, disaster and neonatal experts provided input in developing this performance evaluation tool. Following development, content validity and reliability of this tool were assessed. Large pediatric hospital and medical center in the Midwest. The tool was pilot tested with an administrative, medical, and nursing leadership group and then implemented with a group of 68 healthcare workers during a disaster exercise of a neonatal intensive care unit (NICU). The tool has demonstrated high content validity with a scale validity index of 0.979 and inter-rater reliability G coefficient (0.984, 95% CI: 0.948-0.9952). The Delphi process based on the conceptual framework of DeVellis yielded a psychometrically sound evacuation performance evaluation tool for a NICU.
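
    One common convention for a scale-level content validity index (the averaging approach) is the mean of the item-level indices, each item index being the proportion of experts rating the item as relevant; whether this is exactly the index used in the study is not stated, so the sketch below only illustrates that convention with hypothetical ratings.

    ```python
    # Hypothetical expert relevance ratings: one row per item, one column per expert,
    # 1 = rated relevant, 0 = rated not relevant.
    ratings = [
        [1, 1, 1, 1, 1],
        [1, 1, 1, 0, 1],
        [1, 1, 1, 1, 1],
        [1, 0, 1, 1, 1],
    ]

    item_cvi = [sum(row) / len(row) for row in ratings]   # item-level content validity
    scale_cvi = sum(item_cvi) / len(item_cvi)             # scale-level index (averaging approach)
    print([round(v, 2) for v in item_cvi], round(scale_cvi, 3))
    ```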

  3. Using Performance Tools to Support Experiments in HPC Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian

    2014-01-01

    The high performance computing (HPC) community is working to address fault tolerance and resilience concerns for current and future large scale computing platforms. This is driving enhancements in the programming environments, specifically research on enhancing message passing libraries to support fault tolerant computing capabilities. The community has also recognized that tools for resilience experimentation are greatly lacking. However, we argue that there are several parallels between performance tools and resilience tools. As such, we believe the rich set of HPC performance-focused tools can be extended (repurposed) to benefit the resilience community. In this paper, we describe the initial motivation to leverage standard HPC performance analysis techniques to aid in developing diagnostic tools to assist fault tolerance experiments for HPC applications. These diagnosis procedures help to provide context for the system when the errors (failures) occurred. We describe our initial work in leveraging an MPI performance trace tool to assist in providing global context during fault injection experiments. Such tools will assist the HPC resilience community as they extend existing and new application codes to support fault tolerance.

  4. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing. The PRIMA Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Wolf, Felix G.

    2014-01-31

    The growing number of cores provided by today's high-end computing systems present substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to accomplish these objectives: (1) refactor TAU and Scalasca performance system components for core code sharing and (2) integrate TAU and Scalasca functionality through data interfaces, formats, and utilities. As presented in this report, the project has completed these goals. In addition to shared technical advances, the groups have worked to engage with users through application performance engineering and tools training. In this regard, the project benefits from the close interactions the teams have with national laboratories in the United States and Germany. We have also sought to enhance our interactions through joint tutorials and outreach. UO has become a member of the Virtual Institute of High-Productivity Supercomputing (VI-HPS) established by the Helmholtz Association of German Research Centres as a center of excellence, focusing on HPC tools for diagnosing programming errors and optimizing performance. UO and FZJ have conducted several VI-HPS training activities together within the past three years.

  5. Performance Refactoring of Instrumentation, Measurement, and Analysis Technologies for Petascale Computing: the PRIMA Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Wolf, Felix G.

    2014-01-31

    The growing number of cores provided by today's high-end computing systems present substantial challenges to application developers in their pursuit of parallel efficiency. To find the most effective optimization strategy, application developers need insight into the runtime behavior of their code. The University of Oregon (UO) and the Juelich Supercomputing Centre of Forschungszentrum Juelich (FZJ) develop the performance analysis tools TAU and Scalasca, respectively, which allow high-performance computing (HPC) users to collect and analyze relevant performance data – even at very large scales. TAU and Scalasca are considered among the most advanced parallel performance systems available, and are used extensively across HPC centers in the U.S., Germany, and around the world. The TAU and Scalasca groups share a heritage of parallel performance tool research and partnership throughout the past fifteen years. Indeed, the close interactions of the two groups resulted in a cross-fertilization of tool ideas and technologies that pushed TAU and Scalasca to what they are today. It also produced two performance systems with an increasing degree of functional overlap. While each tool has its specific analysis focus, the tools were implementing measurement infrastructures that were substantially similar. Because each tool provides complementary performance analysis, sharing of measurement results is valuable to provide the user with more facets to understand performance behavior. However, each measurement system was producing performance data in different formats, requiring data interoperability tools to be created. A common measurement and instrumentation system was needed to more closely integrate TAU and Scalasca and to avoid the duplication of development and maintenance effort. The PRIMA (Performance Refactoring of Instrumentation, Measurement, and Analysis) project was proposed over three years ago as a joint international effort between UO and FZJ to accomplish these objectives: (1) refactor TAU and Scalasca performance system components for core code sharing and (2) integrate TAU and Scalasca functionality through data interfaces, formats, and utilities. As presented in this report, the project has completed these goals. In addition to shared technical advances, the groups have worked to engage with users through application performance engineering and tools training. In this regard, the project benefits from the close interactions the teams have with national laboratories in the United States and Germany. We have also sought to enhance our interactions through joint tutorials and outreach. UO has become a member of the Virtual Institute of High-Productivity Supercomputing (VI-HPS) established by the Helmholtz Association of German Research Centres as a center of excellence, focusing on HPC tools for diagnosing programming errors and optimizing performance. UO and FZJ have conducted several VI-HPS training activities together within the past three years.

  6. A Queue Simulation Tool for a High Performance Scientific Computing Center

    NASA Technical Reports Server (NTRS)

    Spear, Carrie; McGalliard, James

    2007-01-01

    The NASA Center for Computational Sciences (NCCS) at the Goddard Space Flight Center provides high-performance, highly parallel processors, mass storage, and supporting infrastructure to a community of computational Earth and space scientists. Long-running (days) and highly parallel (hundreds of CPUs) jobs are common in the workload. NCCS management structures batch queues and allocates resources to optimize system use and prioritize workloads. NCCS technical staff use a locally developed discrete event simulation tool to model the impacts of evolving workloads, potential system upgrades, alternative queue structures and resource allocation policies.
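
    The NCCS tool itself is not described in detail, but the flavor of a discrete event simulation of a batch queue can be sketched in a few lines; the FIFO policy, job list, and CPU count below are illustrative assumptions only.

    ```python
    import heapq

    def simulate_fifo_queue(jobs, total_cpus):
        """Minimal discrete event simulation of a FIFO batch queue.

        jobs: list of (submit_time, cpus_requested, runtime) tuples.
        Returns the average wait time. Illustrative only."""
        jobs = sorted(jobs)                  # FIFO by submission time
        running = []                         # heap of (finish_time, cpus_in_use)
        free, clock, waits = total_cpus, 0.0, []
        for submit, cpus, runtime in jobs:
            clock = max(clock, submit)
            while free < cpus:               # advance time, releasing CPUs, until the job fits
                finish, c = heapq.heappop(running)
                clock = max(clock, finish)
                free += c
            waits.append(clock - submit)
            free -= cpus
            heapq.heappush(running, (clock + runtime, cpus))
        return sum(waits) / len(waits)

    # Example: three jobs on a hypothetical 128-CPU partition.
    print(simulate_fifo_queue([(0, 64, 10), (1, 96, 5), (2, 32, 2)], 128))
    ```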

  7. Performance Analysis of GYRO: A Tool Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worley, P.; Roth, P.; Candy, J.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  8. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Due to the lack of satellite architecture and frequent cloud cover issues, availability of daily high spatial resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework for a Geographic Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Later, two main frameworks are created, one to perform all the pre-processing steps on various satellite data and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or they can be combined to perform both processes in one go. This tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation of various remote sensing satellite data. The tool is developed as a common platform with a good interface that provides many functionalities to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  9. Knowledge-Acquisition Tool For Expert System

    NASA Technical Reports Server (NTRS)

    Disbrow, James D.; Duke, Eugene L.; Regenie, Victoria A.

    1988-01-01

    Digital flight-control systems monitored by computer program that evaluates and recommends. Flight-systems engineers for advanced, high-performance aircraft use knowledge-acquisition tool for expert-system flight-status monitor supplying interpretative data. Interpretative function especially important in time-critical, high-stress situations because it facilitates problem identification and corrective strategy. Conditions evaluated and recommendations made by ground-based engineers having essential knowledge for analysis and monitoring of performances of advanced aircraft systems.

  10. Self-Reacting Friction Stir Welding for Aluminum Alloy Circumferential Weld Applications

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Cantrell, Mark; Carter, Robert

    2003-01-01

    Friction stir welding is an innovative weld process that continues to grow in use, in the commercial, defense, and space sectors. It produces high quality and high strength welds in aluminum alloys. The process consists of a rotating weld pin tool that plasticizes material through friction. The plasticized material is welded by applying a high weld forge force through the weld pin tool against the material during pin tool rotation. The high weld forge force is reacted against an anvil and a stout tool structure. A variation of friction stir welding currently being evaluated is self-reacting friction stir welding. Self-reacting friction stir welding incorporates two opposing shoulders on the crown and root sides of the weld joint. In self-reacting friction stir welding, the weld forge force is reacted against the crown shoulder portion of the weld pin tool by the root shoulder. This eliminates the need for a stout tooling structure to react the high weld forge force required in the typical friction stir weld process. Therefore, the self-reacting feature reduces tooling requirements and, therefore, process implementation costs. This makes the process attractive for aluminum alloy circumferential weld applications. To evaluate the application of self-reacting friction stir welding for aluminum alloy circumferential welding, a feasibility study was performed. The study consisted of performing a fourteen-foot diameter aluminum alloy circumferential demonstration weld using typical fusion weld tooling. To accomplish the demonstration weld, weld and tack weld development were performed and fourteen-foot diameter rings were fabricated. Weld development consisted of weld pin tool selection and the generation of a process map and envelope. Tack weld development evaluated gas tungsten arc welding and friction stir welding for tack welding rings together for circumferential welding. As a result of the study, a successful circumferential demonstration weld was produced leading the way for future circumferential weld implementation.

  11. Wear behavior of carbide tool coated with Yttria-stabilized zirconia nano particles.

    NASA Astrophysics Data System (ADS)

    Jadhav, Pavandatta M.; Reddy, Narala Suresh Kumar

    2018-04-01

    Wear mechanisms play a predominant role in reducing tool life during machining of titanium alloys. Challenges such as chip variation, high pressure loads, and spring back are responsible for tool wear. In addition, many tool materials are unsuitable for this machining because their low thermal conductivity and volume-specific heat result in high cutting temperatures. To confront this issue, the electrostatic spray coating (ESC) technique is utilized to enhance tool life to an acceptable level. Yttria-stabilized zirconia (YSZ) acts as a thermal barrier coating with a high thermal expansion coefficient and high thermal shock resistance. This investigation focuses on the influence of a YSZ nanocoating on tungsten carbide tool material for improving the machinability of Ti-6Al-4V alloy. YSZ nano powder was coated on a tungsten carbide pin by using the ESC technique. The coatings were tested for wear and friction behavior by using a pin-on-disc tribological tester. The dry sliding wear test was performed with a titanium alloy (Ti-6Al-4V) disc and a YSZ-coated tungsten carbide pin at ambient atmosphere. Performance parameters such as wear rate and temperature rise were considered upon performing the dry sliding test on the Ti-6Al-4V alloy disc. The performance parameters were calculated using the coefficient of friction and frictional force values obtained from the pin-on-disc test. Substantial resistance to wear was achieved by the coating.

  12. Robot based deposition of WC-Co HVOF coatings on HSS cutting tools as a substitution for solid cemented carbide cutting tools

    NASA Astrophysics Data System (ADS)

    Tillmann, W.; Schaak, C.; Biermann, D.; Aßmuth, R.; Goeke, S.

    2017-03-01

    Cemented carbide (hard metal) cutting tools are the first choice to machine hard materials or to conduct high performance cutting processes. Main advantages of cemented carbide cutting tools are their high wear resistance (hardness) and good high temperature strength. In contrast, cemented carbide cutting tools are characterized by a low toughness and generate higher production costs, especially due to limited resources. Usually, cemented carbide cutting tools are produced by means of powder metallurgical processes. Compared to conventional manufacturing routes, these processes are more expensive and only a limited number of geometries can be realized. Furthermore, post-processing and preparing the cutting edges in order to achieve high performance tools is often required. In the present paper, an alternative method to substitute solid cemented carbide cutting tools is presented. Cutting tools made of conventional high speed steels (HSS) were coated with thick WC-Co (88/12) layers by means of thermal spraying (HVOF). The challenge is to obtain a dense, homogenous, and near-net-shape coating on the flanks and the cutting edge. For this purpose, different coating strategies were realized using an industrial robot. The coating properties were subsequently investigated. After this initial step, the surfaces of the cutting tools were ground and selected cutting edges were prepared by means of wet abrasive jet machining to achieve a smooth and round micro shape. Machining tests were conducted with these coated, ground and prepared cutting tools. The occurring wear phenomena were analyzed and compared to conventional HSS cutting tools. Overall, the results of the experiments proved that the coating withstands mechanical stresses during machining. In the conducted experiments, the coated cutting tools showed less wear than conventional HSS cutting tools. With respect to the initial wear resistance, additional benefits can be obtained by preparing the cutting edge by means of wet abrasive jet machining.

  13. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|Speedshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    Peta-scale computing environments pose significant challenges for both system and application developers and addressing them required more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining this understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provide the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match with a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which are reassembled from the tool building blocks into a flexible, multi-user interface set of tools. This set of tools targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|Speedshop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting edge systems. Work done under this project at Wisconsin can be divided into two categories, new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and MRNet scalability infrastructure.

  14. Tool simplifies machining of pipe ends for precision welding

    NASA Technical Reports Server (NTRS)

    Matus, S. T.

    1969-01-01

    Single tool prepares a pipe end for precision welding by simultaneously performing internal machining, end facing, and bevel cutting to specification standards. The machining operation requires only one milling adjustment, can be performed quickly, and produces the high quality pipe-end configurations required to ensure precision-welded joints.

  15. Invited review: Helping dairy farmers to improve economic performance utilizing data-driven decision support tools.

    PubMed

    Cabrera, V E

    2018-01-01

    The objective of this review paper is to describe the development and application of a suite of more than 40 computerized dairy farm decision support tools contained at the University of Wisconsin-Madison (UW) Dairy Management website http://DairyMGT.info. These data-driven decision support tools are aimed at helping dairy farmers improve their decision-making, environmental stewardship and economic performance. Dairy farm systems are highly dynamic: changing market conditions and prices, evolving policies and environmental restrictions, together with increasingly variable climate conditions, determine performance. Dairy farm systems are also highly integrated, with heavily interrelated components such as the dairy herd, soils, crops, weather and management. Under these premises, it is critical to evaluate a dairy farm following a dynamic integrated system approach. For this approach, it is crucial to use meaningful data records, which are increasingly available. These data records should be used within decision support tools for optimal decision-making and economic performance. Decision support tools on the UW Dairy Management website (http://DairyMGT.info) have been developed using a combination and adaptation of multiple methods together with empirical techniques, always with the primary goal for these tools to be: (1) highly user-friendly, (2) using the latest software and computer technologies, (3) farm and user specific, (4) grounded on the best scientific information available, (5) remaining relevant throughout time and (6) providing fast, concrete and simple answers to complex farmers' questions. DairyMGT.info is a translational innovative research website in various areas of dairy farm management that include nutrition, reproduction, calf and heifer management, replacement, price risk and environment. This paper discusses the development and application of 20 selected DairyMGT.info decision support tools.

  16. New Tools For Understanding Microbial Diversity Using High-throughput Sequence Data

    NASA Astrophysics Data System (ADS)

    Knight, R.; Hamady, M.; Liu, Z.; Lozupone, C.

    2007-12-01

    High-throughput sequencing techniques such as 454 are straining the limits of tools traditionally used to build trees, choose OTUs, and perform other essential sequencing tasks. We have developed a workflow for phylogenetic analysis of large-scale sequence data sets that combines existing tools, such as the Arb phylogeny package and the NAST multiple sequence alignment tool, with new methods for choosing and clustering OTUs and for performing phylogenetic community analysis with UniFrac. This talk discusses the cyberinfrastructure we are developing to support the human microbiome project, and the application of these workflows to analyze very large data sets that contrast the gut microbiota with a range of physical environments. These tools will ultimately help to define core and peripheral microbiomes in a range of environments, and will allow us to understand the physical and biotic factors that contribute most to differences in microbial diversity.

  17. Progress in the development and integration of fluid flow control tools in paper microfluidics.

    PubMed

    Fu, Elain; Downs, Corey

    2017-02-14

    Paper microfluidics is a rapidly growing subfield of microfluidics in which paper-like porous materials are used to create analytical devices. There is a need for higher performance field-use tests for many application domains including human disease diagnosis, environmental monitoring, and veterinary medicine. A key factor in creating high performance paper-based devices is the ability to manipulate fluid flow within the devices. This critical review is focused on the progress that has been made in (i) the development of fluid flow control tools and (ii) the integration of those tools into paper microfluidic devices. Further, we strive to be comprehensive in our presentation and provide historical context through discussion and performance comparisons, when possible, of both relevant earlier work and recent work. Finally, we discuss the major areas of focus for fluid flow methods development to advance the potential of paper microfluidics for high-performance field applications.

  18. ESH assessment of advanced lithography materials and processes

    NASA Astrophysics Data System (ADS)

    Worth, Walter F.; Mallela, Ram

    2004-05-01

    The ESH Technology group at International SEMATECH is conducting environment, safety, and health (ESH) assessments in collaboration with the lithography technologists evaluating the performance of an increasing number of new materials and technologies being considered for advanced lithography, such as 157 nm photoresist and extreme ultraviolet (EUV). By performing data searches for 75 critical data types, emissions characterizations, and industrial hygiene (IH) monitoring during the use of the resist candidates, it has been shown that the best performing resist formulations, so far, appear to be free of potential ESH concerns. The ESH assessment of the EUV lithography tool that is being developed for SEMATECH has identified several features of the tool that are of ESH concern: high energy consumption, poor energy conversion efficiency, tool complexity, potential ergonomic and safety interlock issues, use of high-powered laser(s), generation of ionizing radiation (soft X-rays), need for adequate shielding, and characterization of the debris formed by the extreme temperature of the plasma. By bringing these ESH challenges to the attention of the technologists and tool designers, it is hoped that the processes and tools can be made more ESH friendly.

  19. Performance comparison of SNP detection tools with illumina exome sequencing data—an assessment using both family pedigree information and sample-matched SNP array data

    PubMed Central

    Yi, Ming; Zhao, Yongmei; Jia, Li; He, Mei; Kebebew, Electron; Stephens, Robert M.

    2014-01-01

    To apply exome-seq-derived variants in the clinical setting, there is an urgent need to identify the best variant caller(s) from a large collection of available options. We have used an Illumina exome-seq dataset as a benchmark, with two validation scenarios—family pedigree information and SNP array data for the same samples, permitting global high-throughput cross-validation, to evaluate the quality of SNP calls derived from several popular variant discovery tools from both the open-source and commercial communities using a set of designated quality metrics. To the best of our knowledge, this is the first large-scale performance comparison of exome-seq variant discovery tools using high-throughput validation with both Mendelian inheritance checking and SNP array data, which allows us to gain insights into the accuracy of SNP calling through such high-throughput validation in an unprecedented way, whereas the previously reported comparison studies have only assessed concordance of these tools without directly assessing the quality of the derived SNPs. More importantly, the main purpose of our study was to establish a reusable procedure that applies high-throughput validation to compare the quality of SNP discovery tools with a focus on exome-seq, which can be used to compare any forthcoming tool(s) of interest. PMID:24831545
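
    Mendelian inheritance checking, as used for validation above, simply asks whether a child's genotype at each site can be formed from one allele of each parent; a minimal sketch of that consistency test, with hypothetical genotype encodings, follows.

    ```python
    def mendelian_consistent(child, mother, father):
        """True if the child's diploid genotype can be formed by taking one
        allele from the mother and one from the father."""
        a, b = child
        return (a in mother and b in father) or (b in mother and a in father)

    def mendelian_error_rate(trio_calls):
        """Fraction of sites whose (child, mother, father) genotypes violate inheritance."""
        errors = sum(1 for c, m, f in trio_calls if not mendelian_consistent(c, m, f))
        return errors / len(trio_calls)

    # Genotypes as allele pairs at single SNP sites (illustrative only).
    print(mendelian_consistent(("A", "G"), ("A", "A"), ("G", "G")))   # True
    print(mendelian_consistent(("G", "G"), ("A", "A"), ("A", "G")))   # False: mother has no G allele
    ```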

  20. Coaching as a Performance Improvement Tool at School

    ERIC Educational Resources Information Center

    Yirci, Ramazan; Karakose, Turgut; Kocabas, Ibrahim

    2016-01-01

    The purpose of this study is to examine the current literature and have an insight about coaching as a performance improvement tool at school. In today's world, schools have to survive and keep their organizational success in the highest level because of the high expectations from school stakeholders. Taking place in such a fierce competitive…

  1. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    NASA Astrophysics Data System (ADS)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether expected end-to-end performances are provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on the duration of packet processing, because a long processing duration causes performance degradation. This requires testers (persons who do the tests) to know precisely how long a packet is held by various network nodes. Without any tool's help, this task is time-consuming and error prone. Thus we propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. Such recorded packet headers enable testers to calculate such holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any drop.
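
    Given headers recorded with synchronized timestamps at two observation points, the holding duration inside a node is just the timestamp difference for matching packets; the sketch below illustrates that calculation with hypothetical capture records keyed by a packet identifier.

    ```python
    def holding_durations(ingress, egress):
        """Per-packet holding time inside a node, from two synchronized captures.

        ingress, egress: dicts mapping a packet key (e.g., source address plus a
        sequence number) to the capture timestamp in seconds. Only packets seen
        at both observation points are counted."""
        return {key: egress[key] - ts for key, ts in ingress.items() if key in egress}

    # Hypothetical captures at the node's input and output interfaces.
    before = {("10.0.0.1", 4321): 0.001000, ("10.0.0.1", 4322): 0.002000}
    after = {("10.0.0.1", 4321): 0.001850, ("10.0.0.1", 4322): 0.003400}

    durations = holding_durations(before, after)
    print({k: round(v * 1000, 3) for k, v in durations.items()})   # milliseconds
    ```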

  2. PC tools for project management: Programs and the state-of-the-practice

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.; Freedman, Glenn B.; Dede, Christopher J.; Lidwell, William; Learned, David

    1990-01-01

    The use of microcomputer tools for NASA project management; which features are the most useful; the impact of these tools on job performance and individual style; and the prospects for new features in project management tools and related tools are addressed. High-, mid-, and low-end PM tools are examined. The pros and cons of the tools are assessed relative to various tasks. The strengths and weaknesses of the tools are presented through cases and demonstrations.

  3. WinHPC System Software | High-Performance Computing | NREL

    Science.gov Websites

    Learn about the software applications, tools, and toolchains available on the WinHPC system, including the Intel compilers and toolchain suite.

  4. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Learn about software tools for developing and managing software at the source code level on the Peregrine system. The "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.

  5. Validation of an explanatory tool for data-fused displays for high-technology future aircraft

    NASA Astrophysics Data System (ADS)

    Fletcher, Georgina C. L.; Shanks, Craig R.; Selcon, Stephen J.

    1996-05-01

    As the number of sensor and data sources in the military cockpit increases, pilots will suffer high levels of workload which could result in reduced performance and the loss of situational awareness. A DRA research program has been investigating the use of data-fused displays in decision support and has developed and laboratory-tested an explanatory tool for displaying information in air combat scenarios. The tool has been designed to provide pictorial explanations of data that maintain situational awareness by involving the pilot in the hostile aircraft threat assessment task. This paper reports a study carried out to validate the success of the explanatory tool in a realistic flight simulation facility. Aircrew were asked to perform a threat assessment task, either with or without the explanatory tool providing information in the form of missile launch success zone envelopes, while concurrently flying a waypoint course within set flight parameters. The results showed that there was a significant improvement (p less than 0.01) in threat assessment accuracy of 30% when using the explanatory tool. This threat assessment performance advantage was achieved without a trade-off with flying task performance. Situational awareness measures showed no general differences between the explanatory and control conditions, but significant learning effects suggested that the explanatory tool makes the task initially more intuitive and hence less demanding on the pilots' attentional resources. The paper concludes that DRA's data-fused explanatory tool is successful at improving threat assessment accuracy in a realistic simulated flying environment, and briefly discusses the requirements for further research in the area.

  6. Multidisciplinary Shape Optimization of a Composite Blended Wing Body Aircraft

    NASA Astrophysics Data System (ADS)

    Boozer, Charles Maxwell

    A multidisciplinary shape optimization tool coupling aerodynamics, structure, and performance was developed for battery-powered aircraft. Utilizing high-fidelity computational fluid dynamics analysis tools and a structural wing weight tool, coupled based on the multidisciplinary feasible optimization architecture, aircraft geometry is modified in the optimization of the aircraft's range or endurance. The developed tool is applied to three geometries: a hybrid blended wing body delta wing UAS, the ONERA M6 wing, and a modified ONERA M6 wing. First, the optimization problem is presented with the objective function, constraints, and design vector. Next, the tool's architecture and the analysis tools that are utilized are described. Finally, various optimizations are described and their results analyzed for all test subjects. Results show that less computationally expensive inviscid optimizations yield positive performance improvements using planform, airfoil, and three-dimensional degrees of freedom. From the results obtained through a series of optimizations, it is concluded that the newly developed tool is both effective at improving performance and serves as a platform ready to receive additional performance modules, further improving its computational design support potential.

  7. Open|SpeedShop: An Open Source Infrastructure for Parallel Performance Analysis

    DOE PAGES

    Schulz, Martin; Galarowicz, Jim; Maghrak, Don; ...

    2008-01-01

    Over the last decades a large number of performance tools have been developed to analyze and optimize high performance applications. Their acceptance by end users, however, has been slow: each tool alone is often limited in scope and comes with widely varying interfaces and workflow constraints, requiring different changes in the often complex build and execution infrastructure of the target application. We started the Open|SpeedShop project about 3 years ago to overcome these limitations and provide efficient, easy to apply, and integrated performance analysis for parallel systems. Open|SpeedShop has two different faces: it provides an interoperable tool set covering the most common analysis steps as well as a comprehensive plugin infrastructure for building new tools. In both cases, the tools can be deployed to large scale parallel applications using DPCL/Dyninst for distributed binary instrumentation. Further, all tools developed within or on top of Open|SpeedShop are accessible through multiple fully equivalent interfaces, including an easy-to-use GUI as well as an interactive command line interface, reducing the usage threshold for those tools.

  8. Programming Tools: Status, Evaluation, and Comparison

    NASA Technical Reports Server (NTRS)

    Cheng, Doreen Y.; Cooper, D. M. (Technical Monitor)

    1994-01-01

    In this tutorial I will first describe the characteristics of scientific applications and their developers, and describe the computing environment in a typical high-performance computing center. I will define the user requirements for tools that support application portability and present the difficulties in satisfying them. These form the basis of the evaluation and comparison of the tools. I will then describe the tools available in the market and the tools available in the public domain. Specifically, I will describe the tools for converting sequential programs, tools for developing portable new programs, tools for debugging and performance tuning, tools for partitioning and mapping, and tools for managing networks of resources. I will introduce the main goals and approaches of the tools, and show the main features of a few tools in each category. Meanwhile, I will compare tool usability for real-world application development and compare their different technological approaches. Finally, I will indicate the future directions of the tools in each category.

  9. Performance Under Stress Conditions During Multidisciplinary Team Immersive Pediatric Simulations.

    PubMed

    Ghazali, Daniel Aiham; Darmian-Rafei, Ivan; Ragot, Stéphanie; Oriot, Denis

    2018-06-01

    The primary objective was to determine whether technical and nontechnical performances were in some way correlated during immersive simulation. Performance was measured among French Emergency Medical Service workers at an individual and a team level. Secondary objectives were to assess stress response through collection of physiologic markers (salivary cortisol, heart rate, the proportion derived by dividing the number of interval differences of successive normal-to-normal intervals > 50 ms by the total number of normal-to-normal intervals [pNN50], low- and high-frequency ratio) and affective data (self-reported stress, confidence, and dissatisfaction), and to correlate them to performance scores. Prospective observational study performed as part of a larger randomized controlled trial. Medical simulation laboratory. Forty-eight participants distributed among 12 Emergency Medical System teams. Individual and team performance measures and individual stress response were assessed during a high-fidelity simulation. Technical performance was assessed by the intraosseous access performance scale and the Team Average Performance Assessment Scale; nontechnical performance by the Behavioral Assessment Tool for leaders, and the Clinical Teamwork Scale. Stress markers (salivary cortisol, heart rate, pNN50, low- and high-frequency ratio) were measured both before (T1) and after the session (T2). Participants self-reported stress before and during the simulation, self-confidence, and perception of dissatisfaction with team performance, rated on a scale from 0 to 10. Scores (out of 100 total points, mean ± SD) were: intraosseous access = 65.6 ± 14.4, Team Average Performance Assessment Scale = 44.6 ± 18.1, Behavioral Assessment Tool = 49.5 ± 22.0, Clinical Teamwork Scale = 50.3 ± 18.5. There was a strong correlation between Behavioral Assessment Tool and Clinical Teamwork Scale (Rho = 0.97; p = 0.001), and Behavioral Assessment Tool and Team Average Performance Assessment Scale (Rho = 0.73; p = 0.02). From T1 to T2, all stress markers (salivary cortisol, heart rate, pNN50, and low- and high-frequency ratio) displayed an increase in stress level (p < 0.001 for all). Self-confidence was positively correlated with performance (Clinical Teamwork Scale: Rho = 0.47; p = 0.001, Team Average Performance Assessment Scale: Rho = 0.46; p = 0.001). Dissatisfaction was negatively correlated with performance (Rho = -0.49; p = 0.0008 with Behavioral Assessment Tool, Rho = -0.47; p = 0.001 with Clinical Teamwork Scale, Rho = -0.51; p = 0.0004 with Team Average Performance Assessment Scale). No correlation between stress response and performance was found. There was a positive correlation between leader (Behavioral Assessment Tool) and team (Clinical Teamwork Scale and Team Average Performance Assessment Scale) performances. These performance scores were positively correlated with self-confidence and negatively correlated with dissatisfaction.
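
    Because the record above defines pNN50 only in words, a small sketch of that calculation may help; the interval values below are invented, and the denominator follows the abstract's wording (the total number of normal-to-normal intervals).

        import numpy as np

        def pnn50(nn_intervals_ms):
            # Count successive NN-interval differences larger than 50 ms and divide
            # by the total number of NN intervals, per the definition quoted above
            # (some authors divide by the number of differences instead).
            diffs = np.abs(np.diff(nn_intervals_ms))
            return 100.0 * np.count_nonzero(diffs > 50) / len(nn_intervals_ms)

        print(pnn50([812, 790, 845, 802, 799, 760, 815]))  # hypothetical beats, in ms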

  10. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  11. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  12. High-Speed Edge Trimming of CFRP and Online Monitoring of Performance of Router Tools Using Acoustic Emission

    PubMed Central

    Prakash, Rangasamy; Krishnaraj, Vijayan; Zitoune, Redouane; Sheikh-Ahmad, Jamal

    2016-01-01

    Carbon fiber reinforced polymers (CFRPs) have found wide-ranging applications in numerous industrial fields such as aerospace, automotive, and shipping industries due to their excellent mechanical properties that lead to enhanced functional performance. In this paper, an experimental study on edge trimming of CFRP was done with various cutting conditions and different geometry of tools such as helical-, fluted-, and burr-type tools. The investigation involves the measurement of cutting forces for the different machining conditions and its effect on the surface quality of the trimmed edges. The modern cutting tools (router tools or burr tools) selected for machining CFRPs have complex geometries in cutting edges and surfaces, and therefore a traditional method of direct tool wear evaluation is not applicable. Acoustic emission (AE) sensing was employed for on-line monitoring of the performance of router tools to determine the relationship between AE signal and length of machining for different kinds of geometry of tools. The investigation showed that the router tool with a flat cutting edge has better performance by generating lower cutting force and better surface finish with no delamination on trimmed edges. The mathematical modeling for the prediction of cutting forces was also done using Artificial Neural Network and Regression Analysis. PMID:28773919
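
    The abstract notes, without detail, that cutting forces were modeled with an artificial neural network and regression analysis; the sketch below only shows the general shape of such a surrogate model in scikit-learn, using fabricated (spindle speed, feed rate) data rather than the authors' model or measurements.

        # Illustrative surrogate: cutting force (N) as a function of spindle speed (rpm)
        # and feed rate (mm/min). All numbers are made up for the example.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        X = np.array([[6000, 500], [6000, 1000], [9000, 500],
                      [9000, 1000], [12000, 500], [12000, 1000]], dtype=float)
        y = np.array([42.0, 61.0, 35.0, 52.0, 30.0, 46.0])

        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                           random_state=0))
        model.fit(X, y)
        print(model.predict([[10000, 750]]))  # force predicted for an unseen condition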

  13. Early experiences in developing and managing the neuroscience gateway.

    PubMed

    Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas T

    2015-02-01

    The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools, used in this research field, are also implemented for parallel computers and suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on these machines located at national supercomputer centers, dealing with the complex user interfaces of these machines, and dealing with data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use this for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway.

  14. Early experiences in developing and managing the neuroscience gateway

    PubMed Central

    Sivagnanam, Subhashini; Majumdar, Amit; Yoshimoto, Kenneth; Astakhov, Vadim; Bandrowski, Anita; Martone, MaryAnn; Carnevale, Nicholas. T.

    2015-01-01

    SUMMARY The last few decades have seen the emergence of computational neuroscience as a mature field where researchers are interested in modeling complex and large neuronal systems and require access to high performance computing machines and associated cyber infrastructure to manage computational workflow and data. The neuronal simulation tools, used in this research field, are also implemented for parallel computers and suitable for high performance computing machines. But using these tools on complex high performance computing machines remains a challenge because of issues with acquiring computer time on these machines located at national supercomputer centers, dealing with the complex user interfaces of these machines, and dealing with data management and retrieval. The Neuroscience Gateway is being developed to alleviate and/or hide these barriers to entry for computational neuroscientists. It hides or eliminates, from the point of view of the users, all the administrative and technical barriers and makes parallel neuronal simulation tools easily available and accessible on complex high performance computing machines. It handles the running of jobs and data management and retrieval. This paper shares the early experiences in bringing up this gateway and describes the software architecture it is based on, how it is implemented, and how users can use this for computational neuroscience research using high performance computing at the back end. We also look at parallel scaling of some publicly available neuronal models and analyze the recent usage data of the neuroscience gateway. PMID:26523124

  15. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  16. Building a Natural Language Processing Tool to Identify Patients With High Clinical Suspicion for Kawasaki Disease from Emergency Department Notes.

    PubMed

    Doan, Son; Maehara, Cleo K; Chaparro, Juan D; Lu, Sisi; Liu, Ruiling; Graham, Amanda; Berry, Erika; Hsu, Chun-Nan; Kanegaye, John T; Lloyd, David D; Ohno-Machado, Lucila; Burns, Jane C; Tremoulet, Adriana H

    2016-05-01

    Delayed diagnosis of Kawasaki disease (KD) may lead to serious cardiac complications. We sought to create and test the performance of a natural language processing (NLP) tool, the KD-NLP, in the identification of emergency department (ED) patients for whom the diagnosis of KD should be considered. We developed an NLP tool that recognizes the KD diagnostic criteria based on standard clinical terms and medical word usage using 22 pediatric ED notes augmented by Unified Medical Language System vocabulary. With high suspicion for KD defined as fever and three or more KD clinical signs, KD-NLP was applied to 253 ED notes from children ultimately diagnosed with either KD or another febrile illness. We evaluated KD-NLP performance against ED notes manually reviewed by clinicians and compared the results to a simple keyword search. KD-NLP identified high-suspicion patients with a sensitivity of 93.6% and specificity of 77.5% compared to notes manually reviewed by clinicians. The tool outperformed a simple keyword search (sensitivity = 41.0%; specificity = 76.3%). KD-NLP showed comparable performance to clinician manual chart review for identification of pediatric ED patients with a high suspicion for KD. This tool could be incorporated into the ED electronic health record system to alert providers to consider the diagnosis of KD. KD-NLP could serve as a model for decision support for other conditions in the ED. © 2016 by the Society for Academic Emergency Medicine.
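
    The record states that "high suspicion" was operationalized as fever plus three or more KD clinical signs; the helper below is a hypothetical restatement of that decision rule, with the sign names taken from the standard KD criteria, and it is not part of KD-NLP itself.

        # Hypothetical rule: flag a note as high suspicion for Kawasaki disease when
        # fever is documented together with three or more of the five classic signs.
        KD_SIGNS = ("conjunctival injection", "oral mucosal changes", "rash",
                    "extremity changes", "cervical lymphadenopathy")

        def high_suspicion(extracted):
            # `extracted` maps concept names to booleans produced by an upstream
            # NLP step (the mapping format is an assumption, not KD-NLP's output).
            sign_count = sum(extracted.get(sign, False) for sign in KD_SIGNS)
            return extracted.get("fever", False) and sign_count >= 3

        print(high_suspicion({"fever": True, "rash": True,
                              "extremity changes": True,
                              "cervical lymphadenopathy": True}))  # True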

  17. SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology

    NASA Technical Reports Server (NTRS)

    Zoladz, Thomas; Mitchell, William; Lunde, Kevin

    2010-01-01

    Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in our national inducer impeller design infrastructure. While both low and high pressure turbopump failures/anomaly resolution efforts spurred some of these investments, the SSME program was a major benefactor of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time accurate and cavitating inducer impeller computational fluid dynamics (CFD) tools. These investments together have led to a better understanding of the complex internal flow fields within aggressive high performing inducers and impellers. New design tools and methodologies have evolved which intend to provide confident blade designs which strike an appropriate balance between performance and self-induced load management.

  18. Leveraging business intelligence to make better decisions: Part I.

    PubMed

    Reimers, Mona

    2014-01-01

    Data is the new currency. Business intelligence tools will provide better performing practices with a competitive intelligence advantage that will separate the high performers from the rest of the pack. Given the investments of time and money into our data systems, practice leaders must work to take every advantage and look at the datasets as a potential goldmine of business intelligence decision tools. A fresh look at decision tools created from practice data will create efficiencies and improve effectiveness for end-users and managers.

  19. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes

    PubMed Central

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D.; Day, Michele E.; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort. PMID:28269947

  20. Ensembles of NLP Tools for Data Element Extraction from Clinical Notes.

    PubMed

    Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D; Day, Michele E; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan

    2016-01-01

    Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort.
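
    As a rough illustration of combining several NLP tools into an ensemble, the sketch below keeps a concept when a majority of (hypothetical) tools extract it; the voting threshold and the data layout are assumptions and do not reproduce the seven ensemble methods used in the study.

        from collections import Counter

        def majority_vote(tool_outputs, threshold=None):
            # tool_outputs: one set of extracted concepts per NLP tool. Keep a concept
            # if at least `threshold` tools produced it (default: strict majority).
            if threshold is None:
                threshold = len(tool_outputs) // 2 + 1
            counts = Counter(c for output in tool_outputs for c in set(output))
            return {concept for concept, n in counts.items() if n >= threshold}

        outputs = [{"fever", "rash"}, {"fever", "cough"}, {"fever", "rash", "nausea"}]
        print(majority_vote(outputs))  # {'fever', 'rash'}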

  1. High-Speed TCP Testing

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Gassman, Holly; Beering, Dave R.; Welch, Arun; Hoder, Douglas J.; Ivancic, William D.

    1999-01-01

    Transmission Control Protocol (TCP) is the underlying protocol used within the Internet for reliable information transfer. As such, there is great interest to have all implementations of TCP efficiently interoperate. This is particularly important for links exhibiting long bandwidth-delay products. The tools exist to perform TCP analysis at low rates and low delays. However, for extremely high-rate and long-delay links such as 622 Mbps over geosynchronous satellites, new tools and testing techniques are required. This paper describes the tools and techniques used to analyze and debug various TCP implementations over high-speed, long-delay links.
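
    A quick bandwidth-delay-product calculation shows why a 622 Mbps geosynchronous-satellite link stresses TCP; the ~550 ms round-trip time below is a typical GEO figure assumed for illustration, not a number taken from the paper.

        # Bandwidth-delay product: bytes that must be "in flight" to keep the pipe full.
        link_rate_bps = 622e6   # 622 Mbps link
        rtt_s = 0.550           # assumed geosynchronous round-trip time (~550 ms)

        bdp_bytes = link_rate_bps * rtt_s / 8
        print(f"bandwidth-delay product: {bdp_bytes / 1e6:.1f} MB")        # ~42.8 MB
        print(f"a default 64 KB window fills {64e3 / bdp_bytes:.2%} of the pipe")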

  2. High energy PIXE: A tool to characterize multi-layer thick samples

    NASA Astrophysics Data System (ADS)

    Subercaze, A.; Koumeir, C.; Métivier, V.; Servagent, N.; Guertin, A.; Haddad, F.

    2018-02-01

    High energy PIXE is a useful and non-destructive tool to characterize multi-layer thick samples such as cultural heritage objects. In a previous work, we demonstrated the possibility to perform quantitative analysis of simple multi-layer samples using high energy PIXE, without any assumption on their composition. In this work an in-depth study of the parameters involved in the method previously published is proposed. Its extension to more complex samples with a repeated layer is also presented. Experiments have been performed at the ARRONAX cyclotron using 68 MeV protons. The thicknesses and sequences of a multi-layer sample including two different layers of the same element have been determined. Performances and limits of this method are presented and discussed.

  3. Micro-optical fabrication by ultraprecision diamond machining and precision molding

    NASA Astrophysics Data System (ADS)

    Li, Hui; Li, Likai; Naples, Neil J.; Roblee, Jeffrey W.; Yi, Allen Y.

    2017-06-01

    Ultraprecision diamond machining and high volume molding for affordable high precision high performance optical elements are becoming a viable process in the optical industry for low cost high quality microoptical component manufacturing. In this process, first high precision microoptical molds are fabricated using ultraprecision single point diamond machining followed by high volume production methods such as compression or injection molding. In the last two decades, there have been steady improvements in ultraprecision machine design and performance, particularly with the introduction of both slow tool and fast tool servo. Today optical molds, including freeform surfaces and microlens arrays, are routinely diamond machined to final finish without post machining polishing. For consumers, compression molding or injection molding provide efficient and high quality optics at extremely low cost. In this paper, first ultraprecision machine design and machining processes such as slow tool and fast tool servo are described, then both compression molding and injection molding of polymer optics are discussed. To implement precision optical manufacturing by molding, numerical modeling can be included in the future as a critical part of the manufacturing process to ensure high product quality.

  4. The Impact of Computer Simulations as Interactive Demonstration Tools on the Performance of Grade 11 Learners in Electromagnetism

    ERIC Educational Resources Information Center

    Kotoka, Jonas; Kriek, Jeanne

    2014-01-01

    The impact of computer simulations on the performance of 65 grade 11 learners in electromagnetism in a South African high school in the Mpumalanga province is investigated. Learners did not use the simulations individually, but teachers used them as an interactive demonstration tool. Basic concepts in electromagnetism are difficult to understand…

  5. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405

  6. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    PubMed

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
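
    Since both records describe PAEA only at a high level, the sketch below shows how principal angles between a low-dimensional expression subspace and a gene-set subspace can be computed with SciPy; the matrices are random placeholders and the scoring is a simplification, not the published PAEA algorithm.

        import numpy as np
        from scipy.linalg import subspace_angles

        rng = np.random.default_rng(0)
        n_genes = 500
        # Placeholder subspace spanned by three differential-expression directions.
        expression_directions = rng.standard_normal((n_genes, 3))
        # Placeholder indicator vector for a gene set containing 40 genes.
        gene_set = np.zeros((n_genes, 1))
        gene_set[rng.choice(n_genes, size=40, replace=False)] = 1.0

        # Smallest principal angle: near 0 means the gene set lies inside the
        # expression subspace; near pi/2 means it is orthogonal to it.
        angles = subspace_angles(expression_directions, gene_set)
        print(np.degrees(angles.min()))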

  7. Measuring Medical Housestaff Teamwork Performance Using Multiple Direct Observation Instruments: Comparing Apples and Apples.

    PubMed

    Weingart, Saul N; Yaghi, Omar; Wetherell, Matthew; Sweeney, Megan

    2018-04-10

    To examine the composition and concordance of existing instruments used to assess medical teams' performance. A trained observer joined 20 internal medicine housestaff teams for morning work rounds at Tufts Medical Center, a 415-bed Boston teaching hospital, from October through December 2015. The observer rated each team's performance using 9 teamwork observation instruments that examined domains including team structure, leadership, situation monitoring, mutual support, and communication. Observations recorded on paper forms were stored electronically. Scores were normalized from 1 (low) to 5 (high) to account for different rating scales. Overall mean scores were calculated and graphed; weighted scores adjusted for the number of items in each teamwork domain. Teamwork scores were analyzed using t-tests, pair-wise correlations, and the Kruskal-Wallis statistic, and team performance was compared across instruments by domain. The 9 tools incorporated 5 major domains, with 5-35 items per instrument for a total of 161 items per observation session. In weighted and unweighted analyses, the overall teamwork performance score for a given team on a given day varied by instrument. While all of the tools identified the same low outlier, high performers on some instruments were low performers on others. Inconsistent scores for a given team across instruments persisted in domain-level analyses. There was substantial variation in the rating of individual teams assessed concurrently by a single observer using multiple instruments. Since existing teamwork observation tools do not yield concordant assessments, researchers should create better tools for measuring teamwork performance.
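
    The record notes that instrument scores were normalized onto a common 1 (low) to 5 (high) range to account for different rating scales; a minimal sketch of that rescaling, with invented instrument ranges and raw scores, is shown below.

        def rescale_to_1_5(score, scale_min, scale_max):
            # Linearly map a raw instrument score onto the common 1-5 range.
            return 1 + 4 * (score - scale_min) / (scale_max - scale_min)

        # Invented examples: one instrument scored 0-10, another scored 1-7.
        print(rescale_to_1_5(7.5, 0, 10))  # 4.0
        print(rescale_to_1_5(4.0, 1, 7))   # 3.0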

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan, Benjamin A.

    We report on the use and design of a portable, extensible performance data collection tool motivated by modeling needs of the high performance computing systems co-design community. The lightweight performance data collectors with Eiger support are intended to be a tailorable tool, not a shrink-wrapped library product, as profiling needs vary widely. A single code markup scheme is reported which, based on compilation flags, can send performance data from parallel applications to CSV files, to an Eiger mysql database, or (in a non-database environment) to flat files for later merging and loading on a host with mysql available. The tool supports C, C++, and Fortran applications.

  9. Development and testing of the cancer multidisciplinary team meeting observational tool (MDT-MOT)

    PubMed Central

    Harris, Jenny; Taylor, Cath; Sevdalis, Nick; Jalil, Rozh; Green, James S.A.

    2016-01-01

    Objective: To develop a tool for independent observational assessment of cancer multidisciplinary team meetings (MDMs), and test criterion validity, inter-rater reliability/agreement and describe performance. Design: Clinicians and experts in teamwork used a mixed-methods approach to develop and refine the tool. Study 1 observers rated pre-determined optimal/sub-optimal MDM film excerpts and Study 2 observers independently rated video-recordings of 10 MDMs. Setting: Study 2 included 10 cancer MDMs in England. Participants: Testing was undertaken by 13 health service staff and a clinical and non-clinical observer. Intervention: None. Main Outcome Measures: Tool development, validity, reliability/agreement and variability in MDT performance. Results: Study 1: Observers were able to discriminate between optimal and sub-optimal MDM performance (P ≤ 0.05). Study 2: Inter-rater reliability was good for 3/10 domains. Percentage of absolute agreement was high (≥80%) for 4/10 domains and percentage agreement within 1 point was high for 9/10 domains. Four MDTs performed well (scored 3+ in at least 8/10 domains), 5 MDTs performed well in 6–7 domains and 1 MDT performed well in only 4 domains. Leadership and chairing of the meeting, the organization and administration of the meeting, and clinical decision-making processes all varied significantly between MDMs (P ≤ 0.01). Conclusions: MDT-MOT demonstrated good criterion validity. Agreement between clinical and non-clinical observers (within one point on the scale) was high but this was inconsistent with reliability coefficients and warrants further investigation. If further validated MDT-MOT might provide a useful mechanism for the routine assessment of MDMs by the local workforce to drive improvements in MDT performance. PMID:27084499

  10. Development and testing of the cancer multidisciplinary team meeting observational tool (MDT-MOT).

    PubMed

    Harris, Jenny; Taylor, Cath; Sevdalis, Nick; Jalil, Rozh; Green, James S A

    2016-06-01

    To develop a tool for independent observational assessment of cancer multidisciplinary team meetings (MDMs), and test criterion validity, inter-rater reliability/agreement and describe performance. Clinicians and experts in teamwork used a mixed-methods approach to develop and refine the tool. Study 1 observers rated pre-determined optimal/sub-optimal MDM film excerpts and Study 2 observers independently rated video-recordings of 10 MDMs. Study 2 included 10 cancer MDMs in England. Testing was undertaken by 13 health service staff and a clinical and non-clinical observer. None. Tool development, validity, reliability/agreement and variability in MDT performance. Study 1: Observers were able to discriminate between optimal and sub-optimal MDM performance (P ≤ 0.05). Study 2: Inter-rater reliability was good for 3/10 domains. Percentage of absolute agreement was high (≥80%) for 4/10 domains and percentage agreement within 1 point was high for 9/10 domains. Four MDTs performed well (scored 3+ in at least 8/10 domains), 5 MDTs performed well in 6-7 domains and 1 MDT performed well in only 4 domains. Leadership and chairing of the meeting, the organization and administration of the meeting, and clinical decision-making processes all varied significantly between MDMs (P ≤ 0.01). MDT-MOT demonstrated good criterion validity. Agreement between clinical and non-clinical observers (within one point on the scale) was high but this was inconsistent with reliability coefficients and warrants further investigation. If further validated MDT-MOT might provide a useful mechanism for the routine assessment of MDMs by the local workforce to drive improvements in MDT performance. © The Author 2016. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  11. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies. These technologies are the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  12. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  13. Validation of the tool assessment of clinical education (AssCE): A study using Delphi method and clinical experts.

    PubMed

    Löfmark, Anna; Mårtensson, Gunilla

    2017-03-01

    The aim of the present study was to establish the validity of the tool Assessment of Clinical Education (AssCE). The tool is widely used in Sweden and some Nordic countries for assessing nursing students' performance in clinical education. It is important that the tools in use be subjected to regular audit and critical reviews. The validation process, performed in two stages, was concluded with a high level of congruence. In the first stage, Delphi technique was used to elaborate the AssCE tool using a group of 35 clinical nurse lecturers. After three rounds, we reached consensus. In the second stage, a group of 46 clinical nurse lecturers representing 12 universities in Sweden and Norway audited the revised version of the AssCE in relation to learning outcomes from the last clinical course at their respective institutions. Validation of the revised AssCE was established with high congruence between the factors in the AssCE and examined learning outcomes. The revised AssCE tool seems to meet its objective to be a validated assessment tool for use in clinical nursing education. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. The development and testing of a skin tear risk assessment tool.

    PubMed

    Newall, Nelly; Lewin, Gill F; Bulsara, Max K; Carville, Keryln J; Leslie, Gavin D; Roberts, Pam A

    2017-02-01

    The aim of the present study is to develop a reliable and valid skin tear risk assessment tool. The six characteristics identified in a previous case control study as constituting the best risk model for skin tear development were used to construct a risk assessment tool. The ability of the tool to predict skin tear development was then tested in a prospective study. Between August 2012 and September 2013, 1466 tertiary hospital patients were assessed at admission and followed up for 10 days to see if they developed a skin tear. The predictive validity of the tool was assessed using receiver operating characteristic (ROC) analysis. When the tool was found not to have performed as well as hoped, secondary analyses were performed to determine whether a potentially better performing risk model could be identified. The tool was found to have high sensitivity but low specificity and therefore have inadequate predictive validity. Secondary analysis of the combined data from this and the previous case control study identified an alternative better performing risk model. The tool developed and tested in this study was found to have inadequate predictive validity. The predictive validity of an alternative, more parsimonious model now needs to be tested. © 2015 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  15. Long distance high power optical laser fiber break detection and continuity monitoring systems and methods

    DOEpatents

    Rinzler, Charles C.; Gray, William C.; Faircloth, Brian O.; Zediker, Mark S.

    2016-02-23

    A monitoring and detection system for use on high power laser systems, long distance high power laser systems and tools for performing high power laser operations. In particular, the monitoring and detection systems provide break detection and continuity protection for performing high power laser operations on, and in, remote and difficult to access locations.

  16. Situation Awareness and Workload Measures for SAFOR

    NASA Technical Reports Server (NTRS)

    DeMaio, Joe; Hart, Sandra G.; Allen, Ed (Technical Monitor)

    1999-01-01

    The present research was conducted in support of the NASA Safe All-Weather Flight Operations for Rotorcraft (SAFOR) program. The purpose of the work was to investigate the utility of two measurement tools developed by the British Defense Evaluation Research Agency. These tools were a subjective workload assessment scale, the DRA Workload Scale (DRAWS), and a situation awareness measurement tool in which the crew's self-evaluation of performance is compared against actual performance. These two measurement tools were evaluated in the context of a test of an innovative approach to alerting the crew by way of a helmet mounted display. The DRAWS was found to be usable, but it offered no advantages over extant scales, and it had only limited resolution. The performance self-evaluation metric of situation awareness was found to be highly effective.

  17. A rotational ablation tool for calcified atherosclerotic plaque removal.

    PubMed

    Kim, Min-Hyeng; Kim, Hyung-Jung; Kim, Nicholas N; Yoon, Hae-Sung; Ahn, Sung-Hoon

    2011-12-01

    Atherosclerosis is a major cardiovascular disease involving accumulations of lipids, white blood cells, and other materials on the inside of artery walls. Since the calcification found in the advanced stage of atherosclerosis dramatically enhances the mechanical properties of the plaque, restoring the original lumen of the artery remains a challenge. High-speed rotational atherectomy, when performed with an ablating grinder to remove the plaque, produces much better results in the treatment of calcified plaque compared to other methods. However, the high-speed rotation of the Rotablator commercial rotational atherectomy device produces microcavitation, which should be avoided because of the serious complications it can cause. This research involves the development of a high-speed rotational ablation tool that does not generate microcavitation. It relies on surface modification to achieve the required surface roughness. The surface roughness of the tool for differential cutting was designed based on lubrication theory, and the surface of the tool was modified using Nd:YAG laser beam engraving. Electron microscope images and profiles indicated that the engraved surface of the tool had approximately 1 μm of root mean square surface roughness. The ablation experiment was performed on hydroxyapatite/polylactide composite with an elastic modulus similar to that of calcified plaque. In addition, differential cutting was verified on silicone rubber with an elastic modulus similar to that of a normal artery. The tool performance and reliability were evaluated by measuring the ablation force exerted, the size of the debris generated during ablation, and through visual inspection of the silicone rubber surface.

  18. High-performance scientific computing in the cloud

    NASA Astrophysics Data System (ADS)

    Jorissen, Kevin; Vila, Fernando; Rehr, John

    2011-03-01

    Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.

  19. Beyond Jeopardy and Lectures: Using "Microsoft PowerPoint" as a Game Design Tool to Teach Science

    ERIC Educational Resources Information Center

    Siko, Jason; Barbour, Michael; Toker, Sacip

    2011-01-01

    To date, research involving homemade PowerPoint games as an instructional tool has not shown statistically significant gains in student performance. This paper examines the results of a study comparing the performance of students in a high school chemistry course who created homemade PowerPoint games as a test review with the students who used a…

  20. RdTools: An Open Source Python Library for PV Degradation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deceglie, Michael G; Jordan, Dirk; Nag, Ambarish

    RdTools is a set of Python tools for analysis of photovoltaic data. In particular, PV production data is evaluated over several years to obtain rates of performance degradation over time. RdTools can handle both high-frequency (hourly or better) and low-frequency (daily, weekly, etc.) datasets. Best results are obtained with higher frequency data.
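
    The year-on-year style of degradation estimate that RdTools reports can be illustrated with plain pandas: compare each day's normalized output with the value one year earlier and summarize the distribution of ratios. The sketch below is a simplified stand-in rather than the RdTools API, and the synthetic series embeds an assumed -0.5%/yr trend.

        import numpy as np
        import pandas as pd

        # Synthetic daily normalized PV performance with a -0.5 %/yr trend plus noise.
        idx = pd.date_range("2015-01-01", periods=4 * 365, freq="D")
        rng = np.random.default_rng(1)
        perf = pd.Series((1 - 0.005 * np.arange(len(idx)) / 365)
                         * (1 + 0.01 * rng.standard_normal(len(idx))), index=idx)

        # Year-on-year ratios: each point divided by the value 365 days earlier.
        yoy_ratio = perf / perf.shift(365)
        rd_percent_per_year = (yoy_ratio.median() - 1) * 100
        print(f"estimated degradation rate: {rd_percent_per_year:.2f} %/yr")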

  1. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov Websites

    These tools help battery designers, developers, and manufacturers create affordable, high-performance lithium-ion (Li-ion) batteries for next-generation electric-drive vehicles (EDVs).

  2. A tool for assessment of heart failure prescribing quality: A systematic review and meta-analysis.

    PubMed

    El Hadidi, Seif; Darweesh, Ebtissam; Byrne, Stephen; Bermingham, Margaret

    2018-04-16

    Heart failure (HF) guidelines aim to standardise patient care. Internationally, prescribing practice in HF may deviate from guidelines and so a standardised tool is required to assess prescribing quality. A systematic review and meta-analysis were performed to identify a quantitative tool for measuring adherence to HF guidelines and its clinical implications. Eleven electronic databases were searched to include studies reporting a comprehensive tool for measuring adherence to prescribing guidelines in HF patients aged ≥18 years. Qualitative studies or studies measuring prescription rates alone were excluded. Study quality was assessed using the Good ReseArch for Comparative Effectiveness Checklist. In total, 2455 studies were identified. Sixteen eligible full-text articles were included (n = 14 354 patients, mean age 69 ± 8 y). The Guideline Adherence Index (GAI), and its modified versions, was the most frequently cited tool (n = 13). Other tools identified were the Individualised Reconciled Evidence Recommendations, the Composite Heart Failure Performance, and the Heart Failure Scale. The meta-analysis included the GAI studies of good to high quality. The average GAI-3 was 62%. Compared to low GAI, high GAI patients had lower mortality rate (7.6% vs 33.9%) and lower rehospitalisation rates (23.5% vs 24.5%); both P ≤ .05. High GAI was associated with reduced risk of mortality (hazard ratio = 0.29, 95% confidence interval, 0.06-0.51) and rehospitalisation (hazard ratio = 0.64, 95% confidence interval, 0.41-1.00). No tool was used to improve prescribing quality. The GAI is the most frequently used tool to assess guideline adherence in HF. High GAI is associated with improved HF outcomes. Copyright © 2018 John Wiley & Sons, Ltd.

  3. MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.

    PubMed

    Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd

    2018-07-01

    Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.

  4. [Intelligent systems tools in the diagnosis of acute coronary syndromes: A systemic review].

    PubMed

    Sprockel, John; Tejeda, Miguel; Yate, José; Diaztagle, Juan; González, Enrique

    2017-03-27

    Acute myocardial infarction is the leading cause of non-communicable deaths worldwide. Its diagnosis is a highly complex task, for which modelling through automated methods has been attempted. A systematic review of the literature was performed on diagnostic tests that applied intelligent systems tools in the diagnosis of acute coronary syndromes. A systematic review of the literature is presented using Medline, Embase, Scopus, IEEE/IET Electronic Library, ISI Web of Science, Latindex and LILACS databases for articles that include the diagnostic evaluation of acute coronary syndromes using intelligent systems. The review process was conducted independently by 2 reviewers, and discrepancies were resolved through the participation of a third person. The operational characteristics of the studied tools were extracted. A total of 35 references met the inclusion criteria. In 22 (62.8%) cases, neural networks were used. In five studies, the performances of several intelligent systems tools were compared. Thirteen studies sought to perform diagnoses of all acute coronary syndromes, and in 22, only infarctions were studied. In 21 cases, clinical and electrocardiographic aspects were used as input data, and in 10, only electrocardiographic data were used. Most intelligent systems use the clinical context as a reference standard. High rates of diagnostic accuracy were found with better performance using neural networks and support vector machines, compared with statistical tools of pattern recognition and decision trees. Extensive evidence was found that shows that using intelligent systems tools achieves a greater degree of accuracy than some clinical algorithms or scales and, thus, should be considered appropriate tools for supporting diagnostic decisions of acute coronary syndromes. Copyright © 2017 Instituto Nacional de Cardiología Ignacio Chávez. Publicado por Masson Doyma México S.A. All rights reserved.

  5. Novel molecular diagnostic tools for malaria elimination: a review of options from the point of view of high-throughput and applicability in resource limited settings.

    PubMed

    Britton, Sumudu; Cheng, Qin; McCarthy, James S

    2016-02-16

    As malaria transmission continues to decrease, an increasing number of countries will enter pre-elimination and elimination. To interrupt transmission, changes in control strategies are likely to require more accurate identification of all carriers of Plasmodium parasites, both symptomatic and asymptomatic, using diagnostic tools that are highly sensitive, high throughput and with fast turnaround times preferably performed in local health service settings. Currently available immunochromatographic lateral flow rapid diagnostic tests and field microscopy are unlikely to consistently detect infections at parasite densities less than 100 parasites/µL making them insufficiently sensitive for detecting all carriers. Molecular diagnostic platforms, such as PCR and LAMP, are currently available in reference laboratories, but at a cost both financially and in turnaround time. This review describes the recent progress in developing molecular diagnostic tools in terms of their capacity for high throughput and potential for performance in non-reference laboratories for malaria elimination.

  6. GeneSCF: a real-time based functional enrichment tool with support for multiple organisms.

    PubMed

    Subhash, Santhilal; Kanduri, Chandrasekhar

    2016-09-13

    High-throughput technologies such as ChIP-sequencing, RNA-sequencing, DNA sequencing and quantitative metabolomics generate a huge volume of data. Researchers often rely on functional enrichment tools to interpret the biological significance of the affected genes from these high-throughput studies. However, currently available functional enrichment tools need to be updated frequently to adapt to new entries from the functional database repositories. Hence there is a need for a simplified tool that can perform functional enrichment analysis by using updated information directly from the source databases such as KEGG, Reactome or Gene Ontology etc. In this study, we focused on designing a command-line tool called GeneSCF (Gene Set Clustering based on Functional annotations), that can predict the functionally relevant biological information for a set of genes in a real-time updated manner. It is designed to handle information from more than 4000 organisms from freely available prominent functional databases like KEGG, Reactome and Gene Ontology. We successfully employed our tool on two published datasets to predict the biologically relevant functional information. The core features of this tool were tested on Linux machines without the need for installation of more dependencies. GeneSCF is more reliable compared to other enrichment tools because of its ability to use reference functional databases in real-time to perform enrichment analysis. It is an easy-to-integrate tool with other pipelines available for downstream analysis of high-throughput data. More importantly, GeneSCF can run multiple gene lists simultaneously on different organisms thereby saving time for the users. Since the tool is designed to be ready-to-use, there is no need for any complex compilation and installation procedures.
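
    Functional enrichment of the kind GeneSCF performs is commonly based on an overlap test between a study gene list and a pathway's gene set against a background; the sketch below shows that generic calculation with a Fisher's exact test and made-up gene identifiers, and does not reproduce GeneSCF's own implementation or command-line interface.

        from scipy.stats import fisher_exact

        def enrichment_p(study_genes, pathway_genes, background_size):
            # One-sided Fisher's exact test for over-representation of a pathway
            # in a study gene list (all inputs here are illustrative).
            study, pathway = set(study_genes), set(pathway_genes)
            overlap = len(study & pathway)
            table = [[overlap, len(study) - overlap],
                     [len(pathway) - overlap,
                      background_size - len(study) - len(pathway) + overlap]]
            _, p_value = fisher_exact(table, alternative="greater")
            return p_value

        print(enrichment_p({"TP53", "BRCA1", "ATM", "CHEK2"},
                           {"TP53", "ATM", "CHEK2", "MDM2", "RAD51"},
                           background_size=20000))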

  7. Team performance in resuscitation teams: Comparison and critique of two recently developed scoring tools

    PubMed Central

    McKay, Anthony; Walker, Susanna T.; Brett, Stephen J.; Vincent, Charles; Sevdalis, Nick

    2012-01-01

    Background and aim: Following high profile errors resulting in patient harm and attracting negative publicity, the healthcare sector has begun to focus on training non-technical teamworking skills as one way of reducing the rate of adverse events. Within the area of resuscitation, two tools have been developed recently aiming to assess these skills: TEAM and OSCAR. The aims of the study reported here were (1) to determine the inter-rater reliability of the tools in assessing performance within the context of resuscitation; (2) to correlate scores of the same resuscitation team episodes using both tools, thereby determining their concurrent validity within the context of resuscitation; and (3) to carry out a critique of both tools and establish how best each one may be utilised. Methods: The study consisted of two phases: reliability assessment; and content comparison and correlation. Assessments were made by two resuscitation experts, who watched 24 pre-recorded resuscitation simulations and independently rated team behaviours using both tools. The tools were critically appraised, and correlation between overall score surrogates was assessed. Results: Both OSCAR and TEAM achieved high levels of inter-rater reliability (in the form of adequate intra-class coefficients), with only minor significant differences on Wilcoxon tests. Comparison of the scores from both tools demonstrated a high degree of correlation (and hence concurrent validity). Finally, critique of each tool highlighted differences in length and complexity. Conclusion: Both OSCAR and TEAM can be used to assess resuscitation teams in a simulated environment, with the tools correlating well with one another. We envisage a role for both tools, with TEAM giving a quick, global assessment of the team, but OSCAR enabling a more detailed breakdown of the assessment, facilitating feedback, and identifying areas of weakness for future training. PMID:22561464

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunshah, R.F.; Shabaik, A.H.

    The process of Activated Reactive Evaporation is used to synthesize superhard materials like carbides, oxides, nitrides and ultrafine grain cermets. The deposits are characterized by hardness, microstructure, microprobe analysis for chemistry, and lattice parameter measurements. The synthesis and characterization of TiC-Ni cermets and Al2O3 are given. High speed steel tools coated with TiC, TiC-Ni and TaC are tested for machining performance at different speeds and feeds. The machining evaluation and the selection of coatings are based on the rate of deterioration of the coating, tool temperature, and cutting forces. Tool life tests show coated high speed steel tools having a 150 to 300% improvement in tool life compared to uncoated tools. Variability in the quality of the ground edge on high speed steel inserts produces a great scatter in the machining evaluation data.

  9. The Impact of a Freshman Academy on Science Performance of First-Time Ninth-Grade Students at One Georgia High School

    ERIC Educational Resources Information Center

    Daniel, Vivian Summerour

    2011-01-01

    The purpose of this within-group experimental study was to find out to what extent ninth-grade students improved their science performance beyond their middle school science performance at one Georgia high school utilizing a freshman academy model. Freshman academies have been recognized as a useful tool for increasing academic performance among…

  10. Use of Continuous Integration Tools for Application Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vergara Larrea, Veronica G; Joubert, Wayne; Fuson, Christopher B

    High performance computing systems are becoming increasingly complex, both in node architecture and in the multiple layers of software stack required to compile and run applications. As a consequence, the likelihood is increasing for application performance regressions to occur as a result of routine upgrades of system software components which interact in complex ways. The purpose of this study is to evaluate the effectiveness of continuous integration tools for application performance monitoring on HPC systems. In addition, this paper also describes a prototype system for application performance monitoring based on Jenkins, a Java-based continuous integration tool. The monitoring system described leverages several features in Jenkins to track application performance results over time. Preliminary results and lessons learned from monitoring applications on Cray systems at the Oak Ridge Leadership Computing Facility are presented.
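    The essence of such a continuous-integration-based monitor is a post-run step that compares the latest benchmark result against its history and fails the job on a regression. The sketch below is a minimal version of that step, assuming a hypothetical benchmark_history.json file and a 10% slowdown threshold; it is not the Jenkins prototype described in the paper.

      # Minimal regression check a CI performance monitor might run after a job;
      # the history file and threshold are hypothetical.
      import json
      import statistics
      import sys

      HISTORY_FILE = "benchmark_history.json"   # list of past runtimes, in seconds
      THRESHOLD = 1.10                          # flag runs more than 10% slower than baseline

      def check(latest_runtime):
          with open(HISTORY_FILE) as f:
              history = json.load(f)
          baseline = statistics.mean(history)
          if latest_runtime > THRESHOLD * baseline:
              print(f"REGRESSION: {latest_runtime:.1f}s vs baseline {baseline:.1f}s")
              return 1                          # nonzero exit fails the CI job
          history.append(latest_runtime)
          with open(HISTORY_FILE, "w") as f:
              json.dump(history, f)
          return 0

      if __name__ == "__main__":
          sys.exit(check(float(sys.argv[1])))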

  11. Interactive Tools for Measuring Visual Scanning Performance and Reaction Time

    PubMed Central

    Seeanner, Julia; Hennessy, Sarah; Manganelli, Joseph; Crisler, Matthew; Rosopa, Patrick; Jenkins, Casey; Anderson, Michael; Drouin, Nathalie; Belle, Leah; Truesdail, Constance; Tanner, Stephanie

    2017-01-01

    Occupational therapists are constantly searching for engaging, high-technology interactive tasks that provide immediate feedback to evaluate and train clients with visual scanning deficits. This study examined the relationship between two tools: the VISION COACH™ interactive light board and the Functional Object Detection© (FOD) Advanced driving simulator scenario. Fifty-four healthy drivers, ages 21–66 yr, were divided into three age groups. Participants performed braking response and visual target (E) detection tasks of the FOD Advanced driving scenario, followed by two sets of three trials using the VISION COACH Full Field 60 task. Results showed no significant effect of age on FOD Advanced performance but a significant effect of age on VISION COACH performance. Correlations showed that participants’ performance on both braking and E detection tasks were significantly positively correlated with performance on the VISION COACH (.37 < r < .40, p < .01). These tools provide new options for therapists. PMID:28218598

  12. High-performance computing — an overview

    NASA Astrophysics Data System (ADS)

    Marksteiner, Peter

    1996-08-01

    An overview of high-performance computing (HPC) is given. Different types of computer architectures used in HPC are discussed: vector supercomputers, high-performance RISC processors, various parallel computers like symmetric multiprocessors, workstation clusters, massively parallel processors. Software tools and programming techniques used in HPC are reviewed: vectorizing compilers, optimization and vector tuning, optimization for RISC processors; parallel programming techniques like shared-memory parallelism, message passing and data parallelism; and numerical libraries.

  13. Integrating Cache Performance Modeling and Tuning Support in Parallelization Tools

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    With the resurgence of distributed shared memory (DSM) systems based on cache-coherent Non Uniform Memory Access (ccNUMA) architectures and the increasing disparity between memory and processor speeds, data locality overheads are becoming the greatest bottleneck in the way of realizing the potential high performance of these systems. While parallelization tools and compilers help users port their sequential applications to a DSM system, a lot of time and effort is needed to tune the memory performance of these applications to achieve reasonable speedup. In this paper, we show that integrating cache performance modeling and tuning support within a parallelization environment can alleviate this problem. The Cache Performance Modeling and Prediction Tool (CPMP) employs trace-driven simulation techniques without the overhead of generating and managing detailed address traces. CPMP predicts the cache performance impact of source code level "what-if" modifications in a program to assist a user in the tuning process. CPMP is built on top of a customized version of the Computer Aided Parallelization Tools (CAPTools) environment. Finally, we demonstrate how CPMP can be applied to tune a real Computational Fluid Dynamics (CFD) application.
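    To illustrate the kind of "what-if" question trace-driven cache simulation answers, the toy model below estimates hit rates for a direct-mapped cache under two loop orderings of the same array traversal. It is not CPMP itself; the cache geometry and access trace are invented.

      # Toy direct-mapped cache model; not CPMP, parameters are illustrative.
      def hit_rate(addresses, cache_lines=256, line_bytes=64):
          tags = [None] * cache_lines
          hits = 0
          for addr in addresses:
              block = addr // line_bytes
              idx = block % cache_lines
              if tags[idx] == block:
                  hits += 1
              else:
                  tags[idx] = block          # miss: fill the line
          return hits / len(addresses)

      # "What-if": row-major vs column-major traversal of a 512x512 array of doubles.
      n = 512
      row_major = [(i * n + j) * 8 for i in range(n) for j in range(n)]
      col_major = [(i * n + j) * 8 for j in range(n) for i in range(n)]
      print(hit_rate(row_major), hit_rate(col_major))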

  14. Installing and Setting Up the Git Software Tool on OS X | High-Performance

    Science.gov Websites

    Computing | NREL Learn how to install the Git software tool on OS X for use with the Peregrine system. Binary Installer for OS X - Easiest! You can download the latest version of git from http://git-scm.com

  15. Mixed Methods Design Study Investigating the Use of a Music Authentic Performance Assessment Tool by High School Band Directors to Measure Student Musical Growth

    ERIC Educational Resources Information Center

    Beason, Christine F.

    2017-01-01

    This research project was designed to determine if the Model Cornerstone Assessment for Performance, Proficient level, published by the National Association for Music Education would be an appropriate tool to use to demonstrate student growth as one element of teacher evaluations, specifically the T-TESS. This study focused on four main research…

  16. Objective Situation Awareness Measurement Based on Performance Self-Evaluation

    NASA Technical Reports Server (NTRS)

    DeMaio, Joe

    1998-01-01

    The research was conducted in support of the NASA Safe All-Weather Flight Operations for Rotorcraft (SAFOR) program. The purpose of the work was to investigate the utility of two measurement tools developed by the British Defence Evaluation and Research Agency. These tools were a subjective workload assessment scale, the DRA Workload Scale, and a situation awareness measurement tool. The situation awareness tool uses a comparison of the crew's self-evaluation of performance against actual performance in order to determine what information the crew attended to during the performance. These two measurement tools were evaluated in the context of a test of an innovative approach to alerting the crew by way of a helmet-mounted display. The situation assessment data are reported here. The performance self-evaluation metric of situation awareness was found to be highly effective. It was used to evaluate situation awareness on a tank reconnaissance task, a tactical navigation task, and a stylized task used to evaluate handling qualities. Using the self-evaluation metric, it was possible to evaluate situation awareness without exact knowledge of the relevant information in some cases, and to identify information to which the crew attended or failed to attend in others.

  17. Research Results Of Stress-Strain State Of Cutting Tool When Aviation Materials Turning

    NASA Astrophysics Data System (ADS)

    Serebrennikova, A. G.; Nikolaeva, E. P.; Savilov, A. V.; Timofeev, S. A.; Pyatykh, A. S.

    2018-01-01

    Titanium alloys and stainless steels are among the hardest materials to machine. The state of the cutting edge of a turning tool after machining titanium and high-strength aluminium alloys and corrosion-resistant high-alloy steel has been studied. Cutting forces and chip contact areas with the rake surface of the cutter have been measured. The relationship between cutting forces and residual stresses is shown. Relations between cutting forces, residual stresses, and the cutting tool rake angle were obtained. Measurements of residual stresses were performed by X-ray diffraction.

  18. Analyzing Reliability and Performance Trade-Offs of HLS-Based Designs in SRAM-Based FPGAs Under Soft Errors

    NASA Astrophysics Data System (ADS)

    Tambara, Lucas Antunes; Tonfat, Jorge; Santos, André; Kastensmidt, Fernanda Lima; Medina, Nilberto H.; Added, Nemitala; Aguiar, Vitor A. P.; Aguirre, Fernando; Silveira, Marcilei A. G.

    2017-02-01

    The increasing system complexity of FPGA-based hardware designs and the shortening of time-to-market have motivated the adoption of new design methodologies focused on addressing the current need for high-performance circuits. High-Level Synthesis (HLS) tools can generate Register Transfer Level (RTL) designs from high-level software programming languages. These tools have evolved significantly in recent years, providing optimized RTL designs, which can serve the needs of safety-critical applications that require both high performance and high reliability levels. However, a reliability evaluation of HLS-based designs under soft errors has not yet been presented. In this work, the trade-offs of different HLS-based designs in terms of reliability, resource utilization, and performance are investigated by analyzing their behavior under soft errors and comparing them to a standard processor-based implementation in an SRAM-based FPGA. Results obtained from fault injection campaigns and radiation experiments show that it is possible to increase the performance of a processor-based system by up to 5,000 times by changing its architecture, with a small impact on the cross section (an increase of up to 8 times), while still increasing the Mean Workload Between Failures (MWBF) of the system.
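    A back-of-the-envelope reading of those two figures, under the assumption (not stated in the paper) that the failure rate scales linearly with cross section at a constant particle flux, is sketched below: MWBF grows roughly as the performance gain divided by the cross-section growth.

      # Rough illustration only; assumes failure rate ~ cross section x constant flux.
      speedup = 5000            # reported performance gain of the HLS design
      cross_section_growth = 8  # reported worst-case cross-section increase
      mwbf_gain = speedup / cross_section_growth
      print(f"MWBF improvement on the order of {mwbf_gain:.0f}x")   # ~625x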

  19. Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calyam, Prasad

    2014-09-15

    The next generation of high-performance networks being developed in DOE communities is critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide "network awareness" to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault-diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective based sampling and that adheres to any domain-specific measurement policies.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zediker, Mark S.; Rinzler, Charles C.; Faircloth, Brian O.

    There is provided a system and apparatus for the transmission of high power laser energy over great distances without substantial power loss and without the presence of stimulated Raman scattering. There are further provided systems, optical fiber cable configurations, and optical fiber structures for delivering high power laser energy over great distances to a tool or surface to perform an operation or work with the tool or upon the surface.

  1. Supportive supervision and constructive relationships with healthcare workers support CHW performance: Use of a qualitative framework to evaluate CHW programming in Uganda.

    PubMed

    Ludwick, Teralynn; Turyakira, Eleanor; Kyomuhangi, Teddy; Manalili, Kimberly; Robinson, Sheila; Brenner, Jennifer L

    2018-02-13

    While evidence supports community health worker (CHW) capacity to improve maternal and newborn health in less-resourced countries, key implementation gaps remain. Tools for assessing CHW performance and evidence on what programmatic components affect performance are lacking. This study developed and tested a qualitative evaluative framework and tool to assess CHW team performance in a district program in rural Uganda. A new assessment framework was developed to collect and analyze qualitative evidence based on CHW perspectives on seven program components associated with effectiveness (selection; training; community embeddedness; peer support; supportive supervision; relationship with other healthcare workers; retention and incentive structures). Focus groups were conducted with four high/medium-performing CHW teams and four low-performing CHW teams selected through random, stratified sampling. Content analysis involved organizing focus group transcripts according to the seven program effectiveness components, and assigning scores to each component per focus group. Four components, 'supportive supervision', 'good relationships with other healthcare workers', 'peer support', and 'retention and incentive structures', received the lowest overall scores. Variances in scores between 'high'/'medium'- and 'low'-performing CHW teams were largest for 'supportive supervision' and 'good relationships with other healthcare workers.' Our analysis suggests that in the Bushenyi intervention context, CHW team performance is highly correlated with the quality of supervision and relationships with other healthcare workers. CHWs identified key performance-related issues of absentee supervisors, referral system challenges, and lack of engagement/respect by health workers. Other less-correlated program components warrant further study and may have been impacted by relatively consistent program implementation within our limited study area. Process-oriented measurement tools are needed to better understand CHW performance-related factors and build a supportive environment for CHW program effectiveness and sustainability. Findings from a qualitative, multi-component tool developed and applied in this study suggest that factors related to (1) supportive supervision and (2) relationships with other healthcare workers may be strongly associated with variances in performance outcomes within a program. Careful consideration of supervisory structure and health worker orientation during program implementation is among the strategies proposed to increase CHW performance.

  2. The Case for High-Performance, Healthy Green Schools

    ERIC Educational Resources Information Center

    Carter, Leesa

    2011-01-01

    When trying to reach their sustainability goals, schools and school districts often run into obstacles, including financing, training, and implementation tools. Last fall, the U.S. Green Building Council-Georgia (USGBC-Georgia) launched its High Performance, Healthy Schools (HPHS) Program to help Georgia schools overcome those obstacles. By…

  3. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers (and, with the advent of ubiquitous multicore processor systems, on practically every system), has been accomplished with basic software tools: typically, command-line based compilers, debuggers, and performance tools that have not changed substantially from the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as openMP and MPI) to be able to take full advantage of high performance computers with an increasing core count per shared memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improving Eclipse PTP. We are using a set of scientific applications, each with a variety of challenges, both to drive further improvements to the scientific applications themselves and to understand shortcomings in Eclipse PTP from an application developer perspective, which informs the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into computational science and engineering codes. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse development community. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage the use of modern software engineering practices, as enabled through the Eclipse IDE, with computational science and engineering applications. These practices include proper use of source code repositories, tracking and rectifying issues, measuring and monitoring code performance changes against both optimizations and ever-changing software stacks and configurations on HPC systems, and ultimately encouraging development and maintenance of testing suites, practices that have become commonplace in many software endeavors but have lagged in the development of science applications. We view that the challenge posed by the increased complexity of both HPC systems and science applications demands the use of better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as we evolve the HPC landscape.

  4. An Approximate Ablative Thermal Protection System Sizing Tool for Entry System Design

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2005-01-01

    A computer tool to perform entry vehicle ablative thermal protection systems sizing has been developed. Two options for calculating the thermal response are incorporated into the tool. One, an industry-standard, high-fidelity ablation and thermal response program was integrated into the tool, making use of simulated trajectory data to calculate its boundary conditions at the ablating surface. Second, an approximate method that uses heat of ablation data to estimate heat shield recession during entry has been coupled to a one-dimensional finite-difference calculation that calculates the in-depth thermal response. The in-depth solution accounts for material decomposition, but does not account for pyrolysis gas energy absorption through the material. Engineering correlations are used to estimate stagnation point convective and radiative heating as a function of time. The sizing tool calculates recovery enthalpy, wall enthalpy, surface pressure, and heat transfer coefficient. Verification of this tool is performed by comparison to past thermal protection system sizings for the Mars Pathfinder and Stardust entry systems and calculations are performed for an Apollo capsule entering the atmosphere at lunar and Mars return speeds.

  5. An Approximate Ablative Thermal Protection System Sizing Tool for Entry System Design

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2006-01-01

    A computer tool to perform entry vehicle ablative thermal protection systems sizing has been developed. Two options for calculating the thermal response are incorporated into the tool. One, an industry-standard, high-fidelity ablation and thermal response program was integrated into the tool, making use of simulated trajectory data to calculate its boundary conditions at the ablating surface. Second, an approximate method that uses heat of ablation data to estimate heat shield recession during entry has been coupled to a one-dimensional finite-difference calculation that calculates the in-depth thermal response. The in-depth solution accounts for material decomposition, but does not account for pyrolysis gas energy absorption through the material. Engineering correlations are used to estimate stagnation point convective and radiative heating as a function of time. The sizing tool calculates recovery enthalpy, wall enthalpy, surface pressure, and heat transfer coefficient. Verification of this tool is performed by comparison to past thermal protection system sizings for the Mars Pathfinder and Stardust entry systems and calculations are performed for an Apollo capsule entering the atmosphere at lunar and Mars return speeds.
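    The in-depth portion of the approximate method amounts to one-dimensional transient heat conduction solved by finite differences. The sketch below shows a bare explicit scheme with a prescribed surface heat flux; the material properties and heating are invented, and the decomposition and recession modeling of the actual tool are omitted.

      # Minimal explicit 1-D finite-difference conduction sketch; properties are
      # illustrative, and ablation/decomposition are deliberately omitted.
      import numpy as np

      k, rho, cp = 0.5, 280.0, 1500.0       # W/m-K, kg/m^3, J/kg-K (illustrative)
      alpha = k / (rho * cp)
      L, nx = 0.05, 101                     # 5 cm of material, 101 nodes
      dx = L / (nx - 1)
      dt = 0.4 * dx**2 / alpha              # below the explicit stability limit of 0.5
      q_wall = 5.0e5                        # W/m^2 applied at the heated face
      T = np.full(nx, 300.0)                # initial temperature, K

      for _ in range(int(60.0 / dt)):       # march 60 s of heating
          Tn = T.copy()
          # interior nodes: dT/dt = alpha * d2T/dx2
          T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
          # heated face: coarse energy balance on the first cell
          T[0] = Tn[0] + dt / (rho * cp * dx) * (q_wall - k * (Tn[0] - Tn[1]) / dx)
          T[-1] = T[-2]                     # adiabatic back face
      print(f"front {T[0]:.0f} K, back {T[-1]:.0f} K")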

  6. Parallelization of NAS Benchmarks for Shared Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    This paper presents our experiences of parallelizing the sequential implementation of NAS benchmarks using compiler directives on SGI Origin2000 distributed shared memory (DSM) system. Porting existing applications to new high performance parallel and distributed computing platforms is a challenging task. Ideally, a user develops a sequential version of the application, leaving the task of porting to new generations of high performance computing systems to parallelization tools and compilers. Due to the simplicity of programming shared-memory multiprocessors, compiler developers have provided various facilities to allow the users to exploit parallelism. Native compilers on SGI Origin2000 support multiprocessing directives to allow users to exploit loop-level parallelism in their programs. Additionally, supporting tools can accomplish this process automatically and present the results of parallelization to the users. We experimented with these compiler directives and supporting tools by parallelizing sequential implementation of NAS benchmarks. Results reported in this paper indicate that with minimal effort, the performance gain is comparable with the hand-parallelized, carefully optimized, message-passing implementations of the same benchmarks.

  7. Capstone: A Geometry-Centric Platform to Enable Physics-Based Simulation and Design of Systems

    DTIC Science & Technology

    2015-10-05

    foundation for the air-vehicle early design tool DaVinci being developed by the CREATE(TM)-AV project to enable development of associative models of air... CREATE(TM)-AV solvers Kestrel [11] and Helios [16,17]. Furthermore, it is the foundation for CREATE(TM)-AV's DaVinci [9] tool that provides a... Tools and Environments (CREATE(TM)) program [6] aimed at developing a suite of high-performance physics-based computational tools addressing the needs

  8. Short-term forecasting tools for agricultural nutrient management

    USDA-ARS?s Scientific Manuscript database

    The advent of real time/short term farm management tools is motivated by the need to protect water quality above and beyond the general guidance offered by existing nutrient management plans. Advances in high performance computing and hydrologic/climate modeling have enabled rapid dissemination of ...

  9. IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.

    This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems including physics-based end-use and distributed generation models (many instances of GridLAB-D(TM)). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price responsive load scenarios.

  10. Optimization of a micro-scale, high throughput process development tool and the demonstration of comparable process performance and product quality with biopharmaceutical manufacturing processes.

    PubMed

    Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J

    2017-07-14

    In this paper, we discuss the optimization and implementation of a high throughput process development (HTPD) tool that utilizes commercially available micro-liter sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench-scale and clinical manufacturing scale. Further, all product quality attributes measured are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), comparable product quality results at all scales make this tool an appropriate scale model to enable purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  11. Development of a measurement and feedback training tool for the arm strokes of high-performance luge athletes.

    PubMed

    Lembert, Sandra; Schachner, Otto; Raschner, Christian

    2011-12-01

    Previous studies have shown that the start plays a critical role in sliding events and explains more than 55% of the variance of the final time in luge. Experts evaluate the contribution of the arm strokes to be 23% of the total starting performance. The aim of the present study was to develop a measurement and feedback training tool (Speedpaddler) for the arm strokes of high-performance luge athletes. The construction is an aluminium alloy framework with a customary belt conveyor system, which is driven by two synchronized servo motors. Training is possible with constant speeds up to 12 m·s⁻¹ or with several speed curves, which simulate the acceleration of different luge tracks. The construction facilitates variations in the inclination and speed of the conveyor belts and thereby the resistance and movement speed. If the athlete accelerates the conveyor belts during arm-paddling, the torque of the motors decreases. Torque measurements and high-speed video offer valuable insights into several technique criteria. Comparisons of arm-paddle cycle durations on ice and on the Speedpaddler with 18 luge athletes (national team and juniors) showed no statistical differences. The Speedpaddler might be a useful tool to improve starting performance all year round.

  12. Study on electroplating technology of diamond tools for machining hard and brittle materials

    NASA Astrophysics Data System (ADS)

    Cui, Ying; Chen, Jian Hua; Sun, Li Peng; Wang, Yue

    2016-10-01

    With the development of high speed cutting, ultra-precision machining and ultrasonic vibration techniques for processing hard and brittle materials, the requirements on cutting tools are becoming higher and higher. As electroplated diamond tools have distinct advantages, such as high adaptability, high durability, long service life and good dimensional stability, they are effectively and extensively used in grinding hard and brittle materials. In this paper, the coating structure of an electroplated diamond tool is described. The electroplating process flow is presented, and the influence of pretreatment on the machining quality is analyzed. Through experimental research, the reasonable formula of the electrolyte, the electroplating technological parameters and the suitable sanding method were determined. Meanwhile, a drilling experiment on glass-ceramic shows that the electroplating process can effectively improve the cutting performance of diamond tools. This work lays a good foundation for further improving the quality and efficiency of the machining of hard and brittle materials.

  13. Performance characteristics of five triage tools for major incidents involving traumatic injuries to children.

    PubMed

    Price, C L; Brace-McDonnell, S J; Stallard, N; Bleetman, A; Maconochie, I; Perkins, G D

    2016-05-01

    Context: Triage tools are an essential component of the emergency response to a major incident. Although fortunately rare, mass casualty incidents involving children are possible, which mandates reliable triage tools to determine the priority of treatment. The objective was to determine the performance characteristics of five major incident triage tools amongst paediatric casualties who have sustained traumatic injuries, in a retrospective observational cohort study using data from 31,292 patients aged less than 16 years who sustained a traumatic injury. Data were obtained from the UK Trauma Audit and Research Network (TARN) database. Interventions: Statistical evaluation of five triage tools (JumpSTART, START, CareFlight, Paediatric Triage Tape/Sieve and Triage Sort) to predict death or severe traumatic injury (injury severity score >15). Main outcome measures: Performance characteristics of the triage tools (sensitivity, specificity and level of agreement between triage tools) to identify patients at high risk of death or severe injury. Of the 31,292 cases, 1029 died (3.3%), 6842 (21.9%) had major trauma (defined by an injury severity score >15) and 14,711 (47%) were aged 8 years or younger. There was variation in the performance accuracy of the tools in predicting major trauma or death (sensitivities ranging between 36.4 and 96.2%; specificities 66.0-89.8%). Performance characteristics varied with the age of the child. CareFlight had the best overall performance at predicting death, with the following sensitivity and specificity (95% CI), respectively: 95.3% (93.8-96.8) and 80.4% (80.0-80.9). JumpSTART was superior for the triaging of children under 8 years; sensitivity and specificity (95% CI), respectively: 86.3% (83.1-89.5) and 84.8% (84.2-85.5). The triage tools were generally better at identifying patients who would die than those with non-fatal severe injury. This statistical evaluation has demonstrated variability in the accuracy of triage tools at predicting outcomes for children who sustain traumatic injuries. No single tool performed consistently well across all evaluated scenarios. Copyright © 2015 Elsevier Ltd. All rights reserved.
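    The performance characteristics reported above reduce to a two-by-two table per tool. The sketch below shows that computation with normal-approximation confidence intervals; the counts are invented and are not TARN data.

      # Minimal sensitivity/specificity computation; the counts are invented.
      import math

      def rate_with_ci(successes, total):
          p = successes / total
          se = math.sqrt(p * (1 - p) / total)       # normal approximation
          return p, (p - 1.96 * se, p + 1.96 * se)

      tp, fn = 820, 40        # outcome present (death or ISS > 15): tool positive / negative
      tn, fp = 24000, 5800    # outcome absent: tool negative / positive

      sens, sens_ci = rate_with_ci(tp, tp + fn)
      spec, spec_ci = rate_with_ci(tn, tn + fp)
      print(f"sensitivity {sens:.1%} {sens_ci}, specificity {spec:.1%} {spec_ci}")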

  14. Mediator effect of statistical process control between Total Quality Management (TQM) and business performance in Malaysian Automotive Industry

    NASA Astrophysics Data System (ADS)

    Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.

    2015-12-01

    In today's highly competitive market, Total Quality Management (TQM) is a vital management tool in ensuring that a company can succeed in its business. In order to survive in the global market with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential in improving business performance. There are consistent results between TQM and business performance. However, only a few previous studies have examined the mediator effect of statistical process control (SPC) between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with the mediator effect of SPC using structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.8 per cent response rate. The findings on the significant impact of the mediator between TQM practices and business performance showed that SPC is an important tool and technique in TQM implementation. The results conclude that SPC partially mediates the relationship between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediation effect.
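    The mediation logic behind such a model can be illustrated with a simple product-of-coefficients check: regress the mediator on the predictor (path a), regress the outcome on both (paths b and c'), and take a*b as the indirect effect. The sketch below uses simulated data and ordinary least squares, not the paper's survey data or structural equation model.

      # Toy product-of-coefficients mediation check (TQM -> SPC -> performance);
      # simulated data, ordinary regression rather than SEM.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 300
      tqm = rng.normal(size=n)
      spc = 0.6 * tqm + rng.normal(scale=0.8, size=n)                  # path a
      perf = 0.4 * spc + 0.3 * tqm + rng.normal(scale=0.8, size=n)     # paths b and c'

      def slopes(y, predictors):
          X = np.column_stack([np.ones(len(y))] + list(predictors))
          return np.linalg.lstsq(X, y, rcond=None)[0][1:]              # drop intercept

      a = slopes(spc, [tqm])[0]
      b, c_prime = slopes(perf, [spc, tqm])
      print(f"indirect effect a*b = {a * b:.2f}, direct effect c' = {c_prime:.2f}")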

  15. The effect of ergonomic laparoscopic tool handle design on performance and efficiency.

    PubMed

    Tung, Kryztopher D; Shorti, Rami M; Downey, Earl C; Bloswick, Donald S; Merryweather, Andrew S

    2015-09-01

    Many factors can affect a surgeon's performance in the operating room; these may include surgeon comfort, ergonomics of tool handle design, and fatigue. A laparoscopic tool handle designed with ergonomic considerations (pistol grip) was tested against a current market tool with a traditional pinch grip handle. The goal of this study is to quantify the impact ergonomic design considerations which have on surgeon performance. We hypothesized that there will be measurable differences between the efficiency while performing FLS surgical trainer tasks when using both tool handle designs in three categories: time to completion, technical skill, and subjective user ratings. The pistol grip incorporates an ergonomic interface intended to reduce contact stress points on the hand and fingers, promote a more neutral operating wrist posture, and reduce hand tremor and fatigue. The traditional pinch grip is a laparoscopic tool developed by Stryker Inc. widely used during minimal invasive surgery. Twenty-three (13 M, 10 F) participants with no existing upper extremity musculoskeletal disorders or experience performing laparoscopic procedures were selected to perform in this study. During a training session prior to testing, participants performed practice trials in a SAGES FLS trainer with both tools. During data collection, participants performed three evaluation tasks using both handle designs (order was randomized, and each trial completed three times). The tasks consisted of FLS peg transfer, cutting, and suturing tasks. Feedback from test participants indicated that they significantly preferred the ergonomic pistol grip in every category (p < 0.05); most notably, participants experienced greater degrees of discomfort in their hands after using the pinch grip tool. Furthermore, participants completed cutting and peg transfer tasks in a shorter time duration (p < 0.05) with the pistol grip than with the pinch grip design; there was no significant difference between completion times for the suturing task. Finally, there was no significant interaction between tool type and errors made during trials. There was a significant preference for as well as lower pain experienced during use of the pistol grip tool as seen from the survey feedback. Both evaluation tasks (cutting and peg transfer) were also completed significantly faster with the pistol grip tool. Finally, due to the high degree of variability in the error data, it was not possible to draw any meaningful conclusions about the effect of tool design on the number or degree of errors made.

  16. Risk assessment tools to identify women with increased risk of osteoporotic fracture: complexity or simplicity? A systematic review.

    PubMed

    Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim

    2013-08-01

    A huge number of risk assessment tools have been developed, but far from all have been validated in external studies, many lack methodological and transparent evidence, and few are integrated into national guidelines. Therefore, we performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for prediction of osteoporotic fractures. Additionally, we aimed to determine if the performance of each tool was sufficient for practical use, and last, to examine whether the complexity of the tools influenced their discriminative power. We searched PubMed, Embase, and Cochrane databases for papers and evaluated these with respect to methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated; however, only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well or better than more complex tools (i.e., Simple Calculated Risk Estimation Score [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and Qfracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies in randomized design with population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.

  17. Intelligent Monitoring? Assessing the ability of the Care Quality Commission's statistical surveillance tool to predict quality and prioritise NHS hospital inspections.

    PubMed

    Griffiths, Alex; Beaussier, Anne-Laure; Demeritt, David; Rothstein, Henry

    2017-02-01

    The Care Quality Commission (CQC) is responsible for ensuring the quality of the health and social care delivered by more than 30 000 registered providers in England. With only limited resources for conducting on-site inspections, the CQC has used statistical surveillance tools to help it identify which providers it should prioritise for inspection. In the face of planned funding cuts, the CQC plans to put more reliance on statistical surveillance tools to assess risks to quality and prioritise inspections accordingly. To evaluate the ability of the CQC's latest surveillance tool, Intelligent Monitoring (IM), to predict the quality of care provided by National Health Service (NHS) hospital trusts so that those at greatest risk of providing poor-quality care can be identified and targeted for inspection. The predictive ability of the IM tool is evaluated through regression analyses and χ2 testing of the relationship between the quantitative risk score generated by the IM tool and the subsequent quality rating awarded following detailed on-site inspection by large expert teams of inspectors. First, the continuous risk scores generated by the CQC's IM statistical surveillance tool cannot predict inspection-based quality ratings of NHS hospital trusts (OR 0.38 (0.14 to 1.05) for Outstanding/Good, OR 0.94 (0.80 to 1.10) for Good/Requires improvement, and OR 0.90 (0.76 to 1.07) for Requires improvement/Inadequate). Second, the risk scores cannot be used more simply to distinguish the trusts performing poorly (those subsequently rated either 'Requires improvement' or 'Inadequate') from the trusts performing well (those subsequently rated either 'Good' or 'Outstanding') (OR 1.07 (0.91 to 1.26)). Classifying CQC's risk bandings 1-3 as high risk and 4-6 as low risk, 11 of the high risk trusts were performing well and 43 of the low risk trusts were performing poorly, resulting in an overall accuracy rate of 47.6%. Third, the risk scores cannot be used even more simply to distinguish the worst performing trusts (those subsequently rated 'Inadequate') from the remaining, better performing trusts (OR 1.11 (0.94 to 1.32)). Classifying CQC's risk banding 1 as high risk and 2-6 as low risk, the highest overall accuracy rate of 72.8% was achieved, but still only 6 of the 13 Inadequate trusts were correctly classified as being high risk. Since the IM statistical surveillance tool cannot predict the outcome of NHS hospital trust inspections, it cannot be used for prioritisation. A new approach to inspection planning is therefore required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  18. Using NERSC High-Performance Computing (HPC) systems for high-energy nuclear physics applications with ALICE

    NASA Astrophysics Data System (ADS)

    Fasel, Markus

    2016-10-01

    High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.

  19. Recent developments in software tools for high-throughput in vitro ADME support with high-resolution MS.

    PubMed

    Paiva, Anthony; Shou, Wilson Z

    2016-08-01

    The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.

  20. Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations

    NASA Technical Reports Server (NTRS)

    Brasil, Connie; Lee, Paul; Mainini, Matthew; Lee, Homola; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy

    2011-01-01

    This paper reviews three Next Generation Air Transportation System (NextGen) based high fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement of enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En-Route high altitude environment utilizing high altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.

  1. Imaging Carbon Nanotubes in High Performance Polymer Composites via Magnetic Force Microscope

    NASA Technical Reports Server (NTRS)

    Lillehei, Peter T.; Park, Cheol; Rouse, Jason H.; Siochi, Emilie J.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Application of carbon nanotubes as reinforcement in structural composites is dependent on the efficient dispersion of the nanotubes in a high performance polymer matrix. The characterization of such dispersion is limited by the lack of available tools to visualize the quality of the matrix/carbon nanotube interaction. The work reported herein demonstrates the use of magnetic force microscopy (MFM) as a promising technique for characterizing the dispersion of nanotubes in a high performance polymer matrix.

  2. An Advanced, Interactive, High-Performance Liquid Chromatography Simulator and Instructor Resources

    ERIC Educational Resources Information Center

    Boswell, Paul G.; Stoll, Dwight R.; Carr, Peter W.; Nagel, Megan L.; Vitha, Mark F.; Mabbott, Gary A.

    2013-01-01

    High-performance liquid chromatography (HPLC) simulation software has long been recognized as an effective educational tool, yet many of the existing HPLC simulators are either too expensive, outdated, or lack many important features necessary to make them widely useful for educational purposes. Here, a free, open-source HPLC simulator is…

  3. Data Storage and Transfer | High-Performance Computing | NREL

    Science.gov Websites

    High-Performance Computing (HPC) systems. WinSCP for Windows File Transfers: Use to transfer files from a local computer to a remote computer. Robinhood for File Management: Use this tool to manage your data files on Peregrine.

  4. A front-end automation tool supporting design, verification and reuse of SOC.

    PubMed

    Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing

    2004-09-01

    This paper describes an in-house developed language tool called VPerl used in developing a 250 MHz 32-bit high-performance low power embedded CPU core. The authors showed that use of this tool can compress the Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and reduce the bug rate significantly. This tool can be used to enhance the reusability of an intellectual property model and facilitate porting designs to different platforms.

  5. Designing Liquid Rocket Engine Injectors for Performance, Stability, and Cost

    NASA Technical Reports Server (NTRS)

    Westra, Douglas G.; West, Jeffrey S.

    2014-01-01

    NASA is developing the Space Launch System (SLS) for crewed exploration missions beyond low Earth orbit. Marshall Space Flight Center (MSFC) is designing rocket engines for the SLS Advanced Booster (AB) concepts being developed to replace the Shuttle-derived solid rocket boosters. One AB concept uses large, Rocket-Propellant (RP)-fueled engines that pose significant design challenges. The injectors for these engines require high performance and stable operation while still meeting aggressive cost reduction goals for access to space. Historically, combustion stability problems have been a critical issue for such injector designs. Traditional, empirical injector design tools and methodologies, however, lack the ability to reliably predict complex injector dynamics that often lead to combustion instability. Reliance on these tools alone would likely result in an unaffordable test-fail-fix cycle for injector development. Recently at MSFC, a massively parallel computational fluid dynamics (CFD) program was successfully applied in the SLS AB injector design process. High-fidelity reacting flow simulations were conducted for both single-element and seven-element representations of the full-scale injector. Data from the CFD simulations were then used to significantly augment and improve the empirical design tools, resulting in a high-performance, stable injector design.

  6. Optimizing laser beam profiles using micro-lens arrays for efficient material processing: applications to solar cells

    NASA Astrophysics Data System (ADS)

    Hauschild, Dirk; Homburg, Oliver; Mitra, Thomas; Ivanenko, Mikhail; Jarczynski, Manfred; Meinschien, Jens; Bayer, Andreas; Lissotschenko, Vitalij

    2009-02-01

    High power laser sources are used in various production tools for microelectronic products and solar cells, including annealing, lithography, edge isolation, dicing and patterning applications. Besides the right choice of the laser source, suitable high performance optics for generating the appropriate beam profile and intensity distribution are of high importance for the right processing speed, quality and yield. For industrial applications, an adequate understanding of the physics of the light-matter interaction behind the process is equally important. Advance simulations of the tool performance can minimize technical and financial risk as well as lead times for prototyping and introduction into series production. LIMO has developed its own software founded on the Maxwell equations, taking into account all important physical aspects of the laser based process: the light source, the beam shaping optical system and the light-matter interaction. Based on this knowledge, together with a unique free-form micro-lens array production technology and patented micro-optics beam shaping designs, a number of novel solar cell production tool sub-systems have been built. The basic functionalities, design principles and performance results are presented with a special emphasis on resilience, cost reduction and process reliability.

  7. LittleQuickWarp: an ultrafast image warping tool.

    PubMed

    Qu, Lei; Peng, Hanchuan

    2015-02-01

    Warping images into a standard coordinate space is critical for many image computing related tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain mapping projects, it is desirable to have high performance image warping tools that are compatible with common image analysis pipelines. In this article, we present LittleQuickWarp, a swift and memory efficient tool that boosts 3D image warping performance dramatically and at the same time has high warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp can improve the warping speed 2-5 times and reduce the memory consumption 6-20 times. We have implemented LittleQuickWarp as an Open Source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.
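    For comparison, the thin plate spline baseline mentioned above can be expressed in a few lines with SciPy's radial basis function interpolator, which supports a thin-plate-spline kernel: fit a mapping from moving-space landmarks to reference-space landmarks and apply it to every point. This is a sketch of the TPS baseline, not the LittleQuickWarp algorithm, and the landmark coordinates are invented.

      # Landmark-driven thin-plate-spline warp of 3-D points with SciPy;
      # illustrates the TPS baseline, not LittleQuickWarp. Data are invented.
      import numpy as np
      from scipy.interpolate import RBFInterpolator

      rng = np.random.default_rng(1)
      moving = rng.uniform(0, 100, size=(20, 3))                # landmarks in subject space
      target = moving + rng.normal(scale=2.0, size=(20, 3))     # matching atlas-space landmarks

      warp = RBFInterpolator(moving, target, kernel="thin_plate_spline")

      points = rng.uniform(0, 100, size=(1000, 3))              # e.g. neuron reconstruction nodes
      warped = warp(points)
      print(warped.shape)                                       # (1000, 3)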

  8. A hardware acceleration based on high-level synthesis approach for glucose-insulin analysis

    NASA Astrophysics Data System (ADS)

    Daud, Nur Atikah Mohd; Mahmud, Farhanahani; Jabbar, Muhamad Hairol

    2017-01-01

    In this paper, the research focuses on Type 1 Diabetes Mellitus (T1DM). Since this disease requires full attention to the blood glucose concentration with the help of insulin injection, it is important to have a tool that is able to predict the glucose level when a certain amount of carbohydrate is consumed at meal time. Therefore, to make this realizable, the Hovorka model, which targets T1DM, was chosen for this research. A high-level language, C++, was chosen to construct the mathematical model of the Hovorka model. Later, this constructed code is converted into intellectual property (IP), also known as a hardware accelerator, by using a high-level synthesis (HLS) approach, which improves the design and performance of the glucose-insulin analysis tool, as explained further in this paper. This is the first step in this research before implementing the design into a system-on-chip (SoC) to achieve a high-performance system for the glucose-insulin analysis tool.
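
    As a rough illustration of the kind of model being accelerated, the sketch below integrates a deliberately simplified two-state glucose-insulin system with forward Euler. It is not the full Hovorka model and not the C++/HLS code described in the paper; all parameter values are illustrative assumptions.

    ```python
    def simulate(minutes=240, dt=1.0, carbs_mmol_per_min=0.0, insulin_u_per_min=0.02):
        """Forward-Euler integration of a toy two-state glucose-insulin system."""
        g = 7.0    # plasma glucose (mmol/L), illustrative initial value
        i = 10.0   # plasma insulin (mU/L), illustrative initial value
        p1, s_i, k_i = 0.02, 0.003, 0.1   # toy rate constants (assumed)
        trace, t = [], 0.0
        while t < minutes:
            dg = -p1 * (g - 5.0) - s_i * i * g + carbs_mmol_per_min
            di = -k_i * i + 100.0 * insulin_u_per_min
            g += dt * dg
            i += dt * di
            t += dt
            trace.append((t, g, i))
        return trace

    for t, g, i in simulate()[::60]:
        print(f"t={t:5.0f} min  glucose={g:4.1f} mmol/L  insulin={i:5.1f} mU/L")
    ```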

  9. Algorithmic Classification of Five Characteristic Types of Paraphasias.

    PubMed

    Fergadiotis, Gerasimos; Gorman, Kyle; Bedrick, Steven

    2016-12-01

    This study was intended to evaluate a series of algorithms developed to perform automatic classification of paraphasic errors (formal, semantic, mixed, neologistic, and unrelated errors). We analyzed 7,111 paraphasias from the Moss Aphasia Psycholinguistics Project Database (Mirman et al., 2010) and evaluated the classification accuracy of 3 automated tools. First, we used frequency norms from the SUBTLEXus database (Brysbaert & New, 2009) to differentiate nonword errors and real-word productions. Then we implemented a phonological-similarity algorithm to identify phonologically related real-word errors. Last, we assessed the performance of a semantic-similarity criterion that was based on word2vec (Mikolov, Yih, & Zweig, 2013). Overall, the algorithmic classification replicated human scoring for the major categories of paraphasias studied with high accuracy. The tool that was based on the SUBTLEXus frequency norms was more than 97% accurate in making lexicality judgments. The phonological-similarity criterion was approximately 91% accurate, and the overall classification accuracy of the semantic classifier ranged from 86% to 90%. Overall, the results highlight the potential of tools from the field of natural language processing for the development of highly reliable, cost-effective diagnostic tools suitable for collecting high-quality measurement data for research and clinical purposes.
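
    A minimal sketch of the three-step decision logic described above (lexicality from frequency norms, phonological similarity, vector-based semantic similarity) might look as follows. The frequency table, word vectors, and thresholds are placeholders, not the SUBTLEXus or word2vec resources used in the study.

    ```python
    import difflib
    import numpy as np

    freq_norms = {"cat": 52.3, "hat": 40.1, "dog": 60.5}            # placeholder frequencies
    vectors = {"cat": np.array([0.9, 0.1]), "dog": np.array([0.8, 0.3]),
               "hat": np.array([0.1, 0.9])}                         # placeholder word vectors

    def classify(target: str, response: str) -> str:
        if response not in freq_norms:                  # lexicality judgment
            return "neologistic"
        phon = difflib.SequenceMatcher(None, target, response).ratio()
        v_t, v_r = vectors[target], vectors[response]
        sem = float(v_t @ v_r / (np.linalg.norm(v_t) * np.linalg.norm(v_r)))
        if phon >= 0.5 and sem >= 0.5:                  # thresholds are arbitrary here
            return "mixed"
        if phon >= 0.5:
            return "formal"
        if sem >= 0.5:
            return "semantic"
        return "unrelated"

    print(classify("cat", "hat"))   # formal (phonologically related real word)
    print(classify("cat", "dog"))   # semantic
    ```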

  10. A statistical approach to detection of copy number variations in PCR-enriched targeted sequencing data.

    PubMed

    Demidov, German; Simakova, Tamara; Vnuchkova, Julia; Bragin, Anton

    2016-10-22

    Multiplex polymerase chain reaction (PCR) is a common enrichment technique for targeted massive parallel sequencing (MPS) protocols. MPS is widely used in biomedical research and clinical diagnostics as a fast and accurate tool for the detection of short genetic variations. However, identification of larger variations such as structural variants and copy number variations (CNV) is still a challenge for targeted MPS. Some approaches and tools for structural variant detection have been proposed, but they have limitations and often require datasets of a certain type, size and expected number of amplicons affected by CNVs. In this paper, we describe a novel algorithm for high-resolution germline CNV detection in PCR-enriched targeted sequencing data and present an accompanying tool. We have developed a machine learning algorithm for the detection of large duplications and deletions in targeted sequencing data generated with a PCR-based enrichment step. We have performed verification studies and established the algorithm's sensitivity and specificity. We have compared the developed tool with other available methods applicable to the described data and revealed its higher performance. We showed that our method has high specificity and sensitivity for high-resolution copy number detection in targeted sequencing data, using a large cohort of samples.
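
    The published method is a machine learning algorithm; the sketch below only illustrates the underlying coverage-ratio intuition for amplicon data, with fabricated read counts and an arbitrary log2 threshold.

    ```python
    import numpy as np

    # rows = samples, columns = amplicons (fabricated read counts)
    counts = np.array([
        [500, 480, 510, 495, 505, 490],
        [520, 500, 490, 505, 515, 495],
        [510, 250, 500, 505, 495, 500],   # apparent deletion at amplicon 1
    ], dtype=float)

    norm = counts / counts.sum(axis=1, keepdims=True)    # library-size normalization
    reference = np.median(norm, axis=0)                   # cohort reference profile
    log2_ratio = np.log2(norm / reference)

    for s, row in enumerate(log2_ratio):
        for a, value in enumerate(row):
            if value <= -0.8:
                print(f"sample {s}: possible deletion at amplicon {a} (log2 = {value:.2f})")
            elif value >= 0.58:
                print(f"sample {s}: possible duplication at amplicon {a} (log2 = {value:.2f})")
    ```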

  11. Tools for 3D scientific visualization in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for performing visualization of computational aerodynamics, for example, visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively during the simulation. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results of the simulation are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these types of tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed as well as descriptions of other hardware for digital video and film recording.

  12. A comprehensive evaluation of assembly scaffolding tools

    PubMed Central

    2014-01-01

    Background Genome assembly is typically a two-stage process: contig assembly followed by the use of paired sequencing reads to join contigs into scaffolds. Scaffolds are usually the focus of reported assembly statistics; longer scaffolds greatly facilitate the use of genome sequences in downstream analyses, and it is appealing to present larger numbers as metrics of assembly performance. However, scaffolds are highly prone to errors, especially when generated using short reads, which can directly result in inflated assembly statistics. Results Here we provide the first independent evaluation of scaffolding tools for second-generation sequencing data. We find large variations in the quality of results depending on the tool and dataset used. Even extremely simple test cases of perfect input, constructed to elucidate the behaviour of each algorithm, produced some surprising results. We further dissect the performance of the scaffolders using real and simulated sequencing data derived from the genomes of Staphylococcus aureus, Rhodobacter sphaeroides, Plasmodium falciparum and Homo sapiens. The results from simulated data are of high quality, with several of the tools producing perfect output. However, at least 10% of joins remain unidentified when using real data. Conclusions The scaffolders vary in their usability, speed and number of correct and missed joins made between contigs. Results from real data highlight opportunities for further improvements of the tools. Overall, SGA, SOPRA and SSPACE generally outperform the other tools on our datasets. However, the quality of the results is highly dependent on the read mapper and genome complexity. PMID:24581555

  13. Study of PVD AlCrN Coating for Reducing Carbide Cutting Tool Deterioration in the Machining of Titanium Alloys.

    PubMed

    Cadena, Natalia L; Cue-Sampedro, Rodrigo; Siller, Héctor R; Arizmendi-Morquecho, Ana M; Rivera-Solorio, Carlos I; Di-Nardo, Santiago

    2013-05-24

    The manufacture of medical and aerospace components made of titanium alloys and other difficult-to-cut materials requires the parallel development of high performance cutting tools coated with materials capable of enhanced tribological and resistance properties. In this matter, a thin nanocomposite film made out of AlCrN (aluminum-chromium-nitride) was studied in this research, showing experimental work in the deposition process and its characterization. A heat-treated monolayer coating, competitive with other coatings in the machining of titanium alloys, was analyzed. Different analysis and characterizations were performed on the manufactured coating by scanning electron microscopy and energy-dispersive X-ray spectroscopy (SEM-EDXS), and X-ray diffraction (XRD). Furthermore, the mechanical behavior of the coating was evaluated through hardness test and tribology with pin-on-disk to quantify friction coefficient and wear rate. Finally, machinability tests using coated tungsten carbide cutting tools were executed in order to determine its performance through wear resistance, which is a key issue of cutting tools in high-end cutting at elevated temperatures. It was demonstrated that the specimen (with lower friction coefficient than previous research) is more efficient in machinability tests in Ti6Al4V alloys. Furthermore, the heat-treated monolayer coating presented better performance in comparison with a conventional monolayer of AlCrN coating.

  14. Study of PVD AlCrN Coating for Reducing Carbide Cutting Tool Deterioration in the Machining of Titanium Alloys

    PubMed Central

    Cadena, Natalia L.; Cue-Sampedro, Rodrigo; Siller, Héctor R.; Arizmendi-Morquecho, Ana M.; Rivera-Solorio, Carlos I.; Di-Nardo, Santiago

    2013-01-01

    The manufacture of medical and aerospace components made of titanium alloys and other difficult-to-cut materials requires the parallel development of high performance cutting tools coated with materials capable of enhanced tribological and resistance properties. In this matter, a thin nanocomposite film made out of AlCrN (aluminum–chromium–nitride) was studied in this research, showing experimental work in the deposition process and its characterization. A heat-treated monolayer coating, competitive with other coatings in the machining of titanium alloys, was analyzed. Different analysis and characterizations were performed on the manufactured coating by scanning electron microscopy and energy-dispersive X-ray spectroscopy (SEM-EDXS), and X-ray diffraction (XRD). Furthermore, the mechanical behavior of the coating was evaluated through hardness test and tribology with pin-on-disk to quantify friction coefficient and wear rate. Finally, machinability tests using coated tungsten carbide cutting tools were executed in order to determine its performance through wear resistance, which is a key issue of cutting tools in high-end cutting at elevated temperatures. It was demonstrated that the specimen (with lower friction coefficient than previous research) is more efficient in machinability tests in Ti6Al4V alloys. Furthermore, the heat-treated monolayer coating presented better performance in comparison with a conventional monolayer of AlCrN coating. PMID:28809266

  15. GEANT4 and Secondary Particle Production

    NASA Technical Reports Server (NTRS)

    Patterson, Jeff

    2004-01-01

    GEANT4 is a Monte Carlo tool set developed by the high energy physics community (CERN, SLAC, etc.) to perform simulations of complex particle detectors. GEANT4 is the ideal tool to study radiation transport and should be applied to space environments and the complex geometries of modern day spacecraft.

  16. Interactive Tools for Measuring Visual Scanning Performance and Reaction Time.

    PubMed

    Brooks, Johnell; Seeanner, Julia; Hennessy, Sarah; Manganelli, Joseph; Crisler, Matthew; Rosopa, Patrick; Jenkins, Casey; Anderson, Michael; Drouin, Nathalie; Belle, Leah; Truesdail, Constance; Tanner, Stephanie

    Occupational therapists are constantly searching for engaging, high-technology interactive tasks that provide immediate feedback to evaluate and train clients with visual scanning deficits. This study examined the relationship between two tools: the VISION COACH™ interactive light board and the Functional Object Detection © (FOD) Advanced driving simulator scenario. Fifty-four healthy drivers, ages 21-66 yr, were divided into three age groups. Participants performed braking response and visual target (E) detection tasks of the FOD Advanced driving scenario, followed by two sets of three trials using the VISION COACH Full Field 60 task. Results showed no significant effect of age on FOD Advanced performance but a significant effect of age on VISION COACH performance. Correlations showed that participants' performance on both braking and E detection tasks were significantly positively correlated with performance on the VISION COACH (.37 < r < .40, p < .01). These tools provide new options for therapists. Copyright © 2017 by the American Occupational Therapy Association, Inc.

  17. A Comparison of Automatic Parallelization Tools/Compilers on the SGI Origin 2000 Using the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Frumkin, Michael; Hribar, Michelle; Jin, Hao-Qiang; Waheed, Abdul; Yan, Jerry

    1998-01-01

    Porting applications to new high performance parallel and distributed computing platforms is a challenging task. Since writing parallel code by hand is extremely time consuming and costly, porting codes would ideally be automated by using some parallelization tools and compilers. In this paper, we compare the performance of the hand-written NAS Parallel Benchmarks against three parallel versions generated with the help of tools and compilers: 1) CAPTools: an interactive computer-aided parallelization tool that generates message passing code, 2) the Portland Group's HPF compiler and 3) compiler directives with the native FORTRAN77 compiler on the SGI Origin2000.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Svetlana Shasharina

    The goal of the Center for Technology for Advanced Scientific Component Software (TASCS) is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to be more usable.

  19. Adhesive bonded structural repair. II - Surface preparation procedures, tools, equipment and facilities

    NASA Astrophysics Data System (ADS)

    Wegman, Raymond F.; Tullos, Thomas R.

    1993-10-01

    A development status report is presented on the surface preparation procedures, tools, equipment, and facilities used in adhesively-bonded repair of aerospace and similar high-performance structures. These methods extend to both metallic and polymeric surfaces. Attention is given to the phos-anodize containment system, paint removal processes, tools for cutting composite prepreg and fabric materials, autoclaves, curing ovens, vacuum bagging, and controlled atmospheres.

  20. Development of Plant Control Diagnosis Technology and Increasing Its Applications

    NASA Astrophysics Data System (ADS)

    Kugemoto, Hidekazu; Yoshimura, Satoshi; Hashizume, Satoru; Kageyama, Takashi; Yamamoto, Toru

    A plant control diagnosis technology was developed to improve the performance of plant-wide control and maintain high productivity of plants. The control performance diagnosis system containing this technology picks out poorly performing loops, analyzes the cause, and outputs the result to a Web page. Meanwhile, the PID tuning tool is used to tune the loops extracted by the control performance diagnosis system. It has the advantage of tuning safely without process changes. These systems are powerful tools for doing Kaizen (continuous improvement efforts) step by step in coordination with the operator. This paper describes a practical technique regarding the diagnosis system and its industrial applications.
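
    As a toy illustration of picking out the poorly performing loops, one could score each loop by the variability of its control error and flag the worst candidates for PID retuning. The sketch below shows only that generic idea, not the diagnosis system described in the paper; loop names, signals, and the target band are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    loops = {
        "FIC-101": rng.normal(0.0, 0.2, 500),                 # well-tuned loop (synthetic)
        "TIC-205": rng.normal(0.0, 1.5, 500),                 # noisy / poorly tuned (synthetic)
        "LIC-310": 2.0 * np.sin(np.linspace(0, 60, 500)),     # oscillating loop (synthetic)
    }

    target_band = 0.5   # acceptable control-error band in engineering units (assumed)

    for name, error in loops.items():
        score = np.std(error) / target_band          # simple variability-based score
        status = "needs tuning" if score > 1.0 else "ok"
        print(f"{name}: performance score {score:.2f} -> {status}")
    ```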

  1. New tools using the hardware performance monitor to help users tune programs on the Cray X-MP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engert, D.E.; Rudsinski, L.; Doak, J.

    1991-09-25

    The performance of a Cray system is highly dependent on the tuning techniques used by individuals on their codes. Many of our users were not taking advantage of the tuning tools that allow them to monitor their own programs by using the Hardware Performance Monitor (HPM). We therefore modified UNICOS to collect HPM data for all processes and to report Mflop ratings based on users, programs, and time used. Our tuning efforts are now being focused on the users and programs that have the best potential for performance improvements. These modifications and some of the more striking performance improvements are described.

  2. Effect of cutting fluids and cutting conditions on surface integrity and tool wear in turning of Inconel 713C

    NASA Astrophysics Data System (ADS)

    Hikiji, R.

    2018-01-01

    The trend toward downsizing of engines is increasing the number of turbochargers in Europe. In a turbocharger, the temperature of the exhaust gas is so high that parts made of the nickel-base superalloy Inconel 713C are used as high-temperature-strength metals. External turning of Inconel 713C, which is used for actual automotive parts, was carried out. The effect of the cutting fluids and cutting conditions on the surface integrity and tool wear was investigated, considering the global environment and cost performance. As a result, in the range of cutting conditions used here, good surface integrity and tool life were obtained when the depth of cut was small. With a large corner radius, however, tool wear increased as the cutting length increased; at large cutting lengths, the surface integrity and tool life deteriorated. As for the cutting fluids, the synthetic type showed better performance in surface integrity and tool life than the conventional emulsion. It was also clear that a large corner radius improved the surface roughness and tool life but affected the dimensional error and related aspects when machining a workpiece held in a cantilever style.

  3. Relations between mental health team characteristics and work role performance.

    PubMed

    Fleury, Marie-Josée; Grenier, Guy; Bamvita, Jean-Marie; Farand, Lambert

    2017-01-01

    Effective mental health care requires a high performing, interprofessional team. Among 79 mental health teams in Quebec (Canada), this exploratory study aims to 1) determine the association between work role performance and a wide range of variables related to team effectiveness according to the literature, and to 2) using structural equation modelling, assess the covariance between each of these variables as well as the correlation with other exogenous variables. Work role performance was measured with an adapted version of a work role questionnaire. Various independent variables including team manager characteristics, user characteristics, team profiles, clinical activities, organizational culture, network integration strategies and frequency/satisfaction of interactions with other teams or services were analyzed under the structural equation model. The latter provided a good fit with the data. Frequent use of standardized procedures and evaluation tools (e.g. screening and assessment tools for mental health disorders) and team manager seniority exerted the most direct effect on work role performance. While network integration strategies had little effect on work role performance, there was a high covariance between this variable and those directly affecting work role performance among mental health teams. The results suggest that the mental healthcare system should apply standardized procedures and evaluation tools and, to a lesser extent, clinical approaches to improve work role performance in mental health teams. Overall, a more systematic implementation of network integration strategies may contribute to improved work role performance in mental health care.

  4. Relations between mental health team characteristics and work role performance

    PubMed Central

    Grenier, Guy; Bamvita, Jean-Marie; Farand, Lambert

    2017-01-01

    Effective mental health care requires a high performing, interprofessional team. Among 79 mental health teams in Quebec (Canada), this exploratory study aims to 1) determine the association between work role performance and a wide range of variables related to team effectiveness according to the literature, and to 2) using structural equation modelling, assess the covariance between each of these variables as well as the correlation with other exogenous variables. Work role performance was measured with an adapted version of a work role questionnaire. Various independent variables including team manager characteristics, user characteristics, team profiles, clinical activities, organizational culture, network integration strategies and frequency/satisfaction of interactions with other teams or services were analyzed under the structural equation model. The latter provided a good fit with the data. Frequent use of standardized procedures and evaluation tools (e.g. screening and assessment tools for mental health disorders) and team manager seniority exerted the most direct effect on work role performance. While network integration strategies had little effect on work role performance, there was a high covariance between this variable and those directly affecting work role performance among mental health teams. The results suggest that the mental healthcare system should apply standardized procedures and evaluation tools and, to a lesser extent, clinical approaches to improve work role performance in mental health teams. Overall, a more systematic implementation of network integration strategies may contribute to improved work role performance in mental health care. PMID:28991923

  5. Enhanced methodology of focus control and monitoring on scanner tool

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

    As the demand of the technology node shrinks from 14nm to 7nm, the reliability of tool monitoring techniques in advanced semiconductor fabs to achieve high yield and quality becomes more critical. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to detect for particles, defects, and tool stability in order to ensure proper tool health. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner to ensure proper tool stability on a periodic basis. The focus measurement on YIELDSTAR by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA) has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provides a common reference of scanner setup and user process. In order to further improve the metrology and matching performance, Diffraction Based Focus (DBF) metrology enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of set baseline reference. This baseline technique, with either conventional BaseLiner low numerical aperture (NA=1.20) mode or advanced illumination high NA mode (NA=1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus control and monitoring on multiple illumination conditions, opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposure for new product/layer best focus (BF) setup.

  6. Nutritional screening in hospitalized pediatric patients: a systematic review.

    PubMed

    Teixeira, Adriana Fonseca; Viana, Kátia Danielle Araújo Lourenço

    2016-01-01

    This systematic review aimed to verify the available scientific evidence on the clinical performance and diagnostic accuracy of nutritional screening tools in hospitalized pediatric patients. A search was performed in Medline (National Library of Medicine, United States), LILACS (Latin American and Caribbean Health Sciences), PubMed (US National Library of Medicine, National Institutes of Health) and SciELO (Scientific Electronic Library Online) through the CAPES portal (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior), as well as in the Scopus and Web of Science databases. The descriptors used in accordance with the Descriptors in Health Sciences (DeCS)/Medical Subject Headings (MeSH) list were "malnutrition", "screening", and "pediatrics", as well as the equivalent words in Portuguese. The authors identified 270 articles published between 2004 and 2014. After applying the selection criteria, 35 were analyzed in full and eight articles were included in the systematic review. We evaluated the methodological quality of the studies using the Quality Assessment of Diagnostic Accuracy Studies (QUADAS). Five nutritional screening tools in pediatrics were identified. Among these, the Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP) showed high sensitivity, almost perfect inter-rater agreement, and agreement between the screening and the reference standard; the Screening Tool Risk on Nutritional Status and Growth (STRONGkids) showed high sensitivity, a lower percentage of specificity, substantial intra-rater agreement, and ease of use in clinical practice. The studies included in this systematic review showed good performance of the nutritional screening tools in pediatrics, especially STRONGkids and STAMP. The authors emphasize the need for more studies in this area. Only one tool has been translated and adapted to the Brazilian pediatric population, and it is essential to carry out studies of tool adaptation and validation for this population. Copyright © 2016 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  7. Assessing does not mean threatening: the purpose of assessment as a key determinant of girls' and boys' performance in a science class.

    PubMed

    Souchal, Carine; Toczek, Marie-Christine; Darnon, Céline; Smeding, Annique; Butera, Fabrizio; Martinot, Delphine

    2014-03-01

    Is it possible to reach performance equality between boys and girls in a science class? Given the stereotypes targeting their groups in scientific domains, diagnostic contexts generally lower girls' performance and non-diagnostic contexts may harm boys' performance. The present study tested the effectiveness of a mastery-oriented assessment, allowing both boys and girls to perform at an optimal level in a science class. Participants were 120 boys and 72 girls (all high-school students). Participants attended a science lesson while expecting a performance-oriented assessment (i.e., an assessment designed to compare and select students), a mastery-oriented assessment (i.e., an assessment designed to help students in their learning), or no assessment of this lesson. In the mastery-oriented assessment condition, both boys and girls performed at a similarly high level, whereas the performance-oriented assessment condition reduced girls' performance and the no-assessment condition reduced boys' performance. One way to increase girls' performance on a science test without harming boys' performance is to present assessment as a tool for improving mastery rather than as a tool for comparing performances. © 2013 The British Psychological Society.

  8. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  9. Validity and applicability of a video-based animated tool to assess mobility in elderly Latin American populations.

    PubMed

    Guerra, Ricardo Oliveira; Oliveira, Bruna Silva; Alvarado, Beatriz Eugenia; Curcio, Carmen Lucia; Rejeski, W Jack; Marsh, Anthony P; Ip, Edward H; Barnard, Ryan T; Guralnik, Jack M; Zunzunegui, Maria Victoria

    2014-10-01

    To assess the reliability and the validity of Portuguese- and Spanish-translated versions of the video-based short-form Mobility Assessment Tool in assessing self-reported mobility, and to provide evidence for the applicability of these videos in elderly Latin American populations as a complement to physical performance measures. The sample consisted of 300 elderly participants (150 from Brazil, 150 from Colombia) recruited at neighborhood social centers. Mobility was assessed with the Mobility Assessment Tool, and compared with the Short Physical Performance Battery score and self-reported functional limitations. Reliability was calculated using intraclass correlation coefficients. Multiple linear regression analyses were used to assess associations among mobility assessment tools and health, and sociodemographic variables. A significant gradient of increasing Mobility Assessment Tool score with better physical function was observed for both self-reported and objective measures, and in each city. Associations between self-reported mobility and health were strong, and significant. Mobility Assessment Tool scores were lower in women at both sites. Intraclass correlation coefficients of the Mobility Assessment Tool were 0.94 (95% confidence interval 0.90-0.97) in Brazil and 0.81 (95% confidence interval 0.66-0.91) in Colombia. Mobility Assessment Tool scores were lower in Manizales than in Natal after adjustment by Short Physical Performance Battery, self-rated health and sex. These results provide evidence for high reliability and good validity of the Mobility Assessment Tool in its Spanish and Portuguese versions used in Latin American populations. In addition, the Mobility Assessment Tool can detect mobility differences related to environmental features that cannot be captured by objective performance measures. © 2013 Japan Geriatrics Society.

  10. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
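
    The SCC toolset itself is not reproduced here. As a generic, hedged illustration of the idea of spinning up a virtual cluster on EC2, a request could be issued with boto3 roughly as follows; the AMI ID, instance type, key pair, and security group are placeholder assumptions, not values from the paper.

    ```python
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Request a small 4-node virtual cluster from a scientific VM image.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",    # placeholder: scientific virtual machine image
        InstanceType="c5.xlarge",           # placeholder compute-optimized instance type
        MinCount=4, MaxCount=4,
        KeyName="scc-keypair",              # placeholder key pair name
        SecurityGroupIds=["sg-0123456789abcdef0"],
    )

    instance_ids = [i["InstanceId"] for i in response["Instances"]]
    ec2.get_waiter("instance_running").wait(InstanceIds=instance_ids)
    print("cluster nodes:", instance_ids)
    ```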

  11. mdtmFTP and its evaluation on ESNET SDN testbed

    DOE PAGES

    Zhang, Liang; Wu, Wenji; DeMar, Phil; ...

    2017-04-21

    In this paper, to address the high-performance challenges of data transfer in the big data era, we are developing and implementing mdtmFTP, a high-performance data transfer tool for big data. mdtmFTP has four salient features. First, it adopts an I/O centric architecture to execute data transfer tasks. Second, it more efficiently utilizes the underlying multicore platform through optimized thread scheduling. Third, it implements a large virtual file mechanism to address the lots-of-small-files (LOSF) problem. Finally, mdtmFTP integrates multiple optimization mechanisms, including zero copy, asynchronous I/O, pipelining, batch processing, and pre-allocated buffer pools, to enhance performance. mdtmFTP has been extensively tested and evaluated within the ESNET 100G testbed. Evaluations show that mdtmFTP can achieve higher performance than existing data transfer tools, such as GridFTP, FDT, and BBCP.
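
    One of the listed mechanisms, a pre-allocated buffer pool feeding a read/write pipeline, can be sketched as follows. This is not mdtmFTP code; it only illustrates why reusing buffers avoids per-block allocation during a transfer. Block size, pool depth, and file paths are arbitrary.

    ```python
    import queue
    import threading

    BLOCK = 4 * 1024 * 1024          # 4 MiB blocks (illustrative size)
    POOL = queue.Queue()
    for _ in range(8):               # pre-allocate 8 reusable buffers
        POOL.put(bytearray(BLOCK))

    FILLED = queue.Queue()           # buffers waiting to be written out

    def reader(src_path):
        with open(src_path, "rb") as src:
            while True:
                buf = POOL.get()                     # reuse a buffer, never allocate
                n = src.readinto(buf)
                if n == 0:
                    POOL.put(buf)
                    FILLED.put(None)                 # end-of-stream marker
                    return
                FILLED.put((buf, n))

    def writer(dst_path):
        with open(dst_path, "wb") as dst:
            while True:
                item = FILLED.get()
                if item is None:
                    return
                buf, n = item
                dst.write(memoryview(buf)[:n])
                POOL.put(buf)                        # return the buffer to the pool

    # Usage sketch (paths are placeholders):
    # threading.Thread(target=reader, args=("big.dat",)).start()
    # threading.Thread(target=writer, args=("copy.dat",)).start()
    ```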

  12. PyCoTools: A Python Toolbox for COPASI.

    PubMed

    Welsh, Ciaran M; Fullard, Nicola; Proctor, Carole J; Martinez-Guimera, Alvaro; Isfort, Robert J; Bascom, Charles C; Tasseff, Ryan; Przyborski, Stefan A; Shanley, Daryl P

    2018-05-22

    COPASI is an open source software package for constructing, simulating and analysing dynamic models of biochemical networks. COPASI is primarily intended to be used with a graphical user interface but often it is desirable to be able to access COPASI features programmatically, with a high level interface. PyCoTools is a Python package aimed at providing a high level interface to COPASI tasks with an emphasis on model calibration. PyCoTools enables the construction of COPASI models and the execution of a subset of COPASI tasks including time courses, parameter scans and parameter estimations. Additional 'composite' tasks which use COPASI tasks as building blocks are available for increasing parameter estimation throughput, performing identifiability analysis and performing model selection. PyCoTools supports exploratory data analysis on parameter estimation data to assist with troubleshooting model calibrations. We demonstrate PyCoTools by posing a model selection problem designed to showcase PyCoTools within a realistic scenario. The aim of the model selection problem is to test the feasibility of three alternative hypotheses in explaining experimental data derived from neonatal dermal fibroblasts in response to TGF-β over time. PyCoTools is used to critically analyse the parameter estimations and propose strategies for model improvement. PyCoTools can be downloaded from the Python Package Index (PyPI) using the command 'pip install pycotools' or directly from GitHub (https://github.com/CiaranWelsh/pycotools). Documentation at http://pycotools.readthedocs.io. Supplementary data are available at Bioinformatics.

  13. S-MART, a software toolbox to aid RNA-Seq data analysis.

    PubMed

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments. However, the analysis of the millions of sequences generated is often beyond the expertise of wet labs that have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data on a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool which performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the entire biologist community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour even for gigabytes of data for most queries. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses on their computer for their RNA-Seq data, from the mapped data to the discovery of important loci.

  14. S-MART, A Software Toolbox to Aid RNA-seq Data Analysis

    PubMed Central

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments. However, the analysis of the millions of sequences generated is often beyond the expertise of wet labs that have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data on a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool which performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the entire biologist community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour even for gigabytes of data for most queries. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses on their computer for their RNA-Seq data, from the mapped data to the discovery of important loci. PMID:21998740

  15. Improving smoothing efficiency of rigid conformal polishing tool using time-dependent smoothing evaluation model

    NASA Astrophysics Data System (ADS)

    Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng

    2017-06-01

    A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency performance is poorer than expected, and existing smoothing models cannot explicitly specify the methods to improve this efficiency. We presented an explicit time-dependent smoothing evaluation model that contained specific smoothing parameters directly derived from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we proposed a strategy to improve the RC-lap smoothing efficiency, which incorporated the theoretical model, tool optimization, and efficiency limit determination. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 RPM were obtained.
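
    The Preston relation that such smoothing models build on, dz/dt = k_p * p * v, is simple to evaluate. In the sketch below, the coefficient, pressure, and tool radius are assumed values; only the 300 RPM figure is taken from the abstract, and the result is purely illustrative.

    ```python
    import math

    k_p = 1.0e-7          # Preston coefficient, mm^3/(N*mm) (assumed)
    pressure = 0.01       # contact pressure, N/mm^2 (assumed)
    rpm = 300             # limiting tool speed reported in the abstract
    radius_mm = 50.0      # tool offset radius (assumed)

    velocity = rpm / 60.0 * 2 * math.pi * radius_mm   # relative surface speed, mm/s
    removal_rate = k_p * pressure * velocity           # depth removed per second, mm/s

    dwell_s = 120.0
    print(f"removal in {dwell_s:.0f} s: {removal_rate * dwell_s * 1e6:.1f} nm")
    ```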

  16. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    NASA Astrophysics Data System (ADS)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

    This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns to remove large debris. Two types of architectures are considered to be efficient: the Chaser (a single-debris spacecraft) and the Mothership/Kits (a multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.

  17. Metabolic profiling of Hoodia, Chamomile, Terminalia Species and evaluation of commercial preparations using Ultra-High Performance Quadrupole Time of Flight-Mass Spectrometry

    USDA-ARS?s Scientific Manuscript database

    Ultra-High Performance Liquid Chromatography-Quadrupole Time of Flight Mass Spectrometry (UHPLC-QToF-MS) profiling has become an important tool for the identification of marker compounds and the generation of metabolic patterns that can be interrogated using chemometric modeling software. Chemometric approaches can be used to ana...

  18. Overcoming redundancies in bedside nursing assessments by validating a parsimonious meta-tool: findings from a methodological exercise study.

    PubMed

    Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca

    2016-10-01

    There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, an increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians. Redundancies in clinical assessments negatively affect both patients and nurses. To validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes, such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. The validation of the meta-tool was then performed through explorative factor analysis, confirmatory factor analysis and structural equation modelling to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r = 0.428 to 0.867), with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three factors that emerged predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as the original tools do. © 2016 John Wiley & Sons, Ltd.

  19. Developing 21st century skills through the use of student personal learning networks

    NASA Astrophysics Data System (ADS)

    Miller, Robert D.

    This research was conducted to study the development of 21st century communication, collaboration, and digital literacy skills of students at the high school level through the use of online social network tools. The importance of this study was based on evidence that high school and college students are not graduating with the requisite communication, collaboration, and digital literacy skills, yet employers consider these skills important to the success of their employees. The challenge addressed through this study was how high schools can integrate social network tools into traditional learning environments to foster the development of these 21st century skills. A qualitative research study was completed through the use of a case study. One high school class in a suburban, high-performing town in Connecticut was selected as the research site, and the sample population of eleven student participants engaged in two sets of interviews and learned through the use of social network tools for one semester of the school year. The primary social network tools used were Facebook, Diigo, Google Sites, Google Docs, and Twitter. The data collected and analyzed partially supported the transfer of the theory of connectivism to the high school level. The students actively engaged in collaborative learning and research. Key results indicated heightened engagement in learning, the development of collaborative learning and research skills, and a greater understanding of how to use social network tools for effective public communication. The use of social network tools with high school students was a positive experience that led to an increased awareness among students of the benefits social network tools have as learning tools. The data supported the continued use of social network tools to develop 21st century communication, collaboration, and digital literacy skills. Future research in this area may explore emerging social network tools as well as the long-term impact these tools have on the development of lifelong learning skills and quantitative data linked to student learning.

  20. Long range science scheduling for the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Miller, Glenn; Johnston, Mark

    1991-01-01

    Observations with NASA's Hubble Space Telescope (HST) are scheduled with the assistance of a long-range scheduling system (SPIKE) that was developed using artificial intelligence techniques. In earlier papers, the system architecture and the constraint representation and propagation mechanisms were described. The development of high-level automated scheduling tools, including tools based on constraint satisfaction techniques and neural networks is described. The performance of these tools in scheduling HST observations is discussed.

  1. Software project management tools in global software development: a systematic mapping study.

    PubMed

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), which is a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  2. Evaluation of the Langmuir model in the Soil and Water Assessment Tool for high soil phosphorus condition

    USDA-ARS?s Scientific Manuscript database

    Phosphorus adsorption by a water treatment residual was tested through Langmuir and linear sorption isotherms and applied in the Soil and Water Assessment Tool (SWAT). The objective of this study was to use laboratory and greenhouse experimental phosphorus data to evaluate the performance of a modi...
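
    For reference, fitting a Langmuir isotherm, q = qmax*K*C / (1 + K*C), to sorption data can be sketched as below. The data points and initial guesses are fabricated for illustration and are unrelated to the study's measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(c, qmax, k):
        """Langmuir isotherm: sorbed amount as a function of equilibrium concentration."""
        return qmax * k * c / (1.0 + k * c)

    # fabricated equilibrium P concentrations (mg/L) and sorbed amounts (mg/kg)
    c_eq = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    q_obs = np.array([120.0, 210.0, 340.0, 520.0, 640.0, 710.0])

    (qmax_fit, k_fit), _ = curve_fit(langmuir, c_eq, q_obs, p0=[800.0, 0.2])
    print(f"qmax = {qmax_fit:.1f} mg/kg, K = {k_fit:.3f} L/mg")
    ```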

  3. Multidisciplinary Analysis of a Hypersonic Engine

    NASA Technical Reports Server (NTRS)

    Suresh, Ambady; Stewart, Mark

    2003-01-01

    The objective is to develop high-fidelity tools that can influence ISTAR design, in particular tools for coupling fluid-thermal-structural simulations. RBCC/TBCC designers carefully balance aerodynamic, thermal, weight, and structural considerations; consistent multidisciplinary solutions reveal details at modest cost. At the scram-mode design point, simulations give details of inlet and combustor performance, thermal loads, and structural deflections.

  4. Tribology in secondary wood machining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, P.L.; Hawthorne, H.M.; Andiappan, J.

    Secondary wood manufacturing covers a wide range of products from furniture, cabinets, doors and windows, to musical instruments. Many of these are now mass produced in sophisticated, high speed numerical controlled machines. The performance and the reliability of the tools are key to an efficient and economical manufacturing process as well as to the quality of the finished products. A program concerned with three aspects of tribology of wood machining, namely, tool wear, tool-wood friction characteristics and wood surface quality characterization, was set up in the Integrated Manufacturing Technologies Institute (IMTI) of the National Research Council of Canada. The studies include friction and wear mechanism identification and modeling, wear performance of surface-engineered tool materials, friction-induced vibration and cutting efficiency, and the influence of wear and friction on finished products. This research program underlines the importance of tribology in secondary wood manufacturing and at the same time adds new challenges to tribology research since wood is a complex, heterogeneous material and its behavior during machining is highly sensitive to the surrounding environments and to the moisture content in the work piece.

  5. StagBL : A Scalable, Portable, High-Performance Discretization and Solver Layer for Geodynamic Simulation

    NASA Astrophysics Data System (ADS)

    Sanan, P.; Tackley, P. J.; Gerya, T.; Kaus, B. J. P.; May, D.

    2017-12-01

    StagBL is an open-source parallel solver and discretization library for geodynamic simulation, encapsulating and optimizing operations essential to staggered-grid finite volume Stokes flow solvers. It provides a parallel staggered-grid abstraction with a high-level interface in C and Fortran. On top of this abstraction, tools are available to define boundary conditions and interact with particle systems. Tools and examples to efficiently solve Stokes systems defined on the grid are provided in small (direct solver), medium (simple preconditioners), and large (block factorization and multigrid) model regimes. By working directly with leading application codes (StagYY, I3ELVIS, and LaMEM) and providing an API and examples to integrate with others, StagBL aims to become a community tool supplying scalable, portable, reproducible performance toward novel science in regional- and planet-scale geodynamics and planetary science. By implementing kernels used by many research groups beneath a uniform abstraction layer, the library will enable optimization for modern hardware, thus reducing community barriers to large- or extreme-scale parallel simulation on modern architectures. In particular, the library will include CPU-, Manycore-, and GPU-optimized variants of matrix-free operators and multigrid components. The common layer provides a framework upon which to introduce innovative new tools. StagBL will leverage p4est to provide distributed adaptive meshes, and incorporate a multigrid convergence analysis tool. These options, in addition to a wealth of solver options provided by an interface to PETSc, will make the most modern solution techniques available from a common interface. StagBL in turn provides a PETSc interface, DMStag, to its central staggered grid abstraction. We present public version 0.5 of StagBL, including preliminary integration with application codes and demonstrations with its own demonstration application, StagBLDemo. Central to StagBL is the notion of an uninterrupted pipeline from toy/teaching codes to high-performance, extreme-scale solves. StagBLDemo replicates the functionality of an advanced MATLAB-style regional geodynamics code, thus providing users with a concrete procedure to exceed the performance and scalability limitations of smaller-scale tools.
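
    The staggered-grid arrangement that the library abstracts (pressure at cell centres, velocity components on cell faces) can be illustrated conceptually as follows. This is not the StagBL or DMStag API, just a plain NumPy sketch of the layout and of the cell-centred divergence a Stokes solver drives to zero.

    ```python
    import numpy as np

    nx, ny = 8, 6                       # number of cells in x and y
    p  = np.zeros((ny, nx))             # pressure: one value per cell centre
    vx = np.zeros((ny, nx + 1))         # x-velocity: on vertical cell faces
    vy = np.zeros((ny + 1, nx))         # y-velocity: on horizontal cell faces

    def divergence(vx, vy, dx=1.0, dy=1.0):
        """Cell-centred divergence of the face velocities."""
        return (vx[:, 1:] - vx[:, :-1]) / dx + (vy[1:, :] - vy[:-1, :]) / dy

    print("divergence field shape:", divergence(vx, vy).shape)   # (ny, nx)
    ```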

  6. Impact of design-parameters on the optical performance of a high-power adaptive mirror

    NASA Astrophysics Data System (ADS)

    Koek, Wouter D.; Nijkerk, David; Smeltink, Jeroen A.; van den Dool, Teun C.; van Zwet, Erwin J.; van Baars, Gregor E.

    2017-02-01

    TNO is developing a High Power Adaptive Mirror (HPAM) to be used in the CO2 laser beam path of an Extreme Ultra-Violet (EUV) light source for next-generation lithography. In this paper we report on a developed methodology, and the necessary simulation tools, to assess the performance and associated sensitivities of this deformable mirror. Our analyses show that, given the current limited insight concerning the process window of EUV generation, the HPAM module should have an actuator pitch of <= 4 mm. Furthermore, we have modelled the sensitivity of performance with respect to dimpling and actuator noise. For example, for a deformable mirror with an actuator pitch of 4 mm, and if the associated performance impact is to be limited to less than 5%, the actuator noise should be smaller than 45 nm (rms). Our tools assist in the detailed design process by assessing the performance impact of various design choices, including, for example, those that affect the shape and spectral content of the influence function.

  7. Developing an Objective Structured Assessment of Technical Skills for Laparoscopic Suturing and Intracorporeal Knot Tying.

    PubMed

    Chang, Olivia H; King, Louise P; Modest, Anna M; Hur, Hye-Chun

    2016-01-01

    To develop a teaching and assessment tool for laparoscopic suturing and intracorporeal knot tying. We designed an Objective Structured Assessment of Technical Skills (OSATS) tool that includes a procedure-specific checklist (PSC) and global rating scale (GRS) to assess laparoscopic suturing and intracorporeal knot-tying performance. Obstetrics and Gynecology residents at our institution were videotaped while performing a laparoscopic suturing and intracorporeal knot-tying task at a surgical simulation workshop. A total of 2 expert reviewers assessed resident performance using the OSATS tool during live performance and 1 month later using the videotaped recordings. OSATS scores were analyzed using the Wilcoxon rank-sum test. Data are presented as median scores (interquartile range [IQR]). Intrarater and interrater reliabilities were assessed using a Spearman correlation and are presented as an r correlation coefficient and p value. An r ≥ 0.8 was considered as a high correlation. After testing, we received feedback from residents and faculty to improve the OSATS tool as part of an iterative design process. In all, 14 of 21 residents (66.7%) completed the study, with 9 junior residents and 5 senior residents. Junior residents had a lower score on the PSC than senior residents did; however, this was not statistically significant (median = 6.0 [IQR: 4.0-10.0] and median = 13.0 [IQR: 10.0-13.0]; p = 0.09). There was excellent intrarater reliability with our OSATS tool (for PSC component, r = 0.88 for Rater 1 and 0.93 for Rater 2, both p < 0.0001; for GRS component, r = 0.85 for Rater 1 and 0.88 for Rater 2, both p ≤ 0.0002). The PSC also has high interrater reliability during live evaluation (r = 0.92; p < 0.0001), and during the videotape scoring with r = 0.77 (p = 0.001). Our OSATS tool may be a useful assessment and teaching tool for laparoscopic suturing and intracorporeal knot-tying skills. Overall, good intrarater reliability was demonstrated, suggesting that this tool may be useful for longitudinal assessment of surgical skills. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
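
    The statistics reported above can be reproduced in outline with SciPy; the score vectors below are fabricated stand-ins, not the study data.

    ```python
    import numpy as np
    from scipy.stats import ranksums, spearmanr

    # fabricated procedure-specific checklist (PSC) scores
    junior = np.array([6, 4, 10, 5, 7, 8, 6, 9, 4])
    senior = np.array([13, 10, 13, 11, 12])

    stat, p = ranksums(junior, senior)               # Wilcoxon rank-sum comparison
    print(f"junior vs senior PSC: W = {stat:.2f}, p = {p:.3f}")

    # fabricated paired scores from two raters for the same performances
    rater1 = np.array([6, 9, 12, 7, 10, 13, 5])
    rater2 = np.array([6, 8, 13, 7, 11, 13, 6])
    rho, p_rho = spearmanr(rater1, rater2)           # inter-rater correlation
    print(f"inter-rater Spearman r = {rho:.2f}, p = {p_rho:.4f}")
    ```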

  8. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  9. Performance evaluation of inpatient service in Beijing: a horizontal comparison with risk adjustment based on Diagnosis Related Groups.

    PubMed

    Jian, Weiyan; Huang, Yinmin; Hu, Mu; Zhang, Xiumei

    2009-04-30

    The medical performance evaluation, which provides a basis for rational decision-making, is an important part of medical service research. Current progress with health services reform in China is far from satisfactory, without sufficient regulation. To achieve better progress, an effective tool for evaluating medical performance needs to be established. In view of this, this study attempted to develop such a tool appropriate for the Chinese context. Data were collected from the front pages of medical records (FPMR) of all large general public hospitals (21 hospitals) in the third and fourth quarters of 2007. Locally developed Diagnosis Related Groups (DRGs) were introduced as a tool for risk adjustment, and performance evaluation indicators were established: the Charge Efficiency Index (CEI), Time Efficiency Index (TEI) and inpatient mortality of low-risk group cases (IMLRG), reflecting work efficiency and medical service quality respectively. Using these indicators, the inpatient services' performance was horizontally compared among hospitals. The Case-mix Index (CMI) was used to adjust the efficiency indices and produce the adjusted CEI (aCEI) and adjusted TEI (aTEI). Poisson distribution analysis was used to test the statistical significance of the IMLRG differences between hospitals. Using the aCEI, aTEI and IMLRG scores for the 21 hospitals, Hospitals A and C had relatively good overall performance because their medical charges were lower, LOS shorter and IMLRG smaller. Hospitals P and Q performed worst due to their relatively high charge level, long LOS and high IMLRG. Various performance problems also existed in the other hospitals. It is possible to develop an accurate and easy-to-run performance evaluation system using Case-Mix as the tool for risk adjustment, choosing indicators close to consumers and managers, and utilizing routine report forms as the basic information source. To keep such a system running effectively, it is necessary to improve the reliability of clinical information and the risk-adjustment ability of Case-Mix.
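
    As an illustration of the risk-adjustment idea, the sketch below computes case-mix-adjusted efficiency indices for a hypothetical hospital. The reference values and the exact index formulas (ratio to a reference, then division by CMI) are assumptions for demonstration, not the study's published definitions.

      import numpy as np

      # Hypothetical per-case data for one hospital: DRG weight, charge, length of stay
      drg_weight = np.array([0.8, 1.2, 2.5, 1.0])
      charge = np.array([900.0, 1500.0, 4000.0, 1100.0])
      los = np.array([4.0, 6.0, 14.0, 5.0])

      # Assumed city-wide reference values per discharge
      ref_charge, ref_los = 1600.0, 6.5

      cmi = drg_weight.mean()              # Case-mix Index
      cei = charge.mean() / ref_charge     # Charge Efficiency Index (raw)
      tei = los.mean() / ref_los           # Time Efficiency Index (raw)
      acei, atei = cei / cmi, tei / cmi    # case-mix-adjusted indices (assumed adjustment)

      # Values below 1 suggest lower charges / shorter stays than expected for the case mix
      print(f"CMI={cmi:.2f}  aCEI={acei:.2f}  aTEI={atei:.2f}")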

  10. Fundamental Aeronautics Program: Overview of Project Work in Supersonic Cruise Efficiency

    NASA Technical Reports Server (NTRS)

    Castner, Raymond

    2011-01-01

    The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2011) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

  11. Fundamental Aeronautics Program: Overview of Propulsion Work in the Supersonic Cruise Efficiency Technical Challenge

    NASA Technical Reports Server (NTRS)

    Castner, Ray

    2012-01-01

    The Supersonics Project, part of NASA's Fundamental Aeronautics Program, contains a number of technical challenge areas which include sonic boom community response, airport noise, high altitude emissions, cruise efficiency, light weight durable engines/airframes, and integrated multi-discipline system design. This presentation provides an overview of the current (2012) activities in the supersonic cruise efficiency technical challenge, and is focused specifically on propulsion technologies. The intent is to develop and validate high-performance supersonic inlet and nozzle technologies. Additional work is planned for design and analysis tools for highly-integrated low-noise, low-boom applications. If successful, the payoffs include improved technologies and tools for optimized propulsion systems, propulsion technologies for a minimized sonic boom signature, and a balanced approach to meeting efficiency and community noise goals. In this propulsion area, the work is divided into advanced supersonic inlet concepts, advanced supersonic nozzle concepts, low fidelity computational tool development, high fidelity computational tools, and improved sensors and measurement capability. The current work in each area is summarized.

  12. Which species? A decision-support tool to guide plant selection in stormwater biofilters

    NASA Astrophysics Data System (ADS)

    Payne, Emily G. I.; Pham, Tracey; Deletic, Ana; Hatt, Belinda E.; Cook, Perran L. M.; Fletcher, Tim D.

    2018-03-01

    Plant species are diverse in form, function and environmental response. This provides enormous potential for designing nature-based stormwater treatment technologies, such as biofiltration systems. However, species can vary dramatically in their pollutant-removal performance, particularly for nitrogen removal. Currently, there is a lack of information on how to efficiently select from the vast palette of species. This study aimed to identify plant traits beneficial to performance and create a decision-support tool to screen species for further testing. A laboratory experiment using 220 biofilter columns paired plant morphological characteristics with nitrogen removal and water loss for 20 Australian native species and two lawn grasses. Testing was undertaken during wet and dry conditions, for two biofilter designs (saturated zone and free-draining). An extensive root system and high total biomass were critical to the effective removal of total nitrogen (TN) and nitrate (NO3-), driven by high nitrogen assimilation. The same characteristics were key to performance under dry conditions, and were associated with high water use for Australian native plants; linking assimilation and transpiration. The decision-support tool uses these scientific relationships and readily-available information to identify the morphology, natural distribution and stress tolerances likely to be good predictors of plant nitrogen and water uptake.

  13. Screening for non-alcoholic fatty liver disease in children: do guidelines provide enough guidance?

    PubMed

    Koot, B G P; Nobili, V

    2017-09-01

    Non-alcoholic fatty liver disease (NAFLD) is the most common chronic liver disease in children in the industrialized world. Its high prevalence and important health risks make NAFLD highly suitable for screening. In practice, screening is widely, albeit not consistently, performed. To review the recommendations on screening for NAFLD in children. Recommendations on screening were reviewed from major paediatric obesity guidelines and NAFLD guidelines. A literature overview is provided on open questions and controversies. Screening for NAFLD is advocated in all obesity and most NAFLD guidelines. Guidelines are not uniform in whom to screen, and most guidelines do not specify how screening should be performed in practice. Screening for NAFLD remains controversial, due to the lack of a highly accurate screening tool, limited knowledge to predict the natural course of NAFLD and limited data on its cost effectiveness. Guidelines provide little guidance on how screening should be performed. Screening for NAFLD remains controversial because not all conditions for screening are fully met. Consensus is needed on the optimal use of currently available screening tools. Research should focus on new accurate screening tools, the natural history of NAFLD and the cost effectiveness of different screening strategies in children. © 2017 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of World Obesity Federation.

  14. About High-Performance Computing at NREL | High-Performance Computing |

    Science.gov Websites


  15. A study with ESI PAM-STAMP® on the influence of tool deformation on final part quality during a forming process

    NASA Astrophysics Data System (ADS)

    Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David

    2018-05-01

    As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. The high pressing forces cause deformations in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially affecting the final springback behavior. As a result, the tool deformations can have an impact on the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of tool deformations on the final part quality is generally neglected, and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped onto the die structure and a static analysis is performed to check the deflections of the tool. But this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently, so that the tool deformations can be included in the drawing simulation in real time without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented which demonstrates the effect of tool deformations on the final part quality.

  16. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    PubMed

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
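
    A minimal sketch of such a comparison is given below, assuming the lifelines and scikit-learn packages and simulated data with one nonlinear predictor; the random-forest surrogate (regressing observed times for uncensored cases) is a crude stand-in for the ML survival methods evaluated in the paper, not the authors' implementation.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.utils import concordance_index
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)
      n = 500
      x1, x2 = rng.normal(size=n), rng.normal(size=n)
      hazard = np.exp(0.5 * x1 + 0.8 * np.sin(x2))        # x2 acts nonlinearly
      event_time = rng.exponential(1.0 / hazard)
      censor_time = rng.exponential(2.0, size=n)
      observed = np.minimum(event_time, censor_time)
      event = (event_time <= censor_time).astype(int)

      df = pd.DataFrame({"x1": x1, "x2": x2, "T": observed, "E": event})
      train, test = df.iloc[:350], df.iloc[350:]

      # Cox proportional hazards model (higher partial hazard = shorter survival)
      cox = CoxPHFitter().fit(train, duration_col="T", event_col="E")
      cox_ci = concordance_index(test["T"],
                                 -cox.predict_partial_hazard(test[["x1", "x2"]]),
                                 test["E"])

      # Crude ML surrogate: regress observed time on covariates (uncensored rows only)
      rf = RandomForestRegressor(n_estimators=200, random_state=0)
      rf.fit(train.loc[train.E == 1, ["x1", "x2"]], train.loc[train.E == 1, "T"])
      rf_ci = concordance_index(test["T"], rf.predict(test[["x1", "x2"]]), test["E"])

      print(f"Cox C-index: {cox_ci:.3f}   RF C-index: {rf_ci:.3f}")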

  17. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
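
    The toy loop below illustrates the general idea of pairing a degradation model with a use profile and climate input; the aging equations and all coefficients are invented for demonstration and are not BLAST's models.

      import math

      def simulate_capacity(years, avg_temp_c, avg_soc, cycles_per_day, dod):
          capacity = 1.0                  # normalized capacity
          k_cal, k_cyc = 0.03, 2.0e-5     # assumed aging coefficients
          for day in range(int(years * 365)):
              t = (day + 1) / 365.0
              # Calendar fade grows with sqrt(time), temperature, and storage SOC
              arrhenius = math.exp((avg_temp_c - 25.0) / 10.0)
              cal_fade = k_cal * arrhenius * (0.5 + avg_soc) * (
                  math.sqrt(t) - math.sqrt(day / 365.0))
              # Cycle fade scales with depth of discharge and cycle count
              cyc_fade = k_cyc * cycles_per_day * dod
              capacity -= (cal_fade + cyc_fade)
          return capacity

      for city_temp in (15.0, 25.0, 35.0):   # crude stand-in for climate data
          c = simulate_capacity(years=8, avg_temp_c=city_temp, avg_soc=0.7,
                                cycles_per_day=1.0, dod=0.6)
          print(f"avg temp {city_temp:.0f} C -> remaining capacity {c:.2%}")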

  18. Use of a screening tool and primary health care gerontology nurse specialist for high-needs older people.

    PubMed

    King, Anna; Boyd, Michal; Dagley, Lynelle

    2017-02-01

    To describe implementation of an innovative gerontology nurse specialist role within one primary health organisation in Auckland, New Zealand. Quantitative outcomes of the screening tool as well as the nurse specialist assessment will be presented. The intervention involved use of the Brief Risk Identification for Geriatric Health Tool (BRIGHT) to identify high-needs older people with subsequent comprehensive geriatric assessment (CGA) performed by the gerontology nurse specialist. A total 384 of the 416 BRIGHTs were completed (92% response rate) and 15% of these were identified as high risk (n = 57). The BRIGHTs for high-risk older people revealed the highest scoring question was 'needing help with housework' (26%). The most frequent intervention by the gerontology nurse specialist was education (30%). The primary health care gerontology nurse specialist model delivers a proactive case finding and specialist gerontology intervention for older people at high risk of functional or health decline.

  19. COMBINE*: An integrated opto-mechanical tool for laser performance modeling

    NASA Astrophysics Data System (ADS)

    Rehak, M.; Di Nicola, J. M.

    2015-02-01

    Accurate modeling of thermal, mechanical and optical processes is important for achieving reliable, high-performance, high-energy lasers such as those at the National Ignition Facility [1] (NIF). The need for this capability is even more critical for high average power, high repetition rate applications. Modeling the effects of stresses and temperature fields on optical properties allows for optimal design of optical components and, more generally, of the architecture of the laser system itself. Stresses change the indices of refraction and induce inhomogeneities and anisotropy. We present a modern, integrated analysis tool that efficiently produces reliable results that are used in our laser propagation tools such as VBL [5]. COMBINE is built on and supplants the existing legacy tools developed for the previous generations of lasers at LLNL, but also uses the commercially available mechanical finite element codes ANSYS or COMSOL (including computational fluid dynamics). The COMBINE code computes birefringence and wavefront distortions due to mechanical stresses on lenses and slabs of arbitrary geometry. The stresses calculated typically originate from mounting support, vacuum load, gravity, heat absorption and/or the attendant cooling. Of particular importance are the depolarization and detuning effects of nonlinear crystals due to thermal loading. Results are given in the form of Jones matrices, depolarization maps and wavefront distributions. An incremental evaluation of Jones matrices and ray propagation in a 3D mesh with a stress and temperature field is performed. Wavefront and depolarization maps are available at the optical aperture and at slices within the optical element. The suite is validated, user friendly, supported, documented and amenable to collaborative development. * COMBINE stands for Code for Opto-Mechanical Birefringence Integrated Numerical Evaluations.

  20. An intelligent assistant for physicians.

    PubMed

    Gavrilis, Dimitris; Georgoulas, George; Vasiloglou, Nikolaos; Nikolakopoulos, George

    2016-08-01

    This paper presents a software tool developed for assisting physicians during an examination process. The tool consists of a number of modules with the aim of making the examination process not only quicker but also less error-prone, moving from a simple electronic medical records management system towards an intelligent assistant for the physician. The intelligent component exploits users' inputs as well as well-established standards to line up possible suggestions for filling in the examination report. As the physician continues using it, the tool keeps extracting new knowledge. The architecture of the tool is presented in brief, while the intelligent component, which builds upon the notion of multilabel learning, is presented in more detail. Our preliminary results from a real test case indicate that the intelligent module can reach quite high performance without a large amount of data.
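
    A minimal sketch of the multilabel-learning idea using scikit-learn is shown below, with invented notes and report items; the tool's actual features and model are not described here.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.multiclass import OneVsRestClassifier
      from sklearn.preprocessing import MultiLabelBinarizer

      # Toy examination notes and the report items a physician filled in for each
      notes = [
          "persistent cough and fever",
          "chest pain on exertion",
          "cough with wheezing at night",
          "palpitations and chest pain",
      ]
      labels = [
          {"order_chest_xray", "check_temperature"},
          {"order_ecg", "order_troponin"},
          {"order_chest_xray"},
          {"order_ecg"},
      ]

      vec = CountVectorizer()
      X = vec.fit_transform(notes)
      mlb = MultiLabelBinarizer()
      Y = mlb.fit_transform(labels)

      # One binary classifier per report item (multilabel learning)
      clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)

      new_note = ["new patient with cough and mild fever"]
      suggested = mlb.inverse_transform(clf.predict(vec.transform(new_note)))
      print("suggested report items:", suggested[0])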

  1. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), which is a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the current transport-layer protocols in use, TCP, was developed for traffic demands which are different from those on the ASC WAN. The Stream Control Transmission Protocol (SCTP), on the other hand, has shown characteristics which make it more appealing to networks such as these. Most importantly, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. In order to find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? And (4) Is the tool actively developed? Following the analysis of those tools, this paper goes further into explaining some recommendations and ideas for future work.

  2. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Hribar, Michelle; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time-consuming and costly, but the task can be simplified by high-level languages and, better still, automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study we implemented the parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation and the pros and cons of the different approaches will be discussed, along with our experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of the techniques to realistic aerospace applications will be presented.

  3. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Hribar, M.; Waheed, A.; Yan, J.; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time-consuming and costly, but this task can be simplified by high-level languages and, better still, automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study, we implemented the parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation and the pros and cons of the different approaches will be discussed, along with our experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of the techniques to realistic aerospace applications will be presented.

  4. How Can High-Biodiversity Coffee Make It to the Mainstream Market? The Performativity of Voluntary Sustainability Standards and Outcomes for Coffee Diversification

    NASA Astrophysics Data System (ADS)

    Solér, Cecilia; Sandström, Cecilia; Skoog, Hanna

    2017-02-01

    This article investigates the outcomes of mainstream coffee voluntary sustainability standards for high-biodiversity coffee diversification. By viewing voluntary sustainability standards certifications as performative marketing tools, we address the question of how such certification schemes affect coffee value creation based on unique biodiversity conservation properties in coffee farming. To date, the voluntary sustainability standards literature has primarily approached biodiversity conservation in coffee farming in the context of financial remuneration to coffee farmers. The performative analysis of voluntary sustainability standards certification undertaken in this paper, in which such certifications are analyzed in terms of their effect on mutually reinforcing representational, normalizing and exchange practices, provides an understanding of coffee diversification potential as dependent on standard criteria and voluntary sustainability standards certification as branding tools. We draw on a case of high-biodiversity, shade-grown coffee-farming practice in Kodagu, South-West India, which represents one of the world's biodiversity "hotspots".

  5. How Can High-Biodiversity Coffee Make It to the Mainstream Market? The Performativity of Voluntary Sustainability Standards and Outcomes for Coffee Diversification.

    PubMed

    Solér, Cecilia; Sandström, Cecilia; Skoog, Hanna

    2017-02-01

    This article investigates the outcomes of mainstream coffee voluntary sustainability standards for high-biodiversity coffee diversification. By viewing voluntary sustainability standards certifications as performative marketing tools, we address the question of how such certification schemes affect coffee value creation based on unique biodiversity conservation properties in coffee farming. To date, the voluntary sustainability standards literature has primarily approached biodiversity conservation in coffee farming in the context of financial remuneration to coffee farmers. The performative analysis of voluntary sustainability standards certification undertaken in this paper, in which such certifications are analyzed in terms of their effect on mutually reinforcing representational, normalizing and exchange practices, provides an understanding of coffee diversification potential as dependent on standard criteria and voluntary sustainability standards certification as branding tools. We draw on a case of high-biodiversity, shade-grown coffee-farming practice in Kodagu, South-West India, which represents one of the world's biodiversity "hotspots".

  6. Social transmission of tool use and tool manufacture in Goffin cockatoos (Cacatua goffini).

    PubMed

    Auersperg, A M I; von Bayern, A M I; Weber, S; Szabadvari, A; Bugnyar, T; Kacelnik, A

    2014-10-22

    Tool use can be inherited, or acquired as an individual innovation or by social transmission. Having previously reported individual innovative tool use and manufacture by a Goffin cockatoo, we used the innovator (Figaro, a male) as a demonstrator to investigate social transmission. Twelve Goffins saw either demonstrations by Figaro, or 'ghost' controls where tools and/or food were manipulated using magnets. Subjects observing demonstrations showed greater tool-related performance than ghost controls, with all three males in this group (but not the three females) acquiring tool-using competence. Two of these three males further acquired tool-manufacturing competence. As the actions of successful observers differed from those of the demonstrator, result emulation rather than high-fidelity imitation is the most plausible transmission mechanism.

  7. Social transmission of tool use and tool manufacture in Goffin cockatoos (Cacatua goffini)

    PubMed Central

    Auersperg, A. M. I.; von Bayern, A. M. I.; Weber, S.; Szabadvari, A.; Bugnyar, T.; Kacelnik, A.

    2014-01-01

    Tool use can be inherited, or acquired as an individual innovation or by social transmission. Having previously reported individual innovative tool use and manufacture by a Goffin cockatoo, we used the innovator (Figaro, a male) as a demonstrator to investigate social transmission. Twelve Goffins saw either demonstrations by Figaro, or ‘ghost’ controls where tools and/or food were manipulated using magnets. Subjects observing demonstrations showed greater tool-related performance than ghost controls, with all three males in this group (but not the three females) acquiring tool-using competence. Two of these three males further acquired tool-manufacturing competence. As the actions of successful observers differed from those of the demonstrator, result emulation rather than high-fidelity imitation is the most plausible transmission mechanism. PMID:25185997

  8. Optimization of Processing Parameters in ECM of Die Tool Steel Using Nanofluid by Multiobjective Genetic Algorithm.

    PubMed

    Sathiyamoorthy, V; Sekar, T; Elango, N

    2015-01-01

    Formation of spikes prevents achieving a better material removal rate (MRR) and surface finish while using plain NaNO3 aqueous electrolyte in electrochemical machining (ECM) of die tool steel. Hence this research work attempts to minimize the formation of spikes in the selected workpiece of high carbon high chromium die tool steel using copper nanoparticles suspended in NaNO3 aqueous electrolyte, that is, a nanofluid. The selected influencing parameters are applied voltage and electrolyte discharge rate with three levels and tool feed rate with four levels. Thirty-six experiments were designed using Design Expert 7.0 software and optimization was done using a multiobjective genetic algorithm (MOGA). This tool identified the best possible combination for achieving better MRR and surface roughness. The results reveal that a voltage of 18 V, tool feed rate of 0.54 mm/min, and nanofluid discharge rate of 12 lit/min would be the optimum values in ECM of HCHCr die tool steel. To check the optimality obtained from the MOGA in MATLAB software, a maximum MRR of 375.78277 mm(3)/min and a respective surface roughness Ra of 2.339779 μm were predicted at an applied voltage of 17.688986 V, tool feed rate of 0.5399705 mm/min, and nanofluid discharge rate of 11.998816 lit/min. Confirmatory tests showed that the actual performance at the optimum conditions was 361.214 mm(3)/min and 2.41 μm; the deviation from the predicted performance is less than 4%, which confirms the composite desirability of the developed models.
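
    For illustration, the toy genetic-algorithm loop below maximizes a composite desirability over the three process parameters; the response-surface models and desirability limits are invented placeholders, not the fitted models from the study.

      import numpy as np

      rng = np.random.default_rng(1)
      bounds = np.array([[12.0, 18.0],   # voltage (V)
                         [0.18, 0.54],   # tool feed rate (mm/min)
                         [8.0, 12.0]])   # nanofluid discharge rate (lit/min)

      def responses(x):
          v, f, q = x.T
          mrr = 5.0 * v + 300.0 * f + 4.0 * q          # assumed MRR model (mm^3/min)
          ra = 4.0 - 0.08 * v - 1.5 * f - 0.05 * q     # assumed roughness model (um)
          return mrr, ra

      def desirability(x):
          mrr, ra = responses(x)
          d_mrr = np.clip((mrr - 200.0) / 200.0, 0, 1)  # larger-the-better
          d_ra = np.clip((3.0 - ra) / 1.5, 0, 1)        # smaller-the-better
          return np.sqrt(d_mrr * d_ra)                  # geometric mean (composite)

      pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 3))
      for _ in range(100):
          fit = desirability(pop)
          parents = pop[np.argsort(fit)[-30:]]                      # selection
          children = (parents[rng.integers(0, 30, 30)] +
                      parents[rng.integers(0, 30, 30)]) / 2.0       # crossover
          children += rng.normal(0, 0.02, children.shape) * (bounds[:, 1] - bounds[:, 0])
          pop = np.clip(np.vstack([parents, children]), bounds[:, 0], bounds[:, 1])

      best = pop[np.argmax(desirability(pop))]
      print("best [V, mm/min, lit/min]:", np.round(best, 3))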

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunshah, R.F.; Shabaik, A.H.

    The process of Activated Reactive Evaporation is used to synthesize superhard materials such as carbides, oxides, nitrides, and ultrafine-grain cermets. The deposits are characterized by hardness, microstructure and lattice parameter measurements. The synthesis and characterization of TiC-Ni cermets, Al2O3 and VC-TiC alloy carbides is given. Tools with different coating characteristics are tested for machining performance at different speeds and feeds. The machining evaluation and the selection of coatings are based on the rate of deterioration of the coating, tool temperature, and cutting forces. Tool life tests show that coated high-speed steel tools give a 300% improvement in tool life.

  10. The Surgical Safety Checklist and Teamwork Coaching Tools: a study of inter-rater reliability.

    PubMed

    Huang, Lyen C; Conley, Dante; Lipsitz, Stu; Wright, Christopher C; Diller, Thomas W; Edmondson, Lizabeth; Berry, William R; Singer, Sara J

    2014-08-01

    To assess the inter-rater reliability (IRR) of two novel observation tools for measuring surgical safety checklist performance and teamwork. Surgical safety checklists can promote adherence to standards of care and improve teamwork in the operating room. Their use has been associated with reductions in mortality and other postoperative complications. However, checklist effectiveness depends on how well they are performed. Authors from the Safe Surgery 2015 initiative developed a pair of novel observation tools through literature review, expert consultation and end-user testing. In one South Carolina hospital participating in the initiative, two observers jointly attended 50 surgical cases and independently rated surgical teams using both tools. We used descriptive statistics to measure checklist performance and teamwork at the hospital. We assessed IRR by measuring percent agreement, Cohen's κ, and weighted κ scores. The overall percent agreement and κ between the two observers were 93% and 0.74 (95% CI 0.66 to 0.79), respectively, for the Checklist Coaching Tool and 86% and 0.84 (95% CI 0.77 to 0.90) for the Surgical Teamwork Tool. Percent agreement for individual sections of both tools was 79% or higher. Additionally, κ scores for six of eight sections on the Checklist Coaching Tool and for two of five domains on the Surgical Teamwork Tool achieved the desired 0.7 threshold. However, teamwork scores were high and variation was limited. There were no significant changes in the percent agreement or κ scores between the first 10 and last 10 cases observed. Both tools demonstrated substantial IRR and required limited training to use. These instruments may be used to observe checklist performance and teamwork in the operating room. However, further refinement and calibration of observer expectations, particularly in rating teamwork, could improve the utility of the tools. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
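
    The short sketch below shows how the reported agreement statistics (percent agreement, Cohen's κ, and weighted κ) can be computed with scikit-learn on illustrative ratings, not the study data.

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      # Two observers' ratings of the same 12 cases on a 1-4 ordinal scale (invented)
      rater1 = np.array([4, 3, 4, 2, 3, 4, 1, 3, 4, 2, 4, 3])
      rater2 = np.array([4, 3, 3, 2, 3, 4, 2, 3, 4, 2, 4, 4])

      percent_agreement = np.mean(rater1 == rater2)
      kappa = cohen_kappa_score(rater1, rater2)
      weighted_kappa = cohen_kappa_score(rater1, rater2, weights="linear")

      print(f"agreement={percent_agreement:.0%}  kappa={kappa:.2f}  "
            f"weighted kappa={weighted_kappa:.2f}")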

  11. Evaluation of a telerobotic system to assist surgeons in microsurgery

    NASA Technical Reports Server (NTRS)

    Das, H.; Zak, H.; Johnson, J.; Crouch, J.; Frambach, D.

    1999-01-01

    A tool was developed that assists surgeons in manipulating surgical instruments more precisely than is possible manually. The tool is a telemanipulator that scales down the surgeon's hand motion and filters tremor in the motion. The signals measured from the surgeon's hand are transformed and used to drive a six-degrees-of-freedom robot to position the surgical instrument mounted on its tip. A pilot study comparing the performance of the telemanipulator system against manual instrument positioning was conducted at the University of Southern California School of Medicine. The results show that a telerobotic tool can improve the performance of a microsurgeon by increasing the precision with which he can position surgical instruments, but this is achieved at the cost of increased time in performing the task. We believe that this technology will extend the capabilities of microsurgeons and allow more surgeons to perform highly skilled procedures currently performed only by the best surgeons. It will also enable performance of new surgical procedures that are beyond the capabilities of even the most skilled surgeons. Copyright 1999 Wiley-Liss, Inc.
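
    A minimal sketch of the two core ideas, motion scaling and tremor filtering, applied to a synthetic hand-motion trace follows; the sampling rate, scaling factor and filter cutoff are assumptions, not the parameters of the JPL system.

      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 200.0                          # sampling rate (Hz), assumed
      t = np.arange(0, 5, 1 / fs)
      intended = 0.002 * np.sin(2 * np.pi * 0.5 * t)   # slow 2 mm intended movement
      tremor = 0.0003 * np.sin(2 * np.pi * 9.0 * t)    # ~9 Hz physiological tremor
      hand = intended + tremor

      scale = 0.1                          # 10:1 motion scaling, hand -> instrument
      b, a = butter(2, 4.0 / (fs / 2))     # low-pass below the tremor band (~4 Hz cutoff)
      # filtfilt is zero-phase and offline; a real-time system would need a causal filter
      instrument = scale * filtfilt(b, a, hand)

      residual_tremor = np.std(instrument - scale * intended)
      print(f"residual tremor at the instrument tip: "
            f"{residual_tremor * 1e6:.1f} micrometres (rms)")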

  12. Reversal Learning Task in Children with Autism Spectrum Disorder: A Robot-Based Approach

    ERIC Educational Resources Information Center

    Costescu, Cristina A.; Vanderborght, Bram; David, Daniel O.

    2015-01-01

    Children with autism spectrum disorder (ASD) engage in highly perseverative and inflexible behaviours. Technological tools, such as robots, have received increased attention as social reinforcers and/or assistive tools for improving the performance of children with ASD. The aim of our study is to investigate the role of the robotic toy Keepon in a…

  13. Automated condition-invariable neurite segmentation and synapse classification using textural analysis-based machine-learning algorithms

    PubMed Central

    Kandaswamy, Umasankar; Rotman, Ziv; Watt, Dana; Schillebeeckx, Ian; Cavalli, Valeria; Klyachko, Vitaly

    2013-01-01

    High-resolution live-cell imaging studies of neuronal structure and function are characterized by large variability in image acquisition conditions due to background and sample variations as well as low signal-to-noise ratio. The lack of automated image analysis tools that can be generalized for varying image acquisition conditions represents one of the main challenges in the field of biomedical image analysis. Specifically, segmentation of the axonal/dendritic arborizations in brightfield or fluorescence imaging studies is extremely labor-intensive and still performed mostly manually. Here we describe a fully automated machine-learning approach based on textural analysis algorithms for segmenting neuronal arborizations in high-resolution brightfield images of live cultured neurons. We compare the performance of our algorithm to manual segmentation and show that it achieves 90% accuracy, with similarly high levels of specificity and sensitivity. Moreover, the algorithm maintains high performance levels under a wide range of image acquisition conditions, indicating that it is largely condition-invariable. We further describe an application of this algorithm to fully automated synapse localization and classification in fluorescence imaging studies based on synaptic activity. The textural analysis-based machine-learning approach thus offers a high-performance, condition-invariable tool for automated neurite segmentation. PMID:23261652

  14. On the use of high-frequency SCADA data for improved wind turbine performance monitoring

    NASA Astrophysics Data System (ADS)

    Gonzalez, E.; Stephen, B.; Infield, D.; Melero, J. J.

    2017-11-01

    SCADA-based condition monitoring of wind turbines facilitates the move from costly corrective repairs towards more proactive maintenance strategies. In this work, we advocate the use of high-frequency SCADA data and quantile regression to build a cost effective performance monitoring tool. The benefits of the approach are demonstrated through the comparison between state-of-the-art deterministic power curve modelling techniques and the suggested probabilistic model. Detection capabilities are compared for low and high-frequency SCADA data, providing evidence for monitoring at higher resolutions. Operational data from healthy and faulty turbines are used to provide a practical example of usage with the proposed tool, effectively achieving the detection of an incipient gearbox malfunction at a time horizon of more than one month prior to the actual occurrence of the failure.
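
    As a sketch of the quantile-regression power-curve idea, the example below fits a lower-quantile power curve on synthetic SCADA-like data and flags observations that fall below it; the data, model choice and thresholds are illustrative only, not those of the paper.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(0)
      wind = rng.uniform(3, 15, 5000)                                  # m/s
      power = np.clip((wind / 12.0) ** 3, 0, 1) * 2000 + rng.normal(0, 60, wind.size)  # kW

      # 10th-percentile power curve: the lower envelope of normal behaviour
      q10 = GradientBoostingRegressor(loss="quantile", alpha=0.10, n_estimators=200)
      q10.fit(wind.reshape(-1, 1), power)

      # New high-frequency observations from a possibly degraded turbine
      new_wind = np.array([8.0, 9.0, 10.0, 11.0])
      new_power = np.array([560.0, 700.0, 980.0, 1150.0])
      lower = q10.predict(new_wind.reshape(-1, 1))

      underperforming = new_power < lower
      print("below 10% quantile:", underperforming)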

  15. Recognizing chemicals in patents: a comparative analysis.

    PubMed

    Habibi, Maryam; Wiegandt, David Luis; Schmedding, Florian; Leser, Ulf

    2016-01-01

    Recently, methods for Chemical Named Entity Recognition (NER) have gained substantial interest, driven by the need to automatically analyze today's ever-growing collections of biomedical text. Chemical NER for patents is particularly essential due to the high economic importance of pharmaceutical findings. However, NER on patents has long been neglected by the research community, mostly because of the lack of sufficiently large annotated corpora. A recent international competition specifically targeted this task, but evaluated tools only on gold standard patent abstracts instead of full patents; furthermore, results from such competitions are often difficult to extrapolate to real-life settings due to the relatively high homogeneity of training and test data. Here, we evaluate the two state-of-the-art chemical NER tools, tmChem and ChemSpot, on four different annotated patent corpora, two of which consist of full texts. We study the overall performance of the tools, compare their results at the instance level, report on high-recall and high-precision ensembles, and perform cross-corpus and intra-corpus evaluations. Our findings indicate that full patents are considerably harder to analyze than patent abstracts and clearly confirm the common wisdom that using the same text genre (patent vs. scientific) and text type (abstract vs. full text) for training and testing is a prerequisite for achieving high quality text mining results.
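
    The following toy example shows how high-recall and high-precision ensembles of two NER tools can be formed from their predicted mention sets (union vs. intersection); the mentions, offsets and gold standard are invented for illustration.

      # Each tool's output as a set of (offset, text) chemical mentions (invented)
      tool_a = {(12, "acetylsalicylic acid"), (88, "NaCl"), (140, "compound 7a")}
      tool_b = {(12, "acetylsalicylic acid"), (88, "NaCl"), (201, "ibuprofen")}
      gold   = {(12, "acetylsalicylic acid"), (88, "NaCl"), (201, "ibuprofen"),
                (140, "compound 7a")}

      def prf(pred, gold):
          tp = len(pred & gold)
          p = tp / len(pred) if pred else 0.0
          r = tp / len(gold)
          f = 2 * p * r / (p + r) if p + r else 0.0
          return p, r, f

      high_recall = tool_a | tool_b       # union: fewer misses, more noise
      high_precision = tool_a & tool_b    # intersection: stricter, more misses

      for name, pred in [("union", high_recall), ("intersection", high_precision)]:
          p, r, f = prf(pred, gold)
          print(f"{name:12s} P={p:.2f} R={r:.2f} F1={f:.2f}")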

  16. CATO: a CAD tool for intelligent design of optical networks and interconnects

    NASA Astrophysics Data System (ADS)

    Chlamtac, Imrich; Ciesielski, Maciej; Fumagalli, Andrea F.; Ruszczyk, Chester; Wedzinga, Gosse

    1997-10-01

    Increasing communication speed requirements have created great interest in very high-speed optical and all-optical networks and interconnects. The design of these optical systems is a highly complex task, requiring the simultaneous optimization of various parts of the system, ranging from optical components' characteristics to access protocol techniques. Currently there are no computer aided design (CAD) tools on the market to support the interrelated design of all parts of optical communication systems, so the designer has to rely on costly and time-consuming testbed evaluations. The objective of the CATO (CAD tool for optical networks and interconnects) project is to develop a prototype of an intelligent CAD tool for the specification, design, simulation and optimization of optical communication networks. CATO allows the user to build an abstract, possibly incomplete, model of the system, and determine its expected performance. Based on design constraints provided by the user, CATO will automatically complete an optimum design, using mathematical programming techniques, intelligent search methods and artificial intelligence (AI). Initial design and testing of a CATO prototype (CATO-1) has been completed recently. The objective was to prove the feasibility of combining AI techniques, simulation techniques, an optical device library and a graphical user interface into a flexible CAD tool for obtaining optimal communication network designs in terms of system cost and performance. CATO-1 is an experimental tool for designing packet-switching wavelength division multiplexing all-optical communication systems using a LAN/MAN ring topology as the underlying network. The two specific AI algorithms incorporated are simulated annealing and a genetic algorithm. CATO-1 finds the optimal number of transceivers for each network node, using an objective function that includes the cost of the devices and the overall system performance.
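
    As an illustration of the simulated-annealing search mentioned above, the sketch below anneals a per-node transceiver count against a made-up cost-plus-congestion objective; it is a generic example, not CATO's objective function or device model.

      import math, random

      random.seed(0)
      n_nodes, max_tx = 8, 6
      device_cost, load = 1.0, 3.5          # cost per transceiver, offered load per node

      def objective(tx):
          cost = device_cost * sum(tx)
          congestion = sum(load / t for t in tx)   # fewer transceivers -> more blocking
          return cost + 2.0 * congestion

      state = [1] * n_nodes
      best, best_val = state[:], objective(state)
      temp = 10.0
      for step in range(5000):
          cand = state[:]
          i = random.randrange(n_nodes)
          cand[i] = min(max_tx, max(1, cand[i] + random.choice([-1, 1])))
          delta = objective(cand) - objective(state)
          if delta < 0 or random.random() < math.exp(-delta / temp):
              state = cand
              if objective(state) < best_val:
                  best, best_val = state[:], objective(state)
          temp *= 0.999                              # geometric cooling schedule

      print("best transceiver counts:", best, " objective:", round(best_val, 2))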

  17. A diagnostic interface for the ICOsahedral Non-hydrostatic (ICON) modelling framework based on the Modular Earth Submodel System (MESSy v2.50)

    NASA Astrophysics Data System (ADS)

    Kern, Bastian; Jöckel, Patrick

    2016-10-01

    Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools in the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with heavy inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime of the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
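
    A minimal sketch of aggregation onto a user-defined regular coarse grid is shown below, applied to synthetic scattered cell values with plain NumPy; the actual MESSy/ICON implementation operates on the unstructured ICON grid and in parallel.

      import numpy as np

      # Unstructured "cells": lon/lat centres and a field value per cell (synthetic)
      rng = np.random.default_rng(0)
      lon = rng.uniform(-180, 180, 10000)
      lat = rng.uniform(-90, 90, 10000)
      field = np.cos(np.deg2rad(lat)) + 0.1 * rng.normal(size=lon.size)

      # Target regular coarse grid: 5 x 5 degree boxes
      dlon = dlat = 5.0
      ix = ((lon + 180) // dlon).astype(int)
      iy = ((lat + 90) // dlat).astype(int)
      nlon, nlat = int(360 / dlon), int(180 / dlat)

      sums = np.zeros((nlat, nlon))
      counts = np.zeros((nlat, nlon))
      np.add.at(sums, (iy, ix), field)       # accumulate cell values per coarse box
      np.add.at(counts, (iy, ix), 1.0)
      coarse_mean = np.divide(sums, counts,
                              out=np.full_like(sums, np.nan), where=counts > 0)

      print("coarse grid shape:", coarse_mean.shape,
            " global mean:", round(float(np.nanmean(coarse_mean)), 3))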

  18. State-of-the-Art for Hygrothermal Simulation Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudreaux, Philip R.; New, Joshua Ryan; Shrestha, Som S.

    2017-02-01

    The hygrothermal (heat and moisture) performance of buildings can be assessed by utilizing simulation tools. There are currently a number of hygrothermal calculation tools available, which vary in their degree of sophistication and runtime requirements. This report investigates three of the most commonly used models (WUFI, HAMT, and EMPD) to assess their limitations and potential to generate physically realistic results, in order to prioritize improvements for EnergyPlus (which uses HAMT and EMPD). The outcome of the study shows that, out of these three tools, WUFI has the greatest hygrothermal capabilities. Limitations of these tools were also assessed, including: WUFI’s inability to properly account for air leakage and transfer at surface boundaries; HAMT’s inability to handle air leakage, precipitation-related moisture problems, or condensation problems from high relative humidity; and multiple limitations for EMPD, a simplified method to estimate indoor temperature and humidity levels that is generally not used to estimate the hygrothermal performance of the building envelope materials. In conclusion, out of the three investigated simulation tools, HAMT has the greatest modeling potential, is open source, and we have prioritized specific features that can enable EnergyPlus to model all relevant heat and moisture transfer mechanisms that impact the performance of building envelope components.

  19. Investigation of the effects of process and geometrical parameters on formability in tube hydroforming using a modular hydroforming tool

    NASA Astrophysics Data System (ADS)

    Joghan, Hamed Dardaei; Staupendahl, Daniel; Hassan, Hamad ul; Henke, Andreas; Keesser, Thorsten; Legat, Francois; Tekkaya, A. Erman

    2018-05-01

    Tube hydroforming is one of the most important manufacturing processes for the production of exhaust systems. Tube hydroforming allows generating parts with highly complex geometries at the forming accuracies needed in the automotive sector. This is possible due to the form-closed nature of the production process. One of the main cost drivers is tool manufacturing, which is expensive and time consuming, especially when forming large parts. To cope with the design trend of individuality, which is gaining more and more importance and leads to a high number of product variants, a new flexible tool design was developed. The designed tool offers high flexibility in manufacturing different shapes and geometries of tubes with just local alterations and relocation of tool segments. The tolerancing problems of state-of-the-art segmented tools are overcome by an innovative and flexible die holder design. The break-even point of this initially more expensive tool design is already reached when forming more than 4 different tube shapes. Together with an additionally designed rotary hydraulic tube feeding system, a highly adaptable forming setup is obtained. To investigate the performance of the developed tool setup, a study on geometrical and process parameters during forming of a spherical dome was carried out. Austenitic stainless steel (grade 1.4301) tube with a diameter of 40 mm and a thickness of 1.5 mm was used for the investigations. The experimental analyses were supported by finite element simulations and statistical analyses. The results show that the flexible tool setup can efficiently be used to analyze the interaction of the inner pressure, friction, and the location of the spherical dome, and demonstrate the strong influence of the feeding rate on the formed part.

  20. Design and Testing of a Tool for Evaluating the Quality of Diabetes Consumer-Information Web Sites

    PubMed Central

    Steinwachs, Donald; Rubin, Haya R

    2003-01-01

    Background Most existing tools for measuring the quality of Internet health information focus almost exclusively on structural criteria or other proxies for quality information rather than evaluating actual accuracy and comprehensiveness. Objective This research sought to develop a new performance-measurement tool for evaluating the quality of Internet health information, test the validity and reliability of the tool, and assess the variability in diabetes Web site quality. Methods An objective, systematic tool was developed to evaluate Internet diabetes information based on a quality-of-care measurement framework. The principal investigator developed an abstraction tool and trained an external reviewer on its use. The tool included 7 structural measures and 34 performance measures created by using evidence-based practice guidelines and experts' judgments of accuracy and comprehensiveness. Results Substantial variation existed in all categories, with overall scores following a normal distribution and ranging from 15% to 95% (mean was 50% and median was 51%). Lin's concordance correlation coefficient to assess agreement between raters produced a rho of 0.761 (Pearson's r of 0.769), suggesting moderate to high agreement. The average agreement between raters for the performance measures was 0.80. Conclusions Diabetes Web site quality varies widely. Alpha testing of this new tool suggests that it could become a reliable and valid method for evaluating the quality of Internet health sites. Such an instrument could help lay people distinguish between beneficial and misleading information. PMID:14713658
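
    For reference, Lin's concordance correlation coefficient can be computed directly from its definition, as in the short sketch below (illustrative rater scores, not the study data).

      import numpy as np

      def lins_ccc(x, y):
          x, y = np.asarray(x, float), np.asarray(y, float)
          mx, my = x.mean(), y.mean()
          vx, vy = x.var(), y.var()                    # population variances
          cov = ((x - mx) * (y - my)).mean()
          return 2 * cov / (vx + vy + (mx - my) ** 2)  # penalizes location/scale shifts

      rater_a = [15, 32, 48, 51, 60, 72, 80, 95]       # overall site scores (%)
      rater_b = [18, 30, 45, 55, 58, 75, 78, 90]

      print(f"Lin's CCC = {lins_ccc(rater_a, rater_b):.3f}")
      print(f"Pearson r = {np.corrcoef(rater_a, rater_b)[0, 1]:.3f}")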

  1. Optimization of Microelectronic Devices for Sensor Applications

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Klimeck, Gerhard

    2000-01-01

    The NASA/JPL goal to reduce payload in future space missions while increasing mission capability demands miniaturization of active and passive sensors, analytical instruments and communication systems, among others. Currently, typical system requirements include the detection of particular spectral lines, associated data processing, and communication of the acquired data to other systems. Advances in lithography and deposition methods result in more advanced devices for space application, while the sub-micron resolution currently available opens a vast design space. Though an experimental exploration of this widening design space (searching for optimized performance by repeated fabrication efforts) is infeasible, it does motivate the development of reliable software design tools. These tools necessitate models based on the fundamental physics and mathematics of the device to accurately model effects such as diffraction and scattering in opto-electronic devices, or bandstructure and scattering in heterostructure devices. The software tools must have convenient turn-around times and interfaces that allow effective usage. The first issue is addressed by the application of high-performance computers and the second by the development of graphical user interfaces driven by properly developed data structures. These tools can then be integrated into an optimization environment, and with the available memory capacity and computational speed of high-performance parallel platforms, simulation of optimized components can proceed. In this paper, specific applications of the electromagnetic modeling of infrared filtering, as well as heterostructure device design, will be presented using genetic algorithm global optimization methods.

  2. Benchmarking CRISPR on-target sgRNA design.

    PubMed

    Yan, Jifang; Chuai, Guohui; Zhou, Chi; Zhu, Chenyu; Yang, Jing; Zhang, Chao; Gu, Feng; Xu, Han; Wei, Jia; Liu, Qi

    2017-02-15

    CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)-based gene editing has been widely implemented in various cell types and organisms. A major challenge in the effective application of the CRISPR system is the need to design highly efficient single-guide RNA (sgRNA) with minimal off-target cleavage. Several tools are available for sgRNA design, but few have been systematically compared. In our opinion, benchmarking the performance of the available tools and indicating their applicable scenarios are important issues. Moreover, whether the reported sgRNA design rules are reproducible across different sgRNA libraries, cell types and organisms remains unclear. In our study, a systematic and unbiased benchmark of sgRNA prediction efficacy was performed on nine representative on-target design tools, based on six benchmark data sets covering five different cell types. The benchmark study presented here provides novel quantitative insights into the available CRISPR tools. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.

  4. High-energy x-ray scattering studies of battery materials

    DOE PAGES

    Glazer, Matthew P. B.; Okasinski, John S.; Almer, Jonathan D.; ...

    2016-06-08

    High-energy x-ray (HEX) scattering is a sensitive and powerful tool to nondestructively probe the atomic and mesoscale structures of battery materials under synthesis and operational conditions. The penetration power of HEXs enables the use of large, practical samples and realistic environments, allowing researchers to explore the inner workings of batteries in both laboratory and commercial formats. This article highlights the capability and versatility of HEX techniques, particularly from synchrotron sources, to elucidate materials synthesis processes and thermal instability mechanisms in situ, to understand (dis)charging mechanisms in operando under a variety of cycling conditions, and to spatially resolve electrode/electrolyte responses to highlight connections between inhomogeneity and performance. Such studies have increased our understanding of the fundamental mechanisms underlying battery performance. Here, by deepening our understanding of the linkages between microstructure and overall performance, HEXs represent a powerful tool for validating existing batteries and shortening battery-development timelines.

  5. High-energy x-ray scattering studies of battery materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glazer, Matthew P. B.; Okasinski, John S.; Almer, Jonathan D.

    High-energy x-ray (HEX) scattering is a sensitive and powerful tool to nondestructively probe the atomic and mesoscale structures of battery materials under synthesis and operational conditions. The penetration power of HEXs enables the use of large, practical samples and realistic environments, allowing researchers to explore the inner workings of batteries in both laboratory and commercial formats. This article highlights the capability and versatility of HEX techniques, particularly from synchrotron sources, to elucidate materials synthesis processes and thermal instability mechanisms in situ, to understand (dis)charging mechanisms in operando under a variety of cycling conditions, and to spatially resolve electrode/electrolyte responses to highlight connections between inhomogeneity and performance. Such studies have increased our understanding of the fundamental mechanisms underlying battery performance. Here, by deepening our understanding of the linkages between microstructure and overall performance, HEXs represent a powerful tool for validating existing batteries and shortening battery-development timelines.

  6. Supersonic civil airplane study and design: Performance and sonic boom

    NASA Technical Reports Server (NTRS)

    Cheung, Samson

    1995-01-01

    Since aircraft configuration plays an important role in aerodynamic performance and sonic boom shape, the configuration of the next generation supersonic civil transport has to be tailored to meet high aerodynamic performance and low sonic boom requirements. Computational fluid dynamics (CFD) can be used to design airplanes to meet these dual objectives. The work and results in this report are used to support NASA's High Speed Research Program (HSRP). CFD tools and techniques have been developed for general usages of sonic boom propagation study and aerodynamic design. Parallel to the research effort on sonic boom extrapolation, CFD flow solvers have been coupled with a numeric optimization tool to form a design package for aircraft configuration. This CFD optimization package has been applied to configuration design on a low-boom concept and an oblique all-wing concept. A nonlinear unconstrained optimizer for Parallel Virtual Machine has been developed for aerodynamic design and study.

  7. New perspectives in hydrodynamic radial polishing techniques for optical surfaces

    NASA Astrophysics Data System (ADS)

    Ruiz, Elfego; Sohn, Erika; Luna, Esteban; Salas, Luis; Cordero, Alberto; González, Jorge; Núñez, Manuel; Salinas, Javier; Cruz-González, Irene; Valdés, Jorge; Cabrera, Victor; Martínez, Benjamín

    2004-09-01

    To overcome the limitations of classic polishing techniques, a novel hydrodynamic radial polishing tool (HyDRa) is presented; it is useful for the corrective lapping and fine polishing of diverse materials by means of a low-cost abrasive flux and a hydrostatic suspension system that avoids contact of the tool with the working surface. This tool enables work on flat or curved surfaces of currently up to two and a half meters in diameter. It has the advantage of avoiding fallen edges during the polishing process as well as reducing tool wear and deformation. The operating principle is based on the generation of a high-velocity, high-pressure abrasive emulsion flux with radial geometry. The polishing process is repeatable through control of the tool's operational parameters, achieving high degrees of precision and accuracy on optical and semiconductor surfaces, with removal rates of up to 9 mm³/hour and promising excellent surface polishing qualities. An additional advantage of this new tool is the possibility of performing interferometric measurements during the polishing process without dismounting the working surface. A series of advantages of this method, numerical simulations, and experimental results are described.

  8. High-performance wire-grid polarizers using jet and Flash™ imprint lithography

    NASA Astrophysics Data System (ADS)

    Ahn, Se Hyun; Yang, Shuqiang; Miller, Mike; Ganapathisubramanian, Maha; Menezes, Marlon; Choi, Jin; Xu, Frank; Resnick, Douglas J.; Sreenivasan, S. V.

    2013-07-01

    Extremely large-area roll-to-roll (R2R) manufacturing on flexible substrates is ubiquitous for applications such as paper and plastic processing. It combines the benefits of high speed and inexpensive substrates to deliver a commodity product at low cost. The challenge is to extend this approach to the realm of nanopatterning and realize similar benefits. In order to achieve low-cost nanopatterning, it is imperative to move toward high-speed imprinting, less complex tools, near zero waste of consumables, and low-cost substrates. We have developed a roll-based J-FIL process and applied it to a technology demonstrator tool, the LithoFlex 100, to fabricate large-area flexible bilayer wire-grid polarizers (WGPs) and high-performance WGPs on rigid glass substrates. Extinction ratios of better than 10,000 are obtained for the glass-based WGPs. Two simulation packages are also employed to understand the effects of pitch, aluminum thickness, and pattern defectivity on the optical performance of the WGP devices. It is determined that the WGPs can be influenced by both clear and opaque defects in the gratings; however, the defect densities are relaxed relative to the requirements of a high-density semiconductor device.
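
    For orientation, the extinction ratio quoted above is, in the usual convention, the ratio of transmittance for the pass (TM) polarization to that for the blocked (TE) polarization of the wire grid; the symbols below are editorial shorthand rather than the paper's own notation:

```latex
% Wire-grid polarizer extinction ratio (conventional definition):
% T_TM is the transmittance of the pass polarization, T_TE of the blocked one.
\mathrm{ER} = \frac{T_{\mathrm{TM}}}{T_{\mathrm{TE}}},
\qquad \mathrm{ER} > 10^{4} \ \text{for the glass-based WGPs reported here}
```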

  9. Just-in-time adaptive disturbance estimation for run-to-run control of photolithography overlay

    NASA Astrophysics Data System (ADS)

    Firth, Stacy K.; Campbell, W. J.; Edgar, Thomas F.

    2002-07-01

    One of the main challenges to implementing traditional run-to-run control in the semiconductor industry is a high mix of products in a single factory. To address this challenge, Just-in-time Adaptive Disturbance Estimation (JADE) has been developed. JADE uses a recursive weighted least-squares parameter estimation technique to identify the contributions to variation that depend on the product, as well as on the tools on which the lot was processed. As applied to photolithography overlay, JADE assigns these sources of variation to contributions from the context items: tool, product, reference tool, and reference reticle. Simulations demonstrate that JADE effectively identifies disturbances in contributing context items when the variations are known to be additive. The superior performance of JADE over traditional EWMA is also shown in these simulations. The results of applying JADE to data from a high-mix production facility show that JADE still performs better than EWMA, even with the challenges of a real manufacturing environment.
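
    The JADE implementation itself is not given in this record; purely as an illustration of the general idea, the sketch below estimates additive, context-dependent disturbances with recursive least squares and a forgetting factor, alongside a plain EWMA update for comparison. All names, the context layout, and the parameter values are assumptions made for the example, not JADE's actual code.

```python
import numpy as np

def ewma_update(d_prev, error, lam=0.3):
    """Classic run-to-run EWMA disturbance estimate."""
    return lam * error + (1.0 - lam) * d_prev

class RecursiveLeastSquares:
    """Additive context-disturbance estimator (illustrative, not JADE itself).

    Each lot is described by a 0/1 design vector x selecting the active
    context items (e.g., tool, product, reference tool, reference reticle);
    theta holds one additive disturbance estimate per context item.
    """
    def __init__(self, n_contexts, forgetting=0.95):
        self.theta = np.zeros(n_contexts)
        self.P = np.eye(n_contexts) * 1e3   # large initial covariance
        self.lam = forgetting

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        err = y - x @ self.theta            # innovation for this lot
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)        # gain vector
        self.theta += k * err
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return self.theta

# Example: an overlay error y observed for a lot whose active context items
# are the third tool and the first product (indices are arbitrary here).
rls = RecursiveLeastSquares(n_contexts=6)
x = np.array([0, 0, 1, 1, 0, 0])
print(rls.update(x, y=0.012))
```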

  10. Simulating Effects of High Angle of Attack on Turbofan Engine Performance

    NASA Technical Reports Server (NTRS)

    Liu, Yuan; Claus, Russell W.; Litt, Jonathan S.; Guo, Ten-Huei

    2013-01-01

    A method of investigating the effects of high angle of attack (AOA) flight on turbofan engine performance is presented. The methodology involves combining a suite of diverse simulation tools. Three-dimensional, steady-state computational fluid dynamics (CFD) software is used to model the change in performance of a commercial aircraft-type inlet and fan geometry due to various levels of AOA. Parallel compressor theory is then applied to assimilate the CFD data with a zero-dimensional, nonlinear, dynamic turbofan engine model. The combined model shows that high AOA operation degrades fan performance and, thus, negatively impacts compressor stability margins and engine thrust. In addition, the engine response to high AOA conditions is shown to be highly dependent upon the type of control system employed.

  11. [Application of water jet ERBEJET 2 in salivary glands surgery].

    PubMed

    Gasiński, Mateusz; Modrzejewski, Maciej; Cenda, Paweł; Nazim-Zygadło, Elzbieta; Kozok, Andrzej; Dobosz, Paweł

    2009-09-01

    The anatomical location of the salivary glands requires high precision from the surgeon when operating at this site. The waterjet is one of the modern tools that allow minimally invasive operating procedures to be performed. This tool helps to separate pathological structures from healthy tissue with a stream of high-pressure saline pumped to the operating area via specially designed applicators. The stream of fluid is generated by a double-piston pump at an adjustable pressure of 1 to 80 bar. This allows precise removal of tumors, sparing of nerves and vessels in glandular tissue, and minimal use of electrocoagulation. The waterjet is a modern tool that can help to improve patient safety and the comfort of the surgeon's work.

  12. Local Alignment Tool Based on Hadoop Framework and GPU Architecture

    PubMed Central

    Hung, Che-Lun; Hua, Guan-Jie

    2014-01-01

    With the rapid growth of next-generation sequencing technologies, such as Slex, more and more data have been discovered and published. To analyze such huge data sets, computational performance is an important issue. Recently, many tools, such as SOAP, have been implemented on Hadoop and GPU parallel computing architectures. BLASTP is an important tool, implemented on GPU architectures, for biologists to compare protein sequences. To deal with big biology data, it is hard to rely on a single GPU. Therefore, we implement a distributed BLASTP by combining Hadoop and multiple GPUs. The experimental results show that the proposed method improves on the performance of BLASTP on a single GPU, and that it also achieves high availability and fault tolerance. PMID:24955362

  13. Local alignment tool based on Hadoop framework and GPU architecture.

    PubMed

    Hung, Che-Lun; Hua, Guan-Jie

    2014-01-01

    With the rapid growth of next-generation sequencing technologies, such as Slex, more and more data have been discovered and published. To analyze such huge data sets, computational performance is an important issue. Recently, many tools, such as SOAP, have been implemented on Hadoop and GPU parallel computing architectures. BLASTP is an important tool, implemented on GPU architectures, for biologists to compare protein sequences. To deal with big biology data, it is hard to rely on a single GPU. Therefore, we implement a distributed BLASTP by combining Hadoop and multiple GPUs. The experimental results show that the proposed method improves on the performance of BLASTP on a single GPU, and that it also achieves high availability and fault tolerance.

  14. Visiting Vehicle Ground Trajectory Tool

    NASA Technical Reports Server (NTRS)

    Hamm, Dustin

    2013-01-01

    The International Space Station (ISS) Visiting Vehicle Group needed a targeting tool for vehicles that rendezvous with the ISS. The Visiting Vehicle Ground Trajectory targeting tool provides the ability to perform both realtime and planning operations for the Visiting Vehicle Group. This tool provides a highly reconfigurable base, which allows the Visiting Vehicle Group to perform their work. The application is composed of a telemetry processing function, a relative motion function, a targeting function, a vector view, and 2D/3D world map type graphics. The software tool provides the ability to plan a rendezvous trajectory for vehicles that visit the ISS. It models these relative trajectories using planned and realtime data from the vehicle. The tool monitors ongoing rendezvous trajectory relative motion, and ensures visiting vehicles stay within agreed corridors. The software provides the ability to update or re-plan a rendezvous to support contingency operations. Previously, new parameters could not be added and incorporated into the system on the fly; if an unanticipated capability was not identified until the vehicle was flying, there was no way to update the tool.

  15. Estimating learning outcomes from pre- and posttest student self-assessments: a longitudinal study.

    PubMed

    Schiekirka, Sarah; Reinhardt, Deborah; Beißbarth, Tim; Anders, Sven; Pukrop, Tobias; Raupach, Tobias

    2013-03-01

    Learning outcome is an important measure for overall teaching quality and should be addressed by comprehensive evaluation tools. The authors evaluated the validity of a novel evaluation tool based on student self-assessments, which may help identify specific strengths and weaknesses of a particular course. In 2011, the authors asked 145 fourth-year students at Göttingen Medical School to self-assess their knowledge on 33 specific learning objectives in a pretest and posttest as part of a cardiorespiratory module. The authors compared performance gain calculated from self-assessments with performance gain derived from formative examinations that were closely matched to these 33 learning objectives. Eighty-three students (57.2%) completed the assessment. There was good agreement between performance gain derived from subjective data and performance gain derived from objective examinations (Pearson r=0.78; P<.0001) on the group level. The association between the two measures was much weaker when data were analyzed on the individual level. Further analysis determined a quality cutoff for performance gain derived from aggregated student self-assessments. When using this cutoff, the evaluation tool was highly sensitive in identifying specific learning objectives with favorable or suboptimal objective performance gains. The tool is easy to implement, takes initial performance levels into account, and does not require extensive pre-post testing. By providing valid estimates of actual performance gain obtained during a teaching module, it may assist medical teachers in identifying strengths and weaknesses of a particular course on the level of specific learning objectives.
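
    The comparison at the heart of the study can be illustrated with a minimal calculation of the two gain measures and their agreement; the numbers below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical group-mean pre/post self-assessment scores per learning objective
self_pre  = np.array([2.1, 3.0, 1.8, 2.5])
self_post = np.array([3.9, 4.1, 3.2, 3.0])

# Hypothetical pre/post formative-examination scores for the same objectives
exam_pre  = np.array([0.35, 0.50, 0.30, 0.45])
exam_post = np.array([0.80, 0.78, 0.62, 0.55])

gain_subjective = self_post - self_pre   # performance gain from self-assessment
gain_objective  = exam_post - exam_pre   # performance gain from examinations

# Agreement between the two gain measures (the study reports Pearson r = 0.78
# on the group level)
r = np.corrcoef(gain_subjective, gain_objective)[0, 1]
print(f"Pearson r = {r:.2f}")
```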

  16. How Toddlers Acquire and Transfer Tool Knowledge: Developmental Changes and the Role of Executive Functions.

    PubMed

    Pauen, Sabina; Bechtel-Kuehne, Sabrina

    2016-07-01

    This report investigates tool learning and its relations to executive functions (EFs) in toddlers. In Study 1 (N = 93), 18-, 20-, 22-, and 24-month-old children learned equally well to choose a correct tool from observation, whereas performance based on feedback improved with age. Knowledge transfer showed significant progress after 22 months of age: Older children ignored irrelevant features more easily and adjusted their behavior more flexibly. Study 2 (N = 62) revealed that spontaneous transfer in 22- to 24-month-olds was related to set-shifting skills and response inhibition. Flexible adaptation to feedback correlated with working-memory capacity. These findings suggest that toddlerhood is a highly dynamic phase of tool learning and that EFs are related to transfer performance at this age. © 2016 The Authors. Child Development © 2016 Society for Research in Child Development, Inc.

  17. Development of RAD-Score: A Tool to Assess the Procedural Competence of Diagnostic Radiology Residents.

    PubMed

    Isupov, Inga; McInnes, Matthew D F; Hamstra, Stan J; Doherty, Geoffrey; Gupta, Ashish; Peddle, Susan; Jibri, Zaid; Rakhra, Kawan; Hibbert, Rebecca M

    2017-04-01

    The purpose of this study is to develop a tool to assess the procedural competence of radiology trainees, with sources of evidence gathered from five categories to support the construct validity of the tool: content, response process, internal structure, relations to other variables, and consequences. A pilot form for assessing procedural competence among radiology residents, known as the RAD-Score tool, was developed by evaluating published literature and using a modified Delphi procedure involving a group of local content experts. The pilot version of the tool was tested by seven radiology department faculty members who evaluated procedures performed by 25 residents at one institution between October 2014 and June 2015. Residents were evaluated while performing multiple procedures in both clinical and simulation settings. The main outcome measure was the percentage of residents who were considered ready to perform procedures independently, with testing conducted to determine differences between levels of training. A total of 105 forms (for 52 procedures performed in a clinical setting and 53 procedures performed in a simulation setting) were collected for a variety of procedures (eight vascular or interventional, 42 body, 12 musculoskeletal, 23 chest, and 20 breast procedures). A statistically significant difference was noted in the percentage of trainees who were rated as being ready to perform a procedure independently (in postgraduate year [PGY] 2, 12% of residents; in PGY3, 61%; in PGY4, 85%; and in PGY5, 88%; p < 0.05); this difference persisted in the clinical and simulation settings. User feedback and psychometric analysis were used to create a final version of the form. This prospective study describes the successful development of a tool for assessing the procedural competence of radiology trainees with high levels of construct validity in multiple domains. Implementation of the tool in the radiology residency curriculum is planned and can play an instrumental role in the transition to competency-based radiology training.

  18. Combination of Ultrasonic Vibration and Cryogenic Cooling for Cutting Performance Improvement of Inconel 718 Turning

    NASA Astrophysics Data System (ADS)

    Lin, S. Y.; Chung, C. T.; Cheng, Y. Y.

    2011-01-01

    The main objective of this study is to develop a thermo-elastic-plastic coupling model, based on a combination of ultrasonically assisted cutting and cryogenic cooling, under large deformation for the Inconel 718 alloy machining process. The extent of improvement in cutting performance and tool life may be examined from this investigation. The critical value of the strain energy density of the workpiece will be utilized as the criterion for chip separation and discontinuous chip segmentation. Forced convection cooling and a hydrodynamic lubrication model will be considered and formulated in the model. The finite element method will be applied to create a complete numerical solution for this ultrasonic vibration cutting model. During the analysis, the cutting tool is incrementally advanced forward with superimposed ultrasonic vibration in a back-and-forth, step-by-step manner, from the incipient stage of tool-workpiece engagement to a steady state of chip formation; a whole simulation of the orthogonal cutting process under plane strain deformation is thus undertaken. The shear-angle fluctuation induced by high shear strength, the high shear strain rate, the variation of chip types and chip morphology, the tool-chip contact length, the temperature distributions within the workpiece, chip, and tool, and the periodic fluctuation in cutting forces can all be determined from the developed model. A complete comparison of machining characteristics between different combinations of ultrasonically assisted cutting and cryogenic cooling and conventional cutting operations can be acquired. Finally, high-speed turning experiments on Inconel 718 alloy will be conducted in the laboratory to validate the accuracy of the model, and the progressive flank wear, crater wear, notching, and chipping of the tool edge will also be measured.

  19. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data which helps in optimization and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations which were found to be consistent under varying operating conditions like operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
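
    For orientation, the two agreement statistics quoted above are commonly written as follows, with P_i the model-predicted values, O_i the observations, and \bar{O} the observed mean; the relative-error form shown is one common definition, and the paper's exact formula may differ:

```latex
% Mean relative error (one common definition) and Willmott's index of agreement
\mathrm{RE} = \frac{1}{n}\sum_{i=1}^{n}\frac{\lvert P_i - O_i\rvert}{O_i},
\qquad
d = 1 - \frac{\sum_{i=1}^{n}\left(P_i - O_i\right)^2}
             {\sum_{i=1}^{n}\bigl(\lvert P_i - \bar{O}\rvert + \lvert O_i - \bar{O}\rvert\bigr)^2}
```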

  20. Development of Minimally Invasive Medical Tools Using Laser Processing on Cylindrical Substrates

    NASA Astrophysics Data System (ADS)

    Haga, Yoichi; Muyari, Yuta; Goto, Shoji; Matsunaga, Tadao; Esashi, Masayoshi

    This paper reports micro-fabrication techniques using laser processing on cylindrical substrates for the realization of small, high-performance, multifunctional, minimally invasive medical tools. A spring-shaped shape memory alloy (SMA) micro-coil with a square cross section has been fabricated by spiral cutting of a Ti-Ni SMA tube with a femtosecond laser. A small-diameter active bending catheter, actuated by a hydraulic suction mechanism for intravascular minimally invasive diagnostics and therapy, has also been developed. The catheter is made of a Ti-Ni super elastic alloy (SEA) tube processed by laser micromachining and a silicone rubber tube that covers the outside of the SEA tube. The active catheter is effective for insertion into blood vessel branches that diverge at acute angles, which are otherwise difficult to navigate. Multilayer metallization and patterning have been performed on glass tubes with 2 and 3 mm external diameters using maskless lithography with a laser exposure system. Using a laser soldering technique, integrated circuit parts have been mounted on a multilayer circuit patterned on a glass tube. These fabrication techniques will be effective for the realization of high-performance multifunctional catheters, endoscopic tools, and implanted small capsules.

  1. Considerations on the Use of Custom Accelerators for Big Data Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Minutoli, Marco

    Accelerators, including Graphic Processing Units (GPUs) for general purpose computation and many-core designs with wide vector units (e.g., Intel Phi), have become a common component of many high performance clusters. The appearance of more stable and reliable tools that can automatically convert code written in high-level specifications with annotations (such as C or C++) to hardware description languages (High-Level Synthesis - HLS) is also setting the stage for a broader use of reconfigurable devices (e.g., Field Programmable Gate Arrays - FPGAs) in high performance systems for the implementation of custom accelerators, helped by the fact that new processors include advanced cache-coherent interconnects for these components. In this chapter, we briefly survey the status of the use of accelerators in high performance systems targeted at big data analytics applications. We argue that, although the progress in the use of accelerators for this class of applications has been significant, differently from scientific simulations there still are gaps to close. This is particularly true for the "irregular" behaviors exhibited by no-SQL, graph databases. We focus our attention on the limits of HLS tools for data analytics and graph methods, and discuss a new architectural template that better fits the requirements of this class of applications. We validate the new architectural template by modifying the Graph Engine for Multithreaded Systems (GEMS) framework to support accelerators generated with such a methodology, and by testing with queries coming from the Lehigh University Benchmark (LUBM). The architectural template better supports the task and memory level parallelism present in graph methods through a new control model and an enhanced memory interface. We show that our solution allows generating parallel accelerators, providing speedups with respect to conventional HLS flows. We finally draw conclusions and present a perspective on the use of reconfigurable devices and Design Automation tools for data analytics.

  2. Experimental and Numerical Examination of the Thermal Transmittance of High Performance Window Frames

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustavsen Ph.D., Arild; Goudey, Howdy; Kohler, Christian

    2010-06-17

    While window frames typically represent 20-30 percent of the overall window area, their impact on the total window heat transfer rates may be much larger. This effect is even greater in low-conductance (highly insulating) windows which incorporate very low conductance glazings. Developing low-conductance window frames requires accurate simulation tools for product research and development. The Passivhaus Institute in Germany states that windows (glazing and frames, combined) should have U-values not exceeding 0.80 W/(m²·K). This has created a niche market for highly insulating frames, with frame U-values typically around 0.7-1.0 W/(m²·K). The U-values reported are often based on numerical simulations according to international simulation standards. It is prudent to check the accuracy of these calculation standards, especially for high performance products, before more manufacturers begin to use them to improve other product offerings. In this paper the thermal transmittance of five highly insulating window frames (three wooden frames, one aluminum frame and one PVC frame), found from numerical simulations and experiments, is compared. Hot box calorimeter results are compared with numerical simulations according to ISO 10077-2 and ISO 15099. In addition, CFD simulations have been carried out in order to use the most accurate tool available to investigate the convection and radiation effects inside the frame cavities. Our results show that available tools commonly used to evaluate window performance, based on ISO standards, give good overall agreement, but specific areas need improvement.
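
    As a reminder of the quantity being compared, the thermal transmittance U relates the steady-state heat flow through a component to its area and the air-to-air temperature difference across it; this is the relation that both the hot box measurements and the ISO calculations ultimately report (the symbols here are editorial, not from the paper):

```latex
% Thermal transmittance (U-value): heat flow rate per unit area per kelvin
% of air-to-air temperature difference across the component
U = \frac{\dot{Q}}{A\,\left(T_{\mathrm{in}} - T_{\mathrm{out}}\right)}
\qquad \left[\mathrm{W\,m^{-2}\,K^{-1}}\right]
```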

  3. A flexible tool for hydraulic and water quality performance analysis of green infrastructure

    NASA Astrophysics Data System (ADS)

    Massoudieh, A.; Alikhani, J.

    2017-12-01

    Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. To be useful for evaluating the effect of design configurations on the long-term performance of GIs, models should represent the processes within GIs with good fidelity. In this presentation, a sophisticated yet flexible tool for hydraulic and water quality assessment of GIs will be introduced. The tool can be used by design engineers and researchers to capture and explore the effect of design factors and of the properties of the media employed on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool that is capable of accurately simulating GI system components and the specific biogeochemical processes affecting contaminants, such as evapotranspiration, plant uptake, reactions, and particle-associated transport, while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. The process-based model framework developed here can be used to model a diverse range of GI practices such as stormwater ponds, green roofs, retention ponds, bioretention systems, infiltration trenches, permeable pavement, and other custom-designed combinatory systems. An example of the application of the system to evaluate the performance of a rain-garden system will be demonstrated.

  4. Machinability of Green Powder Metallurgy Components: Part I. Characterization of the Influence of Tool Wear

    NASA Astrophysics Data System (ADS)

    Robert-Perron, Etienne; Blais, Carl; Pelletier, Sylvain; Thomas, Yannig

    2007-06-01

    The green machining process is an interesting approach for addressing the mediocre machining behavior of high-performance powder metallurgy (PM) steels. This process appears to be a promising method for extending tool life and reducing machining costs. Recent improvements in binder/lubricant technologies have led to high green strength systems that enable green machining. So far, tool wear has been considered negligible when characterizing the machinability of green PM specimens. This inaccurate assumption may lead to the selection of suboptimal cutting conditions. The first part of this study involves the optimization of the machining parameters to minimize the effects of tool wear on machinability in the turning of green PM components. The second part of our work compares the sintered mechanical properties of components machined in the green state with those of components machined after sintering.

  5. Validity and applicability of a video-based animated tool to assess mobility in elderly Latin American populations

    PubMed Central

    Guerra, Ricardo Oliveira; Oliveira, Bruna Silva; Alvarado, Beatriz Eugenia; Curcio, Carmen Lucia; Rejeski, W Jack; Marsh, Anthony P; Ip, Edward H; Barnard, Ryan T; Guralnik, Jack M; Zunzunegui, Maria Victoria

    2016-01-01

    Aim To assess the reliability and the validity of Portuguese- and Spanish-translated versions of the video-based short-form Mobility Assessment Tool in assessing self-reported mobility, and to provide evidence for the applicability of these videos in elderly Latin American populations as a complement to physical performance measures. Methods The sample consisted of 300 elderly participants (150 from Brazil, 150 from Colombia) recruited at neighborhood social centers. Mobility was assessed with the Mobility Assessment Tool, and compared with the Short Physical Performance Battery score and self-reported functional limitations. Reliability was calculated using intraclass correlation coefficients. Multiple linear regression analyses were used to assess associations among mobility assessment tools and health, and sociodemographic variables. Results A significant gradient of increasing Mobility Assessment Tool score with better physical function was observed for both self-reported and objective measures, and in each city. Associations between self-reported mobility and health were strong, and significant. Mobility Assessment Tool scores were lower in women at both sites. Intraclass correlation coefficients of the Mobility Assessment Tool were 0.94 (95% confidence interval 0.90–0.97) in Brazil and 0.81 (95% confidence interval 0.66–0.91) in Colombia. Mobility Assessment Tool scores were lower in Manizales than in Natal after adjustment by Short Physical Performance Battery, self-rated health and sex. Conclusions These results provide evidence for high reliability and good validity of the Mobility Assessment Tool in its Spanish and Portuguese versions used in Latin American populations. In addition, the Mobility Assessment Tool can detect mobility differences related to environmental features that cannot be captured by objective performance measures. PMID:24666718

  6. Accuracy of Nutritional Screening Tools in Assessing the Risk of Undernutrition in Hospitalized Children.

    PubMed

    Huysentruyt, Koen; Devreker, Thierry; Dejonckheere, Joachim; De Schepper, Jean; Vandenplas, Yvan; Cools, Filip

    2015-08-01

    The aim of the present study was to evaluate the predictive accuracy of screening tools for assessing nutritional risk in hospitalized children in developed countries. The study involved a systematic review of literature (MEDLINE, EMBASE, and Cochrane Central databases up to January 17, 2014) of studies on the diagnostic performance of pediatric nutritional screening tools. Methodological quality was assessed using a modified QUADAS tool. Sensitivity and specificity were calculated for each screening tool per validation method. A meta-analysis was performed to estimate the risk ratio of different screening result categories of being truly at nutritional risk. A total of 11 studies were included on ≥1 of the following screening tools: Pediatric Nutritional Risk Score, Screening Tool for the Assessment of Malnutrition in Paediatrics, Paediatric Yorkhill Malnutrition Score, and Screening Tool for Risk on Nutritional Status and Growth. Because of variation in reference standards, a direct comparison of the predictive accuracy of the screening tools was not possible. A meta-analysis was performed on 1629 children from 7 different studies. The risk ratio of being truly at nutritional risk was 0.349 (95% confidence interval [CI] 0.16-0.78) for children in the low versus moderate screening category and 0.292 (95% CI 0.19-0.44) in the moderate versus high screening category. There is insufficient evidence to choose 1 nutritional screening tool over another based on their predictive accuracy. The estimated risk of being at "true nutritional risk" increases with each category of screening test result. Each screening category should be linked to a specific course of action, although further research is needed.

  7. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable to assess individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS's ascent tool (POST2) and a new tool to optimize the full problem by operating both simulations simultaneously was born.

  8. The iPlant collaborative: cyberinfrastructure for enabling data to discovery for the life sciences

    USDA-ARS?s Scientific Manuscript database

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning m...

  9. A parallel calibration utility for WRF-Hydro on high performance computers

    NASA Astrophysics Data System (ADS)

    Wang, J.; Wang, C.; Kotamarthi, V. R.

    2017-12-01

    A successful modeling of complex hydrological processes comprises establishing an integrated hydrological model which simulates the hydrological processes in each water regime, calibrating and validating the model performance based on observation data, and estimating the uncertainties from different sources, especially those associated with parameters. Such a model system requires large computing resources and often has to be run on High Performance Computers (HPC). The recently developed WRF-Hydro modeling system provides a significant advancement in the capability to simulate regional water cycles more completely. The WRF-Hydro model has a large range of parameters, such as those in the input table files — GENPARM.TBL, SOILPARM.TBL and CHANPARM.TBL — and several distributed scaling factors such as OVROUGHRTFAC. These parameters affect the behavior and outputs of the model and thus may need to be calibrated against observations in order to obtain a good modeling performance. Having a parameter calibration tool specifically for automated calibration and uncertainty estimation of the WRF-Hydro model can provide significant convenience for the modeling community. In this study, we developed a customized tool using the parallel version of the model-independent parameter estimation and uncertainty analysis tool PEST, enabling it to run on HPC systems with the PBS and SLURM workload managers and job schedulers. We also developed a series of PEST input file templates specifically for WRF-Hydro model calibration and uncertainty analysis. Here we present a flood case study that occurred in April 2013 over the Midwest. The sensitivity and uncertainties are analyzed using the customized PEST tool we developed.

  10. Review of Real-Time Simulator and the Steps Involved for Implementation of a Model from MATLAB/SIMULINK to Real-Time

    NASA Astrophysics Data System (ADS)

    Mikkili, Suresh; Panda, Anup Kumar; Prattipati, Jayanthi

    2015-06-01

    Nowadays researchers want to develop their models in a real-time environment. Simulation tools have been widely used for the design and improvement of electrical systems since the mid twentieth century. The evolution of simulation tools has progressed in step with the evolution of computing technologies. In recent years, computing technologies have improved dramatically in performance and become widely available at a steadily decreasing cost. Consequently, simulation tools have also seen dramatic performance gains and steady cost decreases. Researchers and engineers now have access to affordable, high-performance simulation tools that were previously too cost prohibitive for all but the largest manufacturers. This work introduces a specific class of digital simulator known as a real-time simulator by answering the questions "what is real-time simulation", "why is it needed", and "how does it work". The latest trend in real-time simulation consists of exporting simulation models to FPGAs. In this article, the steps involved in implementing a model from MATLAB/SIMULINK in real time are provided in detail.

  11. Next Generation Nuclear Plant Methods Research and Development Technical Program Plan -- PLN-2498

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg

    2008-09-01

    One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.

  12. Next Generation Nuclear Plant Methods Technical Program Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg

    2010-12-01

    One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.

  13. Next Generation Nuclear Plant Methods Technical Program Plan -- PLN-2498

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg

    2010-09-01

    One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.

  14. Improved MRF spot characterization with QIS metrology

    NASA Astrophysics Data System (ADS)

    Westover, Sandi; Hall, Christopher; DeMarco, Michael

    2013-09-01

    Careful characterization of the removal function of sub-aperture polishing tools is critical for optimum polishing results. Magnetorheological finishing (MRF®) creates a polishing tool, or "spot", that is unique both for its locally high removal rate and high slope content. For a variety of reasons, which will be discussed, longer duration spots are beneficial to improving MRF performance, but longer spots yield higher slopes rendering them difficult to measure with adequate fidelity. QED's Interferometer for Stitching (QIS™) was designed to measure the high slope content inherent to non-null sub-aperture stitching interferometry of aspheres. Based on this unique capability the QIS was recently used to measure various MRF spots in an attempt to see if there was a corresponding improvement in MRF performance as a result of improved knowledge of these longer duration spots. The results of these tests will be presented and compared with those of a standard general purpose interferometer.

  15. GAC: Gene Associations with Clinical, a web based application.

    PubMed

    Zhang, Xinyan; Rupji, Manali; Kowalski, Jeanne

    2017-01-01

    We present GAC, a Shiny R-based tool for interactive visualization of clinical associations based on high-dimensional data. The tool provides a web-based suite to perform supervised principal component analysis (SuperPC), an approach that uses high-dimensional data, such as gene expression, combined with clinical data to infer clinical associations. We extended the approach to address binary outcomes, in addition to continuous and time-to-event data, in our package, thereby increasing the use and flexibility of SuperPC. Additionally, the tool provides an interactive visualization for summarizing results based on a forest plot for both binary and time-to-event data. In summary, the GAC suite of tools provides a one-stop shop for conducting statistical analyses to identify and visualize the association between a clinical outcome of interest and high-dimensional data types, such as genomic data. Our GAC package has been implemented in R and is available via http://shinygispa.winship.emory.edu/GAC/. The developmental repository is available at https://github.com/manalirupji/GAC.
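
    GAC itself is an R/Shiny application; purely as an illustration of the supervised principal component idea it builds on (screen features by univariate association with the outcome, then take principal components of only the retained features and use the leading component as the clinical predictor), here is a minimal sketch with invented names and data, not GAC's actual code.

```python
import numpy as np

def supervised_pc(X, y, n_keep=50):
    """Toy supervised principal component analysis (illustration only).

    X : (samples, features) expression-like matrix
    y : continuous clinical outcome
    Returns the first principal component score of the retained features.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # 1. Screen: absolute univariate correlation of each feature with the outcome
    denom = Xc.std(axis=0) * yc.std() * len(y)
    scores = np.abs(Xc.T @ yc) / np.where(denom == 0, 1.0, denom)
    keep = np.argsort(scores)[-n_keep:]
    # 2. PCA on the retained features only
    U, S, Vt = np.linalg.svd(Xc[:, keep], full_matrices=False)
    pc1 = U[:, 0] * S[0]
    # 3. pc1 would then be used as the predictor in a regression / survival model
    return pc1

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 500))
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=80)
print(supervised_pc(X, y)[:5])
```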

  16. An evaluation of copy number variation detection tools for cancer using whole exome sequencing data.

    PubMed

    Zare, Fatima; Dow, Michelle; Monteleone, Nicholas; Hosny, Abdelrahman; Nabavi, Sheida

    2017-05-31

    Recently, copy number variation (CNV) has gained considerable interest as a type of genomic/genetic variation that plays an important role in disease susceptibility. Advances in sequencing technology have created an opportunity for detecting CNVs more accurately. Recently, whole exome sequencing (WES) has become the primary strategy for sequencing patient samples and studying their genomic aberrations. However, compared to whole genome sequencing, WES introduces more biases and noise that make CNV detection very challenging. Additionally, the complexity of tumors makes the detection of cancer-specific CNVs even more difficult. Although many CNV detection tools have been developed since the introduction of NGS data, there are few tools for somatic CNV detection for WES data in cancer. In this study, we evaluated the performance of the most recent and commonly used CNV detection tools for WES data in cancer to address their limitations and provide guidelines for developing new ones. We focused on the tools that have been designed for, or have the ability to detect, cancer somatic aberrations. We compared the performance of the tools in terms of sensitivity and false discovery rate (FDR) using real data and simulated data. Comparative analysis of the results showed that there is low consensus among the tools in calling CNVs. Using real data, the tools show moderate sensitivity (~50% - ~80%), fair specificity (~70% - ~94%) and poor FDRs (~27% - ~60%). Also, using simulated data, we observed that increasing the coverage more than 10× in exonic regions does not significantly improve the detection power of the tools. The limited performance of the current CNV detection tools for WES data in cancer indicates the need for developing more efficient and precise CNV detection methods. Due to the complexity of tumors and the high level of noise and biases in WES data, it is necessary to employ advanced novel segmentation, normalization, and de-noising techniques that are designed specifically for cancer data. Also, CNV detection development suffers from the lack of a gold standard for performance evaluation. Finally, developing tools with user-friendly interfaces and visualization features can enhance CNV studies for a broader range of users.
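
    For reference, the evaluation metrics reported above are the standard ones computed from true/false positive and negative calls (TP, FP, TN, FN):

```latex
\mathrm{Sensitivity} = \frac{TP}{TP + FN},
\qquad
\mathrm{Specificity} = \frac{TN}{TN + FP},
\qquad
\mathrm{FDR} = \frac{FP}{FP + TP}
```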

  17. Dutch national survey to test the STRONGkids nutritional risk screening tool in hospitalized children.

    PubMed

    Hulst, Jessie M; Zwart, Henrike; Hop, Wim C; Joosten, Koen F M

    2010-02-01

    Children admitted to the hospital are at risk of developing malnutrition. The aim of the present study was to investigate the feasibility and value of a new nutritional risk screening tool, called STRONG(kids), in a nationwide study. A prospective observational multi-centre study was performed in 44 Dutch hospitals (7 academic and 37 general), over three consecutive days during the month of November 2007. The STRONG(kids) screening tool consisted of 4 items: (1) subjective clinical assessment, (2) high risk disease, (3) nutritional intake, (4) weight loss. Measurements of weight and length were performed. SD-scores <-2 for weight-for-height and height-for-age were considered to indicate acute and chronic malnutrition respectively. A total of 424 children were included. Median age was 3.5 years and median hospital stay was 2 days. Sixty-two percent of the children were classified "at risk" of developing malnutrition by the STRONG(kids) tool. Children at risk had significantly lower SD-scores for weight-for-height, a higher prevalence of acute malnutrition and a longer hospital stay compared to children with no nutritional risk. The nutritional risk screening tool STRONG(kids) was successfully applied to 98% of the children. Using this tool, a significant relationship was found between having a "high risk" score, a negative SD-score in weight-for-height and a prolonged hospital stay. Copyright 2009 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  18. Multicenter Validation of a Customizable Scoring Tool for Selection of Trainees for a Residency or Fellowship Program. The EAST-IST Study.

    PubMed

    Bosslet, Gabriel T; Carlos, W Graham; Tybor, David J; McCallister, Jennifer; Huebert, Candace; Henderson, Ashley; Miles, Matthew C; Twigg, Homer; Sears, Catherine R; Brown, Cynthia; Farber, Mark O; Lahm, Tim; Buckley, John D

    2017-04-01

    Few data have been published regarding scoring tools for selection of postgraduate medical trainee candidates that have wide applicability. The authors present a novel scoring tool developed to assist postgraduate programs in generating an institution-specific rank list derived from selected elements of the U.S. Electronic Residency Application System (ERAS) application. The authors developed and validated an ERAS and interview day scoring tool at five pulmonary and critical care fellowship programs: the ERAS Application Scoring Tool-Interview Scoring Tool. This scoring tool was then tested for intrarater correlation versus subjective rankings of ERAS applications. The process for development of the tool was performed at four other institutions, and it was performed alongside and compared with the "traditional" ranking methods at the five programs and compared with the submitted National Residency Match Program rank list. The ERAS Application Scoring Tool correlated highly with subjective faculty rankings at the primary institution (average Spearman's r = 0.77). The ERAS Application Scoring Tool-Interview Scoring Tool method correlated well with traditional ranking methodology at all five institutions (Spearman's r = 0.54, 0.65, 0.72, 0.77, and 0.84). This study validates a process for selecting and weighting components of the ERAS application and interview day to create a customizable, institution-specific tool for ranking candidates to postgraduate medical education programs. This scoring system can be used in future studies to compare the outcomes of fellowship training.

  19. Learning in an interactive simulation tool against landslide risks: the role of strength and availability of experiential feedback

    NASA Astrophysics Data System (ADS)

    Chaturvedi, Pratik; Arora, Akshit; Dutt, Varun

    2018-06-01

    Feedback via simulation tools is likely to help people improve their decision-making against natural disasters. However, little is known on how differing strengths of experiential feedback and feedback's availability in simulation tools influence people's decisions against landslides. We tested the influence of differing strengths of experiential feedback and feedback's availability on people's decisions against landslides in Mandi, Himachal Pradesh, India. Experiential feedback (high or low) and feedback's availability (present or absent) were varied across four between-subject conditions in a tool called the Interactive Landslide Simulation (ILS): high damage with feedback present, high damage with feedback absent, low damage with feedback present, and low damage with feedback absent. In high-damage conditions, the probabilities of damages to life and property due to landslides were 10 times higher than those in the low-damage conditions. In feedback-present conditions, experiential feedback was provided in numeric, text, and graphical formats in ILS. In feedback-absent conditions, the probabilities of damages were described; however, there was no experiential feedback present. Investments were greater in conditions where experiential feedback was present and damages were high compared to conditions where experiential feedback was absent and damages were low. Furthermore, only high-damage feedback produced learning in ILS. Simulation tools like ILS seem appropriate for landslide risk communication and for performing what-if analyses.

  20. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production quality simulation based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  1. Development and Validation of a Low Cost, Flexible, Open Source Robot for Use as a Teaching and Research Tool across the Educational Spectrum

    ERIC Educational Resources Information Center

    Howell, Abraham L.

    2012-01-01

    In the high tech factories of today robots can be used to perform various tasks that span a wide spectrum that encompasses the act of performing high-speed, automated assembly of cell phones, laptops and other electronic devices to the compounding, filling, packaging and distribution of life-saving pharmaceuticals. As robot usage continues to…

  2. Exploring Systems That Support Good Clinical Care in Indigenous Primary Health-care Services: A Retrospective Analysis of Longitudinal Systems Assessment Tool Data from High-Improving Services.

    PubMed

    Woods, Cindy; Carlisle, Karen; Larkins, Sarah; Thompson, Sandra Claire; Tsey, Komla; Matthews, Veronica; Bailie, Ross

    2017-01-01

    Continuous Quality Improvement is a process for raising the quality of primary health care (PHC) across Indigenous PHC services. In addition to clinical auditing using plan, do, study, and act cycles, engaging staff in a process of reflecting on systems to support quality care is vital. The One21seventy Systems Assessment Tool (SAT) supports staff to assess systems performance in terms of five key components. This study examines quantitative and qualitative SAT data from five high-improving Indigenous PHC services in northern Australia to understand the systems used to support quality care. High-improving services selected for the study were determined by calculating quality of care indices for Indigenous health services participating in the Audit and Best Practice in Chronic Disease National Research Partnership. Services that reported continuing high improvement in quality of care delivered across two or more audit tools in three or more audits were selected for the study. Precollected SAT data (from annual team SAT meetings) are presented longitudinally using radar plots for quantitative scores for each component, and content analysis is used to describe strengths and weaknesses of performance in each systems' component. High-improving services were able to demonstrate strong processes for assessing system performance and consistent improvement in systems to support quality care across components. Key strengths in the quality support systems included adequate and orientated workforce, appropriate health system supports, and engagement with other organizations and community, while the weaknesses included lack of service infrastructure, recruitment, retention, and support for staff and additional costs. Qualitative data revealed clear voices from health service staff expressing concerns with performance, and subsequent SAT data provided evidence of changes made to address concerns. Learning from the processes and strengths of high-improving services may be useful as we work with services striving to improve the quality of care provided in other areas.

  3. High-performance web services for querying gene and variant annotation.

    PubMed

    Xin, Jiwen; Mark, Adam; Afrasiabi, Cyrus; Tsueng, Ginger; Juchler, Moritz; Gopal, Nikhil; Stupp, Gregory S; Putman, Timothy E; Ainscough, Benjamin J; Griffith, Obi L; Torkamani, Ali; Whetzel, Patricia L; Mungall, Christopher J; Mooney, Sean D; Su, Andrew I; Wu, Chunlei

    2016-05-06

    Efficient tools for data management and integration are essential for many aspects of high-throughput biology. In particular, annotations of genes and human genetic variants are commonly used but highly fragmented across many resources. Here, we describe MyGene.info and MyVariant.info, high-performance web services for querying gene and variant annotation information. These web services are currently accessed more than three million times per month. They also demonstrate a generalizable cloud-based model for organizing and querying biological annotation information. MyGene.info and MyVariant.info are provided as high-performance web services, accessible at http://mygene.info and http://myvariant.info. Both are offered free of charge to the research community.
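
    As an illustration of how such annotation services are typically consumed, the minimal sketch below queries a gene symbol and then retrieves selected annotation fields over HTTP. The v3 endpoint paths and field names follow the public MyGene.info documentation but should be treated as assumptions here; the helper names and the `requests` dependency are ours.

    ```python
    import requests

    BASE = "https://mygene.info/v3"  # assumed current v3 REST endpoint

    def query_gene(symbol, species="human"):
        """Query MyGene.info for a gene symbol and return the top hit."""
        resp = requests.get(f"{BASE}/query",
                            params={"q": f"symbol:{symbol}", "species": species},
                            timeout=10)
        resp.raise_for_status()
        hits = resp.json().get("hits", [])
        return hits[0] if hits else None

    def get_annotation(gene_id, fields="symbol,name,refseq.rna"):
        """Fetch selected annotation fields for a gene id returned by the query."""
        resp = requests.get(f"{BASE}/gene/{gene_id}",
                            params={"fields": fields}, timeout=10)
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        hit = query_gene("CDK2")
        if hit:
            print(get_annotation(hit["_id"]))
    ```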

  4. Efficient Use of Distributed Systems for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Valerie; Chen, Jian; Canfield, Thomas; Richard, Jacques

    2000-01-01

    Distributed computing has been regarded as the future of high-performance computing. Nationwide high-speed networks such as vBNS are becoming widely available to interconnect high-speed computers, virtual environments, scientific instruments, and large data sets. One of the major issues to be addressed with distributed systems is the development of computational tools that facilitate the efficient execution of parallel applications on such systems. These tools must exploit the heterogeneous resources (networks and compute nodes) in distributed systems. This paper presents a tool, called PART, which addresses this issue for mesh partitioning. PART takes advantage of the following heterogeneous system features: (1) processor speed; (2) number of processors; (3) local network performance; and (4) wide area network performance. Further, different finite element applications under consideration may have different computational complexities, different communication patterns, and different element types, which also must be taken into consideration when partitioning. PART uses parallel simulated annealing to partition the domain, taking into consideration network and processor heterogeneity. The results of using PART for an explicit finite element application executing on two IBM SPs (located at Argonne National Laboratory and the San Diego Supercomputer Center) indicate an increase in efficiency of up to 36% as compared to METIS, a widely used mesh partitioning tool. The input to METIS was modified to take into consideration heterogeneous processor performance; METIS does not take into consideration heterogeneous networks. The execution times for these applications were reduced by up to 30% as compared to METIS. These results are given in Figure 1 for four irregular meshes with element counts ranging from 11,451 elements for the Barth4 mesh to 30,269 elements for the Barth5 mesh. Future work with PART entails using the tool with an integrated application requiring distributed systems. In particular, this application, illustrated in the document, entails an integration of finite element and fluid dynamic simulations to address the cooling of turbine blades in a gas turbine engine design. It is not uncommon to encounter high-temperature, film-cooled turbine airfoils with millions of degrees of freedom. This results from the complexity of the various components of the airfoils, which requires fine-grain meshing for accuracy. Additional information is contained in the original.
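
    This is not PART's actual algorithm, but a minimal sketch of the underlying idea: a simulated-annealing partitioner that assigns elements to processors of differing speeds and minimizes the estimated makespan. The uniform per-element work model and all parameter values are illustrative assumptions; communication costs and network heterogeneity, which PART also models, are omitted for brevity.

    ```python
    import math
    import random

    def partition_cost(assignment, element_work, proc_speed):
        """Estimated makespan: the slowest processor finishing its share of work."""
        load = [0.0] * len(proc_speed)
        for elem, proc in enumerate(assignment):
            load[proc] += element_work[elem]
        return max(l / s for l, s in zip(load, proc_speed))

    def anneal_partition(element_work, proc_speed, steps=20000, t0=1.0, cooling=0.9995):
        """Simulated annealing over element-to-processor assignments."""
        n_proc = len(proc_speed)
        assignment = [random.randrange(n_proc) for _ in element_work]
        cost = partition_cost(assignment, element_work, proc_speed)
        temp = t0
        for _ in range(steps):
            elem = random.randrange(len(element_work))
            old = assignment[elem]
            assignment[elem] = random.randrange(n_proc)
            new_cost = partition_cost(assignment, element_work, proc_speed)
            # Always accept improvements; accept worse moves with Boltzmann probability.
            if new_cost < cost or random.random() < math.exp((cost - new_cost) / temp):
                cost = new_cost
            else:
                assignment[elem] = old
            temp *= cooling
        return assignment, cost

    if __name__ == "__main__":
        work = [1.0] * 200                     # hypothetical uniform per-element work
        speeds = [1.0, 1.0, 2.0, 4.0]          # heterogeneous processor speeds
        _, makespan = anneal_partition(work, speeds)
        print(f"estimated makespan: {makespan:.2f}")
    ```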

  5. High-density fuel effects. Final report, September 1985-April 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rizk, N.K.; Oechsie, V.L.; Ross, P.T.

    1988-08-18

    The purpose of this program was to determine, by combustor rig tests and data evaluation, the effects of the high-density fuel properties on the performance and durability of the Allison T56-A-15 combustion system. Four high-density fuels in addition to baseline JP4 were evaluated in the effort. The rig-test program included: nozzle-flow bench testing, aerothermal performance and wall temperature, flame stability and ignition, injector coking and plugging, and flow-transient effect. The data-evaluation effort involved the utilization of empirical correlations in addition to analytical multidimensional tools to analyze the performance of the combustor. The modifications required to optimize the performance with high-density fuels were suggested and the expected improvement in performance was evaluated.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roger Lew; Ronald L. Boring; Thomas A. Ulrich

    Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, but the set of tools for developing and designing HMIs is still in its infancy. Here we propose that Microsoft Windows Presentation Foundation (WPF) is well suited for many roles in the research and development of HMIs for process control.

  7. Development of Low Global Warming Potential Refrigerant Solutions for Commercial Refrigeration Systems using a Life Cycle Climate Performance Design Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdelaziz, Omar; Fricke, Brian A; Vineyard, Edward Allan

    Commercial refrigeration systems are known to be prone to high leak rates and to consume large amounts of electricity. As such, direct emissions related to refrigerant leakage and indirect emissions resulting from primary energy consumption contribute greatly to their Life Cycle Climate Performance (LCCP). In this paper, an LCCP design tool is used to evaluate the performance of a typical commercial refrigeration system with alternative refrigerants and minor system modifications to provide lower Global Warming Potential (GWP) refrigerant solutions with improved LCCP compared to baseline systems. The LCCP design tool accounts for system performance, ambient temperature, and system load; system performance is evaluated using a validated vapor compression system simulation tool while ambient temperature and system load are devised from a widely used building energy modeling tool (EnergyPlus). The LCCP design tool also accounts for the change in hourly electricity emission rate to yield an accurate prediction of indirect emissions. The analysis shows that conventional commercial refrigeration system life cycle emissions are largely due to direct emissions associated with refrigerant leaks and that system efficiency plays a smaller role in the LCCP. However, as a transition occurs to low GWP refrigerants, the indirect emissions become more relevant. Low GWP refrigerants may not be suitable for drop-in replacements in conventional commercial refrigeration systems; however some mixtures may be introduced as transitional drop-in replacements. These transitional refrigerants have a significantly lower GWP than baseline refrigerants and as such, improved LCCP. The paper concludes with a brief discussion on the tradeoffs between refrigerant GWP, efficiency and capacity.
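
    In its simplest form, the LCCP bookkeeping described above reduces to a sum of GWP-weighted refrigerant releases and emission-factor-weighted energy use. The sketch below is a minimal illustration of that accounting, not the paper's design tool; all numbers (charge, leak rates, GWP values, grid factor, lifetime) are hypothetical.

    ```python
    def lccp_kg_co2e(charge_kg, annual_leak_frac, eol_loss_frac, gwp,
                     annual_energy_kwh, grid_factor_kg_per_kwh, lifetime_yr):
        """Simplified Life Cycle Climate Performance estimate (kg CO2-equivalent).

        Direct emissions: refrigerant released through annual leaks and at end of
        life, weighted by its GWP. Indirect emissions: electricity use over the
        equipment lifetime, weighted by the grid emission factor.
        """
        direct = charge_kg * (annual_leak_frac * lifetime_yr + eol_loss_frac) * gwp
        indirect = annual_energy_kwh * grid_factor_kg_per_kwh * lifetime_yr
        return direct + indirect

    # Illustrative comparison: a high-GWP baseline vs. a low-GWP alternative that
    # uses ~5% more energy (all values hypothetical).
    baseline = lccp_kg_co2e(100, 0.15, 0.10, 3900, 250_000, 0.45, 15)
    alternative = lccp_kg_co2e(100, 0.15, 0.10, 150, 262_500, 0.45, 15)
    print(f"baseline: {baseline:.3e} kg CO2e, low-GWP alternative: {alternative:.3e} kg CO2e")
    ```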

  8. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  9. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  10. Review of nutritional screening and assessment tools and clinical outcomes in heart failure.

    PubMed

    Lin, Hong; Zhang, Haifeng; Lin, Zheng; Li, Xinli; Kong, Xiangqin; Sun, Gouzhen

    2016-09-01

    Recent studies have suggested that undernutrition as defined using multidimensional nutritional evaluation tools may affect clinical outcomes in heart failure (HF). The evidence supporting this correlation is unclear. Therefore, we conducted this systematic review to critically appraise the use of multidimensional evaluation tools in the prediction of clinical outcomes in HF. We performed descriptive analyses of all identified articles involving qualitative analyses. We used STATA to conduct meta-analyses when at least three studies that tested the same type of nutritional assessment or screening tools and used the same outcome were identified. Sensitivity analyses were conducted to validate our positive results. We identified 17 articles with qualitative analyses and 11 with quantitative analysis after comprehensive literature searching and screening. We determined that the prevalence of malnutrition is high in HF (range 16-90%), particularly in advanced and acute decompensated HF (approximate range 75-90%). Undernutrition as identified by multidimensional evaluation tools may be significantly associated with hospitalization, length of stay and complications and is particularly strongly associated with high mortality. The meta-analysis revealed that compared with other tools, Mini Nutritional Assessment (MNA) scores were the strongest predictors of mortality in HF (HR 4.32, 95% CI 2.30-8.11). Our results remained reliable after conducting sensitivity analyses. The prevalence of malnutrition is high in HF, particularly in advanced and acute decompensated HF. Moreover, undernutrition as identified by multidimensional evaluation tools is significantly associated with unfavourable prognoses and high mortality in HF.

  11. Woods. Industrial Arts. Performance Objectives. Junior High School.

    ERIC Educational Resources Information Center

    Bunch, Edwood; And Others

    Several intermediate performance objectives and corresponding criterion measures are listed for a woodworking course for seventh, eighth, and ninth grade students. The seventh grade section includes seven terminal objectives for a 9-week basic hand woodworking course which includes planning and layout, skill in the use of hand tools, construction…

  12. Industrial Electronics. Performance Objectives. Basic Course.

    ERIC Educational Resources Information Center

    Tiffany, Earl

    Several intermediate performance objectives and corresponding criterion measures are listed for each of 30 terminal objectives for a two-semester (2 hours daily) high school course in basic industrial electronics. The objectives cover instruction in basic electricity including AC-DC theory, magnetism, electrical safety, care and use of hand tools,…

  13. Assessing Performance When the Stakes are High.

    ERIC Educational Resources Information Center

    Crawford, William R.

    This paper is concerned with measuring achievement levels of medical students. Precise tools are needed to assess the readiness of an individual to practice. The basic question then becomes, what can this candidate do, at a given time, under given circumstances. Given the definition of the circumstances, and the candidate's performance, the…

  14. The reliability of workplace-based assessment in postgraduate medical education and training: a national evaluation in general practice in the United Kingdom.

    PubMed

    Murphy, Douglas J; Bruce, David A; Mercer, Stewart W; Eva, Kevin W

    2009-05-01

    To investigate the reliability and feasibility of six potential workplace-based assessment methods in general practice training: criterion audit, multi-source feedback from clinical and non-clinical colleagues, patient feedback (the CARE Measure), referral letters, significant event analysis, and video analysis of consultations. Performance of GP registrars (trainees) was evaluated with each tool to assess the tools' reliability and feasibility, given the raters and number of assessments needed. Participant experience of the process was determined by questionnaire. 171 GP registrars and their trainers, drawn from nine deaneries (representing all four countries in the UK), participated. The ability of each tool to differentiate between doctors (reliability) was assessed using generalisability theory. Decision studies were then conducted to determine the number of observations required to achieve an acceptably high reliability for "high-stakes assessment" using each instrument. Finally, descriptive statistics were used to summarise participants' ratings of their experience using these tools. Multi-source feedback from colleagues and patient feedback on consultations emerged as the two methods most likely to offer a reliable and feasible opinion of workplace performance. Reliability coefficients of 0.8 were attainable with 41 CARE Measure patient questionnaires and six clinical and/or five non-clinical colleagues per doctor when assessed on two occasions. For the other four methods tested, 10 or more assessors were required per doctor in order to achieve a reliable assessment, making the feasibility of their use in high-stakes assessment extremely low. Participant feedback did not raise any major concerns regarding the acceptability, feasibility, or educational impact of the tools. The combination of patient and colleague views of doctors' performance, coupled with reliable competence measures, may offer a suitable evidence-base on which to monitor progress and completion of doctors' training in general practice.
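
    The decision-study logic above (how many observations are needed before the composite reliability reaches 0.8) can be illustrated with the Spearman-Brown prophecy formula. This is a simplified stand-in for a full generalisability-theory decision study; the single-observation reliabilities below are illustrative values, not the study's estimated variance components.

    ```python
    import math

    def observations_needed(single_obs_reliability, target=0.8):
        """Spearman-Brown prophecy: observations needed to reach a target reliability."""
        r, t = single_obs_reliability, target
        return math.ceil(t * (1 - r) / (r * (1 - t)))

    # Illustrative single-observation reliabilities (hypothetical values).
    for tool, r in [("patient feedback", 0.09),
                    ("colleague feedback", 0.40),
                    ("video analysis", 0.25)]:
        print(f"{tool}: ~{observations_needed(r)} observations for reliability 0.8")
    ```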

  15. Research into the interaction between high performance and cognitive skills in an intelligent tutoring system

    NASA Technical Reports Server (NTRS)

    Fink, Pamela K.

    1991-01-01

    Two intelligent tutoring systems were developed. These tutoring systems are being used to study the effectiveness of intelligent tutoring systems in training high performance tasks and the interrelationship of high performance and cognitive tasks. The two tutoring systems, referred to as the Console Operations Tutors, were built using the same basic approach to the design of an intelligent tutoring system. This design approach allowed researchers to more rapidly implement the cognitively based tutor, the OMS Leak Detect Tutor, by using the foundation of code generated in the development of the high performance based tutor, the Manual Select Keyboard (MSK). It is believed that the approach can be further generalized to develop a generic intelligent tutoring system implementation tool.

  16. Performance evaluation of inpatient service in Beijing: a horizontal comparison with risk adjustment based on Diagnosis Related Groups

    PubMed Central

    Jian, Weiyan; Huang, Yinmin; Hu, Mu; Zhang, Xiumei

    2009-01-01

    Background The medical performance evaluation, which provides a basis for rational decision-making, is an important part of medical service research. Current progress with health services reform in China is far from satisfactory, without sufficient regulation. To achieve better progress, an effective tool for evaluating medical performance needs to be established. In view of this, this study attempted to develop such a tool appropriate for the Chinese context. Methods Data were collected from the front pages of medical records (FPMR) of all large general public hospitals (21 hospitals) in the third and fourth quarters of 2007. Locally developed Diagnosis Related Groups (DRGs) were introduced as a tool for risk adjustment, and performance evaluation indicators were established: Charge Efficiency Index (CEI), Time Efficiency Index (TEI) and inpatient mortality of low-risk group cases (IMLRG), to reflect, respectively, work efficiency and medical service quality. Using these indicators, the inpatient services' performance was horizontally compared among hospitals. Case-mix Index (CMI) was used to adjust the efficiency indices to produce the adjusted CEI (aCEI) and adjusted TEI (aTEI). Poisson distribution analysis was used to test the statistical significance of the IMLRG differences between hospitals. Results Using the aCEI, aTEI and IMLRG scores for the 21 hospitals, Hospitals A and C had relatively good overall performance because their medical charges were lower, length of stay (LOS) shorter and IMLRG smaller. The performance of Hospitals P and Q was the worst due to their relatively high charge level, long LOS and high IMLRG. Various performance problems also existed in the other hospitals. Conclusion It is possible to develop an accurate and easy-to-run performance evaluation system using Case-Mix as the tool for risk adjustment, choosing indicators close to consumers and managers, and utilizing routine report forms as the basic information source. To keep such a system running effectively, it is necessary to improve the reliability of clinical information and the risk-adjustment ability of Case-Mix. PMID:19402913
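
    The sketch below is a minimal illustration of the kind of case-mix adjustment described: an efficiency index is computed against a regional average and then divided by the hospital's Case-mix Index so that hospitals treating more complex cases are not penalized. The formulas and all numbers are our illustrative reading of the indicators, not the authors' exact definitions or data.

    ```python
    def case_mix_index(drg_weights):
        """Average DRG relative weight across a hospital's discharges."""
        return sum(drg_weights) / len(drg_weights)

    def charge_efficiency_index(hospital_mean_charge, regional_mean_charge):
        """CEI > 1 means the hospital charges more per case than the regional average."""
        return hospital_mean_charge / regional_mean_charge

    def adjusted_index(index, cmi):
        """Divide by CMI so hospitals treating more complex cases are not penalized."""
        return index / cmi

    weights = [0.8, 1.2, 1.5, 2.1, 0.9]        # hypothetical DRG weights of discharged cases
    cmi = case_mix_index(weights)
    cei = charge_efficiency_index(9800, 8500)  # hypothetical mean charges
    print(f"CMI={cmi:.2f}, CEI={cei:.2f}, aCEI={adjusted_index(cei, cmi):.2f}")
    ```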

  17. Identifying substance misuse in primary care: TAPS Tool compared to the WHO ASSIST.

    PubMed

    Schwartz, R P; McNeely, J; Wu, L T; Sharma, G; Wahle, A; Cushing, C; Nordeck, C D; Sharma, A; O'Grady, K E; Gryczynski, J; Mitchell, S G; Ali, R L; Marsden, J; Subramaniam, G A

    2017-05-01

    There is a need for screening and brief assessment instruments to identify primary care patients with substance use problems. This study's aim was to examine the performance of a two-step screening and brief assessment instrument, the TAPS Tool, compared to the WHO ASSIST. Two thousand adult primary care patients recruited from five primary care clinics in four Eastern US states completed the TAPS Tool followed by the ASSIST. The ability of the TAPS Tool to identify moderate- and high-risk use scores on the ASSIST was examined using sensitivity and specificity analyses. The interviewer and self-administered computer tablet versions of the TAPS Tool generated similar results. The interviewer-administered version (at a cut-off of 2) had acceptable sensitivity and specificity for high-risk tobacco (0.90 and 0.77) and alcohol (0.87 and 0.80) use. For illicit drugs, sensitivities were >0.82 and specificities >0.92. The TAPS (at a cut-off of 1) had good sensitivity and specificity for moderate-risk tobacco use (0.83 and 0.97) and alcohol (0.83 and 0.74). Among illicit drugs, sensitivity was acceptable for moderate-risk marijuana use (0.71), while it was low for all other illicit drugs and non-medical use of prescription medications. Specificities were 0.97 or higher for all illicit drugs and prescription medications. The TAPS Tool identified adult primary care patients with high-risk ASSIST scores for all substances as well as moderate-risk users of tobacco, alcohol, and marijuana, although it did not perform well in identifying patients with moderate-risk use of other drugs or non-medical use of prescription medications. The advantages of the TAPS Tool over the ASSIST are its more limited number of items and focus solely on substance use in the past 3 months. Copyright © 2017 Elsevier Inc. All rights reserved.
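
    The sensitivity and specificity figures quoted above come from comparing the screener's cut-off decisions against the reference standard. The sketch below shows that computation on hypothetical data; the scores and risk labels are made up and do not reproduce the study's results.

    ```python
    def sensitivity_specificity(scores, reference_positive, cutoff):
        """Screening performance of a score against a reference standard at a cut-off."""
        tp = fp = tn = fn = 0
        for score, positive in zip(scores, reference_positive):
            flagged = score >= cutoff
            if flagged and positive:
                tp += 1
            elif flagged and not positive:
                fp += 1
            elif not flagged and positive:
                fn += 1
            else:
                tn += 1
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical screener scores and reference-standard high-risk status.
    scores = [0, 1, 2, 3, 2, 0, 1, 4, 2, 0]
    high_risk = [False, False, True, True, True, False, False, True, False, False]
    sens, spec = sensitivity_specificity(scores, high_risk, cutoff=2)
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
    ```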

  18. Spacecraft Multiple Array Communication System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which they operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase the radio frequency coverage and data rate performance. The technique achieves phase coherence among the phased arrays so that the signals combine constructively at the target receiver. There are many technical challenges in spacecraft integration with a high transmit power communication system. The array combining technique can improve the communication system data rate and coverage performance without increasing the system transmit power requirements. Example simulation results indicate significant performance improvement can be achieved with phase coherence implementation.
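
    The combining gain at the heart of this approach can be illustrated with a toy link model: aligning the per-array phases before summation raises the output SNR by roughly 10·log10(N) relative to a single array, while an uncorrected sum does not. The sketch below is our illustration of that principle, not the CSSL simulation tools; all parameters are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_arrays, n_samples, noise_power = 4, 100_000, 1.0

    signal = np.ones(n_samples, dtype=complex)                 # unit-power carrier
    phases = rng.uniform(0, 2 * np.pi, n_arrays)               # unknown per-array phase offsets
    noise = np.sqrt(noise_power / 2) * (rng.standard_normal((n_arrays, n_samples))
                                        + 1j * rng.standard_normal((n_arrays, n_samples)))
    received = signal * np.exp(1j * phases)[:, None] + noise   # one row per array

    def output_snr_db(weights):
        """SNR after weighting and summing the array outputs."""
        combined = (weights[:, None] * received).sum(axis=0)
        coherent_part = (weights * np.exp(1j * phases)).sum() * signal
        noise_part = combined - coherent_part
        return 10 * np.log10(np.mean(np.abs(coherent_part) ** 2) /
                             np.mean(np.abs(noise_part) ** 2))

    incoherent = output_snr_db(np.ones(n_arrays) / n_arrays)   # summed without phase correction
    coherent = output_snr_db(np.exp(-1j * phases) / n_arrays)  # phase-aligned combining
    print(f"incoherent combining: {incoherent:.1f} dB, coherent combining: {coherent:.1f} dB")
    ```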

  19. The Effect of Lower Body Stabilization and Different Writing Tools on Writing Biomechanics in Children with Cerebral Palsy

    ERIC Educational Resources Information Center

    Cheng, Hsin-Yi Kathy; Lien, Yueh-Ju; Yu, Yu-Chun; Ju, Yan-Ying; Pei, Yu-Cheng; Cheng, Chih-Hsiu; Wu, David Bin-Chia

    2013-01-01

    A high percentage of children with cerebral palsy (CP) have difficulty keeping up with the handwriting demands at school. Previous studies have addressed the effects of proper sitting and writing tool on writing performance, but less on body biomechanics. The aim of this study was to investigate the influence of lower body stabilization and pencil…

  20. Informatics Tools to Improve Clinical Research

    PubMed Central

    Argraves, S; Brandt, CA; Money, R; Nadkarni, P

    2005-01-01

    During the conduct of complex clinical trials, there are numerous sources and types of data collection and project coordination problems. Methods and approaches to address the conduct of a trial vary in both the cost and time to perform and the potential benefit. Informatics tools can help trial coordinators and investigators ensure the collection of high quality research data during all phases of a clinical trial. PMID:16779170

  1. The Effects of the Use of Microsoft Math Tool (Graphical Calculator) Instruction on Students' Performance in Linear Functions

    ERIC Educational Resources Information Center

    Kissi, Philip Siaw; Opoku, Gyabaah; Boateng, Sampson Kwadwo

    2016-01-01

    The aim of the study was to investigate the effect of the Microsoft Math Tool (graphical calculator) on students' achievement in linear functions. The study employed a quasi-experimental research design (pre-test/post-test two-group design). A total of ninety-eight (98) students were selected for the study from two different Senior High Schools…

  2. Continued Development of Expert System Tools for NPSS Engine Diagnostics

    NASA Technical Reports Server (NTRS)

    Lewandowski, Henry

    1996-01-01

    The objectives of this grant were to work with previously developed NPSS (Numerical Propulsion System Simulation) tools and enhance their functionality; explore similar AI systems; and work with the High Performance Computing Communication (HPCC) K-12 program. Activities for this reporting period are briefly summarized and a paper addressing the implementation, monitoring and zooming in a distributed jet engine simulation is included as an attachment.

  3. vvtools v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drake, Richard R.

    Vvtools is a suite of testing tools with a focus on reproducible verification and validation. They are written in pure Python and contain a test harness and an automated process management tool. Users of vvtools can develop suites of verification and validation tests and run them on small to large high-performance computing resources in an automated and reproducible way. The test harness enables complex processes to be performed in each test and even supports a one-level parent/child dependency between tests. It includes a built-in capability to manage workloads requiring multiple processors and platforms that use batch queueing systems.

  4. Scanner baseliner monitoring and control in high volume manufacturing

    NASA Astrophysics Data System (ADS)

    Samudrala, Pavan; Chung, Woong Jae; Aung, Nyan; Subramany, Lokesh; Gao, Haiyong; Gomez, Juan-Manuel

    2016-03-01

    We analyze performance of different customized models on baseliner overlay data and demonstrate the reduction in overlay residuals by ~10%. Smart Sampling sets were assessed and compared with the full wafer measurements. We found that performance of the grid can still be maintained by going to one-third of total sampling points, while reducing metrology time by 60%. We also demonstrate the feasibility of achieving time to time matching using scanner fleet manager and thus identify the tool drifts even when the tool monitoring controls are within spec limits. We also explore the scanner feedback constant variation with illumination sources.

  5. A Bayesian Performance Prediction Model for Mathematics Education: A Prototypical Approach for Effective Group Composition

    ERIC Educational Resources Information Center

    Bekele, Rahel; McPherson, Maggie

    2011-01-01

    This research work presents a Bayesian Performance Prediction Model that was created in order to determine the strength of personality traits in predicting the level of mathematics performance of high school students in Addis Ababa. It is an automated tool that can be used to collect information from students for the purpose of effective group…

  6. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease moving to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  7. Comparison of in silico models for prediction of mutagenicity.

    PubMed

    Bakhtyari, Nazanin G; Raitano, Giuseppa; Benfenati, Emilio; Martin, Todd; Young, Douglas

    2013-01-01

    Using a dataset with more than 6000 compounds, the performance of eight quantitative structure-activity relationship (QSAR) models was evaluated: ACD/Tox Suite; Absorption, Distribution, Metabolism, Elimination, and Toxicity of chemical substances (ADMET) Predictor; Derek; Toxicity Estimation Software Tool (T.E.S.T.); TOxicity Prediction by Komputer Assisted Technology (TOPKAT); Toxtree; CAESAR; and SARpy (SAR in Python). In general, the results showed a high level of performance. To have a realistic estimate of the predictive ability, the results for chemicals inside and outside the training set for each model were considered. The effect of applicability domain tools (when available) on the prediction accuracy was also evaluated. The predictive tools included QSAR models, knowledge-based systems, and a combination of both methods. Models based on statistical QSAR methods gave better results.

  8. Optimization of Mud Hammer Drilling Performance--A Program to Benchmark the Viability of Advanced Mud Hammer Drilling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnis Judzis

    2006-03-01

    Operators continue to look for ways to improve hard rock drilling performance through emerging technologies. A consortium of Department of Energy, operator and industry participants put together an effort to test and optimize mud-driven fluid hammers as one emerging technology that has shown promise to increase penetration rates in hard rock. The thrust of this program has been to test and record the performance of fluid hammers in full-scale test conditions including hard formations at simulated depth, high density/high solids drilling muds, and realistic fluid power levels. This paper details the testing and results for two 7 3/4 inch diameter mud hammers with 8 1/2 inch hammer bits. A Novatek MHN5 and an SDS Digger FH185 mud hammer were tested with several bit types, with performance being compared to a conventional (IADC Code 537) tricone bit. These tools functionally operated in all of the simulated downhole environments. The performance was in the range of the baseline tricone or better at lower borehole pressures, but at higher borehole pressures the performance was in the lower range or below that of the baseline tricone bit. A new drilling mode was observed while operating the MHN5 mud hammer. This mode was noticed as the weight on bit (WOB) was in transition from low to high applied load. During this new ''transition drilling mode'', performance was substantially improved and in some cases outperformed the tricone bit. Improvements were noted for the SDS tool while drilling with a more aggressive bit design. Future work includes the optimization of these or the next generation tools for operating in higher density and higher borehole pressure conditions and improving bit design and technology based on the knowledge gained from this test program.

  9. Simulation of the hydraulic performance of highway filter drains through laboratory models and stormwater management tools.

    PubMed

    Sañudo-Fontaneda, Luis A; Jato-Espino, Daniel; Lashford, Craig; Coupe, Stephen J

    2017-05-23

    Road drainage is one of the most relevant assets in transport infrastructure due to its inherent influence on traffic management and road safety. Highway filter drains (HFDs), also known as "French Drains", are the main drainage system currently in use in the UK, throughout 7000 km of its strategic road network. Despite being a widespread technique across the whole country, little research has been completed on their design considerations and their subsequent impact on their hydraulic performance, representing a gap in the field. Laboratory experiments have been proven to be a reliable indicator for the simulation of the hydraulic performance of stormwater best management practices (BMPs). In addition to this, stormwater management tools (SMT) have been preferentially chosen as a design tool for BMPs by practitioners from all over the world. In this context, this research aims to investigate the hydraulic performance of HFDs by comparing the results from laboratory simulation and two widely used SMT such as the US EPA's stormwater management model (SWMM) and MicroDrainage®. Statistical analyses were applied to a series of rainfall scenarios simulated, showing a high level of accuracy between the results obtained in the laboratory and using SMT, as indicated by the high Nash-Sutcliffe and R² coefficients and the low root-mean-square error (RMSE) values reached, which validated the usefulness of SMT to determine the hydraulic performance of HFDs.
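
    The goodness-of-fit statistics cited above compare a simulated outflow series against laboratory observations. The sketch below shows how the Nash-Sutcliffe efficiency and RMSE are typically computed; the hydrograph values are hypothetical and are not the study's data.

    ```python
    import numpy as np

    def nash_sutcliffe(observed, simulated):
        """NSE = 1 is a perfect match; values near 1 indicate good agreement."""
        observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
        return 1 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

    def rmse(observed, simulated):
        """Root-mean-square error in the units of the observations."""
        observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
        return np.sqrt(np.mean((observed - simulated) ** 2))

    # Hypothetical outflow hydrographs (l/s) from a laboratory rig and an SMT run.
    lab = [0.0, 0.4, 1.1, 1.8, 1.5, 0.9, 0.4, 0.1]
    model = [0.0, 0.5, 1.0, 1.7, 1.6, 0.8, 0.5, 0.1]
    print(f"NSE = {nash_sutcliffe(lab, model):.3f}, RMSE = {rmse(lab, model):.3f} l/s")
    ```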

  10. Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis

    PubMed Central

    Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven

    2011-01-01

    Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320

  11. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2010-12-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates) are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre and SiC fibre. The matrixes are resin matrix, metal matrix and ceramic matrix. Advanced composite materials have higher specific strength and higher specific modulus than the glass fibre reinforced resin composites of the 1st generation. They are widely used in the aviation and aerospace industry due to their high specific strength, high specific modulus, excellent ductility, anticorrosion, heat insulation, sound insulation, shock absorption and high and low temperature resistance. They are used for radomes, inlets, airfoils (fuel tank included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards and tails, etc. Hardness is up to 62-65 HRC. Hole quality is greatly affected by the fibre laminate direction of carbon fibre reinforced composite material, due to its anisotropy, when drilling in unidirectional laminates. There are burrs and splits at the exit because of stress concentration. Besides, there is delamination, and the hole is prone to be undersized. Burrs are caused by poor sharpness of the cutting edge; delamination, tearing and splitting are caused by the high stress resulting from high thrust force. Poorer sharpness of the cutting edge leads to lower cutting performance and higher drilling force at the same time. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc. and thrust force. At the same time, the number of holes and the difficulty of hole-making in composites have also increased. This requires high-performance drills which do not produce defects and have long tool life. It has become a trend to develop superhard material tools and tools with special geometry for drilling composite materials.

  12. Review on advanced composite materials boring mechanism and tools

    NASA Astrophysics Data System (ADS)

    Shi, Runping; Wang, Chengyong

    2011-05-01

    With the rapid development of aviation and aerospace manufacturing technology, advanced composite materials represented by carbon fibre reinforced plastics (CFRP) and super hybrid composites (fibre/metal plates) are more and more widely applied. The fibres are mainly carbon fibre, boron fibre, aramid fibre and SiC fibre. The matrixes are resin matrix, metal matrix and ceramic matrix. Advanced composite materials have higher specific strength and higher specific modulus than the glass fibre reinforced resin composites of the 1st generation. They are widely used in the aviation and aerospace industry due to their high specific strength, high specific modulus, excellent ductility, anticorrosion, heat insulation, sound insulation, shock absorption and high and low temperature resistance. They are used for radomes, inlets, airfoils (fuel tank included), flaps, ailerons, vertical tails, horizontal tails, air brakes, skins, baseboards and tails, etc. Hardness is up to 62-65 HRC. Hole quality is greatly affected by the fibre laminate direction of carbon fibre reinforced composite material, due to its anisotropy, when drilling in unidirectional laminates. There are burrs and splits at the exit because of stress concentration. Besides, there is delamination, and the hole is prone to be undersized. Burrs are caused by poor sharpness of the cutting edge; delamination, tearing and splitting are caused by the high stress resulting from high thrust force. Poorer sharpness of the cutting edge leads to lower cutting performance and higher drilling force at the same time. The present research focuses on the interrelation between rotation speed, feed, drill geometry, drill life, cutting mode, tool material, etc. and thrust force. At the same time, the number of holes and the difficulty of hole-making in composites have also increased. This requires high-performance drills which do not produce defects and have long tool life. It has become a trend to develop superhard material tools and tools with special geometry for drilling composite materials.

  13. Development and validation of a web-based questionnaire for surveying the health and working conditions of high-performance marine craft populations

    PubMed Central

    de Alwis, Manudul Pahansen; Lo Martire, Riccardo; Äng, Björn O; Garme, Karl

    2016-01-01

    Background High-performance marine craft crews are susceptible to various adverse health conditions caused by multiple interactive factors. However, there are limited epidemiological data available for assessment of working conditions at sea. Although questionnaire surveys are widely used for identifying exposures, outcomes and associated risks with high accuracy levels, until now, no validated epidemiological tool exists for surveying occupational health and performance in these populations. Aim To develop and validate a web-based questionnaire for epidemiological assessment of occupational and individual risk exposure pertinent to the musculoskeletal health conditions and performance in high-performance marine craft populations. Method A questionnaire for investigating the association between work-related exposure, performance and health was initially developed by a consensus panel under four subdomains (demography, lifestyle, work exposure and health) and systematically validated by expert raters for content relevance and simplicity in three consecutive stages, each iteratively followed by a consensus panel revision. The item content validity index (I-CVI) was determined as the proportion of experts giving a rating of 3 or 4. The scale content validity index (S-CVI/Ave) was computed by averaging the I-CVIs for the assessment of the questionnaire as a tool. Finally, the questionnaire was pilot tested. Results The S-CVI/Ave increased from 0.89 to 0.96 for relevance and from 0.76 to 0.94 for simplicity, resulting in 36 items in the final questionnaire. The pilot test confirmed the feasibility of the questionnaire. Conclusions The present study shows that the web-based questionnaire fulfils previously published validity acceptance criteria and is therefore considered valid and feasible for the empirical surveying of epidemiological aspects among high-performance marine craft crews and similar populations. PMID:27324717
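
    The content-validity indices reported above are simple proportions and averages. The sketch below shows the computation on hypothetical expert ratings; the values are made up and do not correspond to the study's rating data.

    ```python
    def item_cvi(ratings):
        """I-CVI: proportion of experts rating the item 3 or 4 on a 4-point scale."""
        return sum(1 for r in ratings if r >= 3) / len(ratings)

    def scale_cvi_average(items):
        """S-CVI/Ave: mean of the item-level CVIs across the questionnaire."""
        cvis = [item_cvi(r) for r in items]
        return sum(cvis) / len(cvis)

    # Hypothetical relevance ratings: one row per item, one column per expert rater.
    relevance = [
        [4, 4, 3, 4, 2],
        [3, 4, 4, 4, 4],
        [4, 3, 3, 2, 4],
    ]
    print("I-CVI per item:", [round(item_cvi(r), 2) for r in relevance])
    print("S-CVI/Ave:", round(scale_cvi_average(relevance), 2))
    ```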

  14. Development of the McGill simulator for endoscopic sinus surgery: a new high-fidelity virtual reality simulator for endoscopic sinus surgery.

    PubMed

    Varshney, Rickul; Frenkiel, Saul; Nguyen, Lily H P; Young, Meredith; Del Maestro, Rolando; Zeitouni, Anthony; Tewfik, Marc A

    2014-01-01

    The technical challenges of endoscopic sinus surgery (ESS) and the high risk of complications support the development of alternative modalities to train residents in these procedures. Virtual reality simulation is becoming a useful tool for training the skills necessary for minimally invasive surgery; however, there are currently no ESS virtual reality simulators available with valid evidence supporting their use in resident education. Our aim was to develop a new rhinology simulator, as well as to define potential performance metrics for trainee assessment. The McGill simulator for endoscopic sinus surgery (MSESS), a new sinus surgery virtual reality simulator with haptic feedback, was developed (a collaboration between the McGill University Department of Otolaryngology-Head and Neck Surgery, the Montreal Neurologic Institute Simulation Lab, and the National Research Council of Canada). A panel of experts in education, performance assessment, rhinology, and skull base surgery convened to identify core technical abilities that would need to be taught by the simulator, as well as performance metrics to be developed and captured. The MSESS allows the user to perform basic sinus surgery skills, such as an ethmoidectomy and sphenoidotomy, through the use of endoscopic tools in a virtual nasal model. The performance metrics were developed by an expert panel and include measurements of safety, quality, and efficiency of the procedure. The MSESS incorporates novel technological advancements to create a realistic platform for trainees. To our knowledge, this is the first simulator to combine novel tools such as the endonasal wash and elaborate anatomic deformity with advanced performance metrics for ESS.

  15. Experimental and Numerical Optimization of a High-Lift System to Improve Low-Speed Performance, Stability, and Control of an Arrow-Wing Supersonic Transport

    NASA Technical Reports Server (NTRS)

    Hahne, David E.; Glaab, Louis J.

    1999-01-01

    An investigation was performed to evaluate leading- and trailing-edge flap deflections for optimal aerodynamic performance of a High-Speed Civil Transport concept during takeoff and approach-to-landing conditions. The configuration used for this study was designed by the Douglas Aircraft Company during the 1970's. A 0.1-scale model of this configuration was tested in the Langley 30- by 60-Foot Tunnel with both the original leading-edge flap system and a new leading-edge flap system, which was designed with modern computational flow analysis and optimization tools. Leading- and trailing-edge flap deflections were generated for the original and modified leading-edge flap systems with the computational flow analysis and optimization tools. Although wind tunnel data indicated improvements in aerodynamic performance for the analytically derived flap deflections for both leading-edge flap systems, perturbations of the analytically derived leading-edge flap deflections yielded significant additional improvements in aerodynamic performance. In addition to the aerodynamic performance optimization testing, stability and control data were also obtained. An evaluation of the crosswind landing capability of the aircraft configuration revealed that insufficient lateral control existed as a result of high levels of lateral stability. Deflection of the leading- and trailing-edge flaps improved the crosswind landing capability of the vehicle considerably; however, additional improvements are required.

  16. High efficiency, long life terrestrial solar panel

    NASA Technical Reports Server (NTRS)

    Chao, T.; Khemthong, S.; Ling, R.; Olah, S.

    1977-01-01

    The design of a high-efficiency, long-life terrestrial module was completed. It utilized 256 rectangular, high-efficiency solar cells to achieve high packing density and electrical output. Tooling for the fabrication of solar cells was in-house, and evaluation of cell performance was begun. Based on the power output analysis, the goal of a 13% efficiency module was achievable.

  17. Joint Analysis: QDR 2001 and Beyond Mini-Symposium Held in Fairfax, Virginia on 1-3 February 2000

    DTIC Science & Technology

    2001-04-11

    have done better in: * Articulating a high level, understandable story that was credible to Congress. * Documenting, archiving assessments performed ...to (1) examine DoD assessment capabilities for performing QDR 2001, (2) provide a non-confrontational environment in which OSD, the Joint Staff...example. Foc trcues-Ec Key Issues Tools/databases Defined for Three Levels _________ (Low, Med., High ) Scenarios A . Emphasis on Modernization B. Emphasis

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NREL developed a modeling and experimental strategy to characterize thermal performance of materials. The technique provides critical data on thermal properties with relevance for electronics packaging applications. Thermal contact resistance and bulk thermal conductivity were characterized for new high-performance materials such as thermoplastics, boron-nitride nanosheets, copper nanowires, and atomically bonded layers. The technique is an important tool for developing designs and materials that enable power electronics packaging with small footprint, high power density, and low cost for numerous applications.
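
    As a rough illustration of why both quantities matter for packaging, the sketch below adds a contact-resistance term and a bulk-conduction term in series to estimate the temperature rise across an interface layer. The material values and geometry are hypothetical and unrelated to NREL's measurements.

    ```python
    def series_thermal_resistance(contact_resistance_cm2K_per_W, thickness_mm,
                                  conductivity_W_per_mK, area_cm2):
        """Total resistance of an interface layer: contact resistance plus bulk conduction (K/W)."""
        r_contact = contact_resistance_cm2K_per_W / area_cm2
        r_bulk = (thickness_mm / 1000) / (conductivity_W_per_mK * area_cm2 / 10_000)
        return r_contact + r_bulk

    # Hypothetical thermal-interface material under a 1 cm^2 power device.
    r_total = series_thermal_resistance(contact_resistance_cm2K_per_W=0.2,
                                        thickness_mm=0.1,
                                        conductivity_W_per_mK=5.0,
                                        area_cm2=1.0)
    print(f"temperature rise across the interface at 50 W: {50 * r_total:.1f} K")
    ```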

  19. Divide and Conquer (DC) BLAST: fast and easy BLAST execution within HPC environments

    DOE PAGES

    Yim, Won Cheol; Cushman, John C.

    2017-07-22

    Bioinformatics is currently faced with very large-scale data sets that lead to computational jobs, especially sequence similarity searches, that can take absurdly long times to run. For example, the National Center for Biotechnology Information (NCBI) Basic Local Alignment Search Tool (BLAST and BLAST+) suite, which is by far the most widely used tool for rapid similarity searching among nucleic acid or amino acid sequences, is highly central processing unit (CPU) intensive. While the BLAST suite of programs perform searches very rapidly, they have the potential to be accelerated. In recent years, distributed computing environments have become more widely accessible and used due to the increasing availability of high-performance computing (HPC) systems. Therefore, simple solutions for data parallelization are needed to expedite BLAST and other sequence analysis tools. However, existing software for parallel sequence similarity searches often requires extensive computational experience and skill on the part of the user. In order to accelerate BLAST and other sequence analysis tools, Divide and Conquer BLAST (DCBLAST) was developed to perform NCBI BLAST searches within a cluster, grid, or HPC environment by using a query sequence distribution approach. Scaling from one (1) to 256 CPU cores resulted in significant improvements in processing speed. Thus, DCBLAST dramatically accelerates the execution of BLAST searches using a simple, accessible, robust, and parallel approach. DCBLAST works across multiple nodes automatically and it overcomes the speed limitation of single-node BLAST programs. DCBLAST can be used on any HPC system, can take advantage of hundreds of nodes, and has no output limitations. Thus, this freely available tool simplifies distributed computation pipelines to facilitate the rapid discovery of sequence similarities between very large data sets.
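
    The query-distribution idea behind this kind of acceleration is easy to sketch: split the multi-FASTA query into chunks and run an independent BLAST+ search on each, letting a scheduler spread the chunks across nodes. The code below is our minimal illustration, not DCBLAST itself; `queries.fasta`, the chunk count, and the `nt` database name are placeholders, and only standard blastn command-line options are used.

    ```python
    import subprocess
    from pathlib import Path

    def split_fasta(fasta_path, n_chunks, out_dir):
        """Split a multi-FASTA query file into roughly equal chunks, one per worker."""
        records, current = [], []
        for line in Path(fasta_path).read_text().splitlines():
            if line.startswith(">") and current:
                records.append("\n".join(current))
                current = []
            current.append(line)
        if current:
            records.append("\n".join(current))

        out_dir = Path(out_dir)
        out_dir.mkdir(parents=True, exist_ok=True)
        chunk_files = []
        for i in range(n_chunks):
            chunk = records[i::n_chunks]            # round-robin keeps chunks balanced
            path = out_dir / f"query_{i:03d}.fasta"
            path.write_text("\n".join(chunk) + "\n")
            chunk_files.append(path)
        return chunk_files

    def blast_chunk(chunk_path, db, out_path):
        """Run one BLAST+ search; in a cluster each call would be a separate batch job."""
        subprocess.run(["blastn", "-query", str(chunk_path), "-db", db,
                        "-outfmt", "6", "-out", str(out_path)], check=True)

    if __name__ == "__main__":
        for chunk in split_fasta("queries.fasta", n_chunks=8, out_dir="chunks"):
            blast_chunk(chunk, db="nt", out_path=chunk.with_suffix(".tsv"))
    ```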

  20. Prospective Comparison of Live Evaluation and Video Review in the Evaluation of Operator Performance in a Pediatric Emergency Airway Simulation

    PubMed Central

    House, Joseph B.; Dooley-Hash, Suzanne; Kowalenko, Terry; Sikavitsas, Athina; Seeyave, Desiree M.; Younger, John G.; Hamstra, Stanley J.; Nypaver, Michele M.

    2012-01-01

    Introduction Real-time assessment of operator performance during procedural simulation is a common practice that requires undivided attention by 1 or more reviewers, potentially over many repetitions of the same case. Objective To determine whether reviewers display better interrater agreement of procedural competency when observing recorded, rather than live, performance; and to develop an assessment tool for pediatric rapid sequence intubation (pRSI). Methods A framework of a previously established Objective Structured Assessment of Technical Skills (OSATS) tool was modified for pRSI. Emergency medicine residents (postgraduate year 1–4) were prospectively enrolled in a pRSI simulation scenario and evaluated by 2 live raters using the modified tool. Sessions were videotaped and reviewed by the same raters at least 4 months later. Raters were blinded to their initial rating. Interrater agreement was determined by using the Krippendorff generalized concordance method. Results Overall interrater agreement for live review was 0.75 (95% confidence interval [CI], 0.72–0.78) and for video was 0.79 (95% CI, 0.73–0.82). Live review was significantly superior to video review in only 1 of the OSATS domains (Preparation) and was equivalent in the other domains. Intrarater agreement between the live and video evaluation was very good, greater than 0.75 for all raters, with a mean of 0.81 (95% CI, 0.76–0.85). Conclusion The modified OSATS assessment tool demonstrated some evidence of validity in discriminating among levels of resident experience and high interreviewer reliability. With this tool, intrareviewer reliability was high between live and 4-months' delayed video review of the simulated procedure, which supports feasibility of delayed video review in resident assessment. PMID:23997874

  1. Divide and Conquer (DC) BLAST: fast and easy BLAST execution within HPC environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, Won Cheol; Cushman, John C.

    Bioinformatics is currently faced with very large-scale data sets that lead to computational jobs, especially sequence similarity searches, that can take absurdly long times to run. For example, the National Center for Biotechnology Information (NCBI) Basic Local Alignment Search Tool (BLAST and BLAST+) suite, which is by far the most widely used tool for rapid similarity searching among nucleic acid or amino acid sequences, is highly central processing unit (CPU) intensive. While the BLAST suite of programs perform searches very rapidly, they have the potential to be accelerated. In recent years, distributed computing environments have become more widely accessible and used due to the increasing availability of high-performance computing (HPC) systems. Therefore, simple solutions for data parallelization are needed to expedite BLAST and other sequence analysis tools. However, existing software for parallel sequence similarity searches often requires extensive computational experience and skill on the part of the user. In order to accelerate BLAST and other sequence analysis tools, Divide and Conquer BLAST (DCBLAST) was developed to perform NCBI BLAST searches within a cluster, grid, or HPC environment by using a query sequence distribution approach. Scaling from one (1) to 256 CPU cores resulted in significant improvements in processing speed. Thus, DCBLAST dramatically accelerates the execution of BLAST searches using a simple, accessible, robust, and parallel approach. DCBLAST works across multiple nodes automatically and it overcomes the speed limitation of single-node BLAST programs. DCBLAST can be used on any HPC system, can take advantage of hundreds of nodes, and has no output limitations. Thus, this freely available tool simplifies distributed computation pipelines to facilitate the rapid discovery of sequence similarities between very large data sets.

  2. An Experimental Investigation of Dextrous Robots Using EVA Tools and Interfaces

    NASA Technical Reports Server (NTRS)

    Ambrose, Robert; Culbert, Christopher; Rehnmark, Frederik

    2001-01-01

    This investigation of robot capabilities with extravehicular activity (EVA) equipment looks at how improvements in dexterity are enabling robots to perform tasks once thought to be beyond machines. The approach is qualitative, using the Robonaut system at the Johnson Space Center (JSC), performing task trials that offer a quick look at this system's high degree of dexterity and the demands of EVA. Specific EVA tools attempted include tether hooks, power torque tools, and rock scoops, as well as conventional tools like scissors, wire strippers, forceps, and wrenches. More complex EVA equipment was also studied, with more complete tasks that mix tools, EVA hand rails, tethers, tool boxes, PIP pins, and EVA electrical connectors. These task trials have been ongoing over an 18 month period, as the Robonaut system evolved to its current 43 degree of freedom (DOF) configuration, soon to expand to over 50. In each case, the number of teleoperators is reported, with rough numbers of attempts and their experience level, and a subjective difficulty rating assigned to each piece of EVA equipment and function. JSC's Robonaut system was successful with all attempted EVA hardware, suggesting new options for human and robot teams working together in space.

  3. A large-scale benchmark of gene prioritization methods.

    PubMed

    Guala, Dimitri; Sonnhammer, Erik L L

    2017-04-21

    In order to maximize the use of results from high-throughput experimental studies, e.g. GWAS, for identification and diagnostics of new disease-associated genes, it is important to have properly analyzed and benchmarked gene prioritization tools. While prospective benchmarks are underpowered to provide statistically significant results in their attempt to differentiate the performance of gene prioritization tools, a strategy for retrospective benchmarking has been missing, and new tools usually only provide internal validations. The Gene Ontology (GO) contains genes clustered around annotation terms. This intrinsic property of GO can be utilized in the construction of robust benchmarks, objective to the problem domain. We demonstrate how this can be achieved for network-based gene prioritization tools, utilizing the FunCoup network. We use cross-validation and a set of appropriate performance measures to compare state-of-the-art gene prioritization algorithms: three based on network diffusion (NetRank and two implementations of Random Walk with Restart), and MaxLink, which utilizes the network neighborhood. Our benchmark suite provides a systematic and objective way to compare the multitude of available and future gene prioritization tools, enabling researchers to select the best gene prioritization tool for the task at hand, and helping to guide the development of more accurate methods.
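
    A retrospective benchmark of this kind can be reduced to a leave-one-out loop: hide one gene annotated to a GO term, prioritize all remaining network nodes from the other term genes, and record where the hidden gene is re-ranked. The sketch below illustrates that loop with a toy network and a deliberately simple neighbourhood scorer; it is not the authors' benchmark suite, and neither the FunCoup network nor the published algorithms are reproduced.

    ```python
    import networkx as nx

    def prioritize(graph, seeds, candidates):
        """Rank candidates by number of edges to the seed set (a simple neighbourhood score)."""
        scores = {c: sum(1 for s in seeds if graph.has_edge(c, s)) for c in candidates}
        return sorted(candidates, key=lambda c: scores[c], reverse=True)

    def leave_one_out_ranks(graph, term_genes):
        """For each gene annotated to a GO term, hide it and record its recovered rank."""
        ranks = []
        for gene in term_genes:
            seeds = [g for g in term_genes if g != gene]
            candidates = [n for n in graph.nodes if n not in seeds]
            ranking = prioritize(graph, seeds, candidates)
            ranks.append(ranking.index(gene) + 1)
        return ranks

    # Toy functional-association network and a toy GO-term gene set.
    g = nx.Graph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("D", "E"), ("E", "F")])
    print(leave_one_out_ranks(g, ["A", "B", "C"]))
    ```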

  4. Screening tools to identify patients with complex health needs at risk of high use of health care services: A scoping review.

    PubMed

    Marcoux, Valérie; Chouinard, Maud-Christine; Diadiou, Fatoumata; Dufour, Isabelle; Hudon, Catherine

    2017-01-01

    Many people with chronic conditions have complex health needs often due to multiple chronic conditions, psychiatric comorbidities, psychosocial issues, or a combination of these factors. They are at high risk of frequent use of healthcare services. To offer these patients interventions adapted to their needs, it is crucial to be able to identify them early. The aim of this study was to find all existing screening tools that identify patients with complex health needs at risk of frequent use of healthcare services, and to highlight their principal characteristics. Our purpose was to find a short, valid screening tool to identify adult patients of all ages. A scoping review was performed on articles published between 1985 and July 2016, retrieved through a comprehensive search of the Scopus and CINAHL databases, following the methodological framework developed by Arksey and O'Malley (2005), and completed by Levac et al. (2010). Of the 3,818 articles identified, 30 were included, presenting 14 different screening tools. Seven tools were self-reported. Five targeted adult patients, and nine geriatric patients. Two tools were designed for specific populations. Four can be completed in 15 minutes or less. Most screening tools target elderly persons. The INTERMED self-assessment (IM-SA) targets adults of all ages and can be completed in less than 15 minutes. Future research could evaluate its usefulness as a screening tool for identifying patients with complex needs at risk of becoming high users of healthcare services.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogen, Paul Logasa; McKenzie, Amber T; Gillen, Rob

    Forensic document analysis has become an important aspect of investigation of many different kinds of crimes, from money laundering to fraud and from cybercrime to smuggling. The current workflow for analysts includes powerful tools, such as Palantir and Analyst's Notebook, for moving from evidence to actionable intelligence, and tools for finding documents among the millions of files on a hard disk, such as FTK. However, the analysts often leave the process of sorting through collections of seized documents to filter out the noise from the actual evidence to a highly labor-intensive manual effort. This paper presents the Redeye Analysis Workbench, a tool to help analysts move from manual sorting of a collection of documents to performing intelligent document triage over a digital library. We will discuss the tools and techniques we build upon, in addition to an in-depth discussion of our tool and how it addresses two major use cases we observed analysts performing. Finally, we also include a new layout algorithm for radial graphs that is used to visualize clusters of documents in our system.
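
    The radial cluster layout mentioned above can be illustrated with a generic construction: put each cluster centre on an outer ring and spread its documents on a small circle around it. The sketch below is only a simple stand-in for the idea, not the Workbench's published algorithm; all names and radii are made up.

```python
# Illustrative radial layout for document clusters: each cluster centre sits
# on an outer ring and its documents are spread on a small circle around it.
import math

def radial_layout(clusters, ring_radius=10.0, leaf_radius=2.0):
    """clusters: list of lists of document ids -> {doc_id: (x, y)}."""
    positions = {}
    n = len(clusters)
    for i, docs in enumerate(clusters):
        # Place the cluster centre evenly around the outer ring.
        theta = 2 * math.pi * i / n
        cx, cy = ring_radius * math.cos(theta), ring_radius * math.sin(theta)
        for j, doc in enumerate(docs):
            phi = 2 * math.pi * j / max(len(docs), 1)
            positions[doc] = (cx + leaf_radius * math.cos(phi),
                              cy + leaf_radius * math.sin(phi))
    return positions

print(radial_layout([["a", "b"], ["c"], ["d", "e", "f"]]))
```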

  6. Geometry and gravity influences on strength capability

    NASA Technical Reports Server (NTRS)

    Poliner, Jeffrey; Wilmington, Robert P.; Klute, Glenn K.

    1994-01-01

    Strength, defined as the capability of an individual to produce an external force, is one of the most important determining characteristics of human performance. Knowledge of the strength capabilities of a group of individuals can be applied to designing equipment and workplaces, planning procedures and tasks, and training individuals. In the manned space program, with the high risk and cost associated with spaceflight, information pertaining to human performance is important to ensuring mission success and safety. Knowledge of individuals' strength capabilities in weightlessness is of interest within many areas of NASA, including workplace design, tool development, and mission planning. The weightless environment of space places the human body in a completely different context. Astronauts perform a variety of manual tasks while in orbit. Their ability to perform these tasks is partly determined by their strength capability as demanded by that particular task. Thus, an important step in task planning, development, and evaluation is to determine the ability of the humans performing it. This can be accomplished by utilizing quantitative techniques to develop a database of human strength capabilities in weightlessness. Furthermore, if strength characteristics are known, equipment and tools can be built to optimize the operators' performance. This study examined strength in performing a simple task, specifically, using a tool to apply a torque to a fixture.

  7. Lighting Studies for Fuelling Machine Deployed Visual Inspection Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoots, Carl; Griffith, George

    2015-04-01

    Under subcontract to James Fisher Nuclear, Ltd., INL has been reviewing advanced vision systems for inspection of graphite in high radiation, high temperature, and high pressure environments. INL has performed calculations and proof-of-principle measurements of optics and lighting techniques to be considered for visual inspection of graphite fuel channels in AGR reactors in the UK.

  8. cp-R, an interface to the R programming language for clinical laboratory method comparisons.

    PubMed

    Holmes, Daniel T

    2015-02-01

    Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology, and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4, with R scripts performing regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in any of jpg, png, tiff, or bmp at any desired resolution, or in ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results cut and pasted into spreadsheet or word processing applications. We present a simple and intuitive open source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
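
    For orientation, a hedged sketch of the Deming fit that tools like cp-R perform is shown below. It uses the standard textbook closed form with an assumed error-variance ratio; the toy data and parameter names are illustrative, not cp-R's own code.

```python
# Minimal sketch of Deming regression (errors in both x and y), one of the
# comparison methods cp-R exposes. `lam` is the assumed ratio of the y- to
# x-error variances (1.0 gives orthogonal regression).
import numpy as np

def deming(x, y, lam=1.0):
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x.mean(), y.mean()
    sxx = ((x - xm) ** 2).sum()
    syy = ((y - ym) ** 2).sum()
    sxy = ((x - xm) * (y - ym)).sum()
    slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2
                                       + 4 * lam * sxy ** 2)) / (2 * sxy)
    intercept = ym - slope * xm
    return slope, intercept

# Toy method-comparison data: method B reads ~5% high with a small offset.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 1.05 * x + 0.1 + np.random.default_rng(0).normal(0, 0.02, 5)
print(deming(x, y))
```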

  9. Computational electronics and electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shang, C. C.

    The Computational Electronics and Electromagnetics thrust area at Lawrence Livermore National Laboratory serves as the focal point for engineering R&D activities for developing computer-based tools for design, analysis, and theory. Key representative applications include design of particle accelerator cells and beamline components; engineering analysis and design of high-power components; photonics and optoelectronics circuit design; EMI susceptibility analysis; and antenna synthesis. The FY-96 technology-base effort focused code development on (1) accelerator design codes; (2) 3-D massively parallel, object-oriented time-domain EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; (5) 3-D spectral-domain CEM tools; and (6) enhancement of laser drilling codes. Joint efforts with the Power Conversion Technologies thrust area include development of antenna systems for compact, high-performance radar, in addition to novel, compact Marx generators. 18 refs., 25 figs., 1 tab.

  10. Human Factors Tools for Improving Simulation Activities in Continuing Medical Education

    ERIC Educational Resources Information Center

    Seagull, F. Jacob

    2012-01-01

    Human factors (HF) is a discipline often drawn upon when there is a need to train people to perform complex, high-stakes tasks and effectively assess their performance. Complex tasks often present unique challenges for training and assessment. HF has developed specialized techniques that have been effective in overcoming several of these…

  11. Mission Possible: Measuring Critical Thinking and Problem Solving

    ERIC Educational Resources Information Center

    Wren, Doug; Cashwell, Amy

    2018-01-01

    The author describes how Virginia Beach City Public Schools developed a performance assessment that they administer to all 4th graders, 7th graders, and high school students in the district. He describes lessons learned about creating good performance tasks and developing a successful scoring process, as well as sharing tools connected to this…

  12. Making Employee Recognition a Tool for Achieving Improved Performance: Implication for Ghanaian Universities

    ERIC Educational Resources Information Center

    Amoatemaa, Abena Serwaa; Kyeremeh, Dorcas Darkoah

    2016-01-01

    Many organisations are increasingly making use of employee recognition to motivate employees to achieve high performance and productivity. Research has shown that effective recognition occurs in organisations that have strong supportive culture, understand the psychology of praising employees for their good work, and apply the principles of…

  13. Modular HPC I/O characterization with Darshan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Shane; Carns, Philip; Harms, Kevin

    2016-11-13

    Contemporary high-performance computing (HPC) applications encompass a broad range of distinct I/O strategies and are often executed on a number of different compute platforms in their lifetime. These large-scale HPC platforms employ increasingly complex I/O subsystems to provide a suitable level of I/O performance to applications. Tuning I/O workloads for such a system is nontrivial, and the results generally are not portable to other HPC systems. I/O profiling tools can help to address this challenge, but most existing tools instrument only specific components within the I/O subsystem, providing a limited perspective on I/O performance. The increasing diversity of scientific applications and computing platforms calls for greater flexibility and scope in I/O characterization.

  14. Social Skills Training for Children with Asperger Syndrome and High-Functioning Autism

    ERIC Educational Resources Information Center

    White, Susan Williams

    2011-01-01

    This practical, research-based guide provides a wealth of tools and strategies for implementing social skills training in school or clinical settings. Numerous case examples illustrate common social difficulties experienced by children with Asperger syndrome and high-functioning autism; the impact on peer relationships, school performance, and…

  15. Simulator validation results and proposed reporting format from flight testing a software model of a complex, high-performance airplane.

    DOT National Transportation Integrated Search

    2008-01-01

    Computer simulations are often used in aviation studies. These simulation tools may require complex, high-fidelity aircraft models. Since many of the flight models used are third-party developed products, independent validation is desired prior to im...

  16. 75 FR 25927 - Vehicle/Track Interaction Safety Standards; High-Speed and High Cant Deficiency Operations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-10

    ... qualification process as an important tool for the assessment of vehicle performance. These simulations are... qualification process, simulations would be conducted using both a measured track geometry segment... on the results of simulation studies designed to identify track geometry irregularities associated...

  17. Use Of Statistical Tools To Evaluate The Reductive Dechlorination Of High Levels Of TCE In Microcosm Studies

    EPA Science Inventory

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study ...

  18. Genomics tools available for unravelling mechanisms underlying agronomical traits in strawberry with more to come

    USDA-ARS?s Scientific Manuscript database

    In the last few years, high-throughput genomics promised to bridge the gap between plant physiology and plant sciences. In addition, high-throughput genotyping technologies facilitate marker-based selection for better performing genotypes. In strawberry, Fragaria vesca was the first reference sequen...

  19. Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Santillan, Alfredo; Hernandez--Cervantes, Liliana; Gonzalez--Ponce, Alejandro; Kim, Jongsoo

    The numerical simulations associated with the interaction of High Velocity Clouds (HVC) with the Magnetized Galactic Interstellar Medium (ISM) are a powerful tool to describe the evolution of the interaction of these objects in our Galaxy. In this work we present a new project referred to as Theoretical Virtual Observatories. It is oriented toward performing numerical simulations in real time through a Web page. This is a powerful astrophysical computational tool that consists of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. On this Web site the user can make use of the existing numerical simulations from the database or run a new simulation, introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), based on the open source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.

  20. Distributed Automated Medical Robotics to Improve Medical Field Operations

    DTIC Science & Technology

    2010-04-01

    ROBOT PATIENT INTERFACE Robotic trauma diagnosis and intervention is performed using instruments and tools mounted on the end of a robotic manipulator...manipulator to respond quickly enough to accommodate for motion due to high inertia and inaccuracies caused by low stiffness at the tool point. Ultrasonic...program was licensed to Intuitive Surgical, Inc and subsequently morphed into the daVinci surgical system. The daVinci has been widely applied in

  1. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  2. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    NASA Astrophysics Data System (ADS)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work are commonplace. These tools provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
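
    A hedged sketch of the task-parallel pattern mentioned above is shown below: each MPI rank claims a disjoint subset of input files and runs the same analysis routine on its share. The file names and the analysis function are placeholders, not the CASCADE project's actual code.

```python
# Round-robin task-parallel analysis over a list of files with mpi4py.
from mpi4py import MPI

def analyze(path):
    # Placeholder for a per-file statistic (e.g. a seasonal extreme).
    return (path, len(path))

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

files = [f"simulation_{i:04d}.nc" for i in range(100)]   # hypothetical inputs
my_results = [analyze(f) for f in files[rank::size]]     # each rank's share

# Collect every rank's partial results on rank 0 for publication/archiving.
all_results = comm.gather(my_results, root=0)
if rank == 0:
    flat = [r for chunk in all_results for r in chunk]
    print(f"analyzed {len(flat)} files")
```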

  3. Using the WHO Surgical Safety Checklist to Direct Perioperative Quality Improvement at a Surgical Hospital in Cambodia: The Importance of Objective Confirmation of Process Completion.

    PubMed

    Garland, Naomi Y; Kheng, Sokhavatey; De Leon, Michael; Eap, Hourt; Forrester, Jared A; Hay, Janice; Oum, Palritha; Sam Ath, Socheat; Stock, Simon; Yem, Samprathna; Lucas, Gerlinda; Weiser, Thomas G

    2017-12-01

    The WHO surgical safety checklist (SSC) is known to prevent postoperative complications; however, strategies for effective implementation are unclear. In addition to the cultural and organizational barriers faced by high-income countries, resource-constrained settings face scarcity of durable and consumable goods. We used the SSC to better understand barriers to improvement at a trauma hospital in Battambang, Cambodia. We introduced the SSC and trained data collectors to observe surgical staff performing the checklist. Members of the research team observed cases and data collection. After 3 months, we modified the data collection tool to focus on infection prevention and elicit more accurate responses. Over 16 months we recorded data on 695 operations (304 cases using the first tool and 391 cases with the modified tool). The first tool identified five items as being in high compliance, which were then excluded from further assessment. Two items, instrument sterility confirmation and sponge counting, were identified as being misinterpreted by the data collectors. These items were reworded to capture objective assessment of task completion. Confirmation of instrument sterility was initially never performed but was rectified to >95% compliance; sponge counting and prophylactic antibiotic administration were consistently underperformed. Staff complied with communication elements of the SSC and quickly adopted process improvements. The wording of our data collection tool affected interpretation of compliance with standards. Material resources are not the primary barrier to checklist implementation in this setting, and future work should focus on clarification of protocols and objective confirmation of tasks.

  4. Dimensional changes of Nb3Sn Rutherford cables during heat treatment

    DOE PAGES

    Rochepault, E.; Ferracin, P.; Ambrosio, G.; ...

    2016-06-01

    In high field magnet applications, Nb3Sn coils undergo a heat treatment step after winding. During this stage, coils radially expand and longitudinally contract due to the Nb3Sn phase change. In order to prevent residual strain from altering superconducting performance, the tooling must provide adequate space for these dimensional changes. The aim of this paper is to understand the behavior of cable dimensions during heat treatment and to provide estimates of the space to be accommodated in the tooling for coil expansion and contraction. In addition, this paper summarizes measurements of dimensional changes on strands, single Rutherford cables, cable stacks, and coils performed between 2013 and 2015. These measurements on samples and coils were performed within a collaboration between CERN and the U.S. LHC Accelerator Research Program to develop Nb3Sn quadrupole magnets for the HiLumi LHC. The results are also compared with other high field magnet projects.

  5. Short, self-report voice symptom scales: psychometric characteristics of the voice handicap index-10 and the vocal performance questionnaire.

    PubMed

    Deary, Ian J; Webb, Alison; Mackenzie, Kenneth; Wilson, Janet A; Carding, Paul N

    2004-09-01

    Short, self-report symptom questionnaires are useful in routine clinical situations for assessing the progress of disorders and the influence of interventions. The Voice Handicap Index-10 (VHI-10) and Vocal Performance Questionnaire (VPQ) are brief self-reported assessments of voice pathology, apparently useful in the general voice clinic population. Little is known of the structure or internal consistency of either tool, nor whether they correlate. This study carried out a substantial, systematic evaluation of their performance in the Laryngology office setting. 330 adult (222 women, 108 men) voice clinic attenders completed the VHI and the VPQ. The VHI-10 and VPQ each had a large, single principal component, high internal consistency, and were highly correlated (disattenuated r=0.91). The VHI-10 and the VPQ are similar, short, convenient, internally-consistent, unidimensional tools. The total VHI-10 or VPQ score is a good overall indicator of the severity of voice disorders.
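
    For readers unfamiliar with the two statistics reported above, the sketch below shows how Cronbach's alpha and a correlation corrected for attenuation are conventionally computed. The item responses are simulated and the code is a generic textbook illustration, not the study's analysis.

```python
# Cronbach's alpha for internal consistency, and a disattenuated correlation
# between two scale totals (observed r divided by sqrt of the reliabilities).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def disattenuated_r(x_total, y_total, rel_x, rel_y):
    r = np.corrcoef(x_total, y_total)[0, 1]
    return r / np.sqrt(rel_x * rel_y)

rng = np.random.default_rng(1)
vhi_items = rng.integers(0, 5, size=(50, 10))   # hypothetical VHI-10 answers
other_total = vhi_items.sum(axis=1) + rng.normal(size=50)
print(cronbach_alpha(vhi_items))
print(disattenuated_r(vhi_items.sum(axis=1), other_total, 0.95, 0.90))
```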

  6. Mediator Effect of TPM between TQM and Business Performance in Malaysia Automotive Industry

    NASA Astrophysics Data System (ADS)

    Ahmad, M. F.; Zakuan, N.; Rasi, Raja Zuraidah R. M.; Hisyamudin, M. N. N.

    2015-05-01

    Total Quality Management (TQM) is a vital management tool in ensuring that a company can succeed in the continuously growing competition in the global market. In order to survive in the global market with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential in improving business performance. However, only a few previous studies have examined the mediators and moderators between TQM and business performance. This present research proposed a TQM performance model with the mediator effect of TPM using structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and the related vendors in Malaysia, giving a 21.3 per cent response rate. The result concludes that TPM is a partial mediator between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediator effect.

  7. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  8. Score Reliability and Construct Validity of the Flinn Performance Screening Tool for Adults With Symptoms of Carpal Tunnel Syndrome

    PubMed Central

    Flinn, Sharon R.; Pease, William S.; Freimer, Miriam L.

    2013-01-01

    OBJECTIVE We investigated the psychometric properties of the Flinn Performance Screening Tool (FPST) for people referred with symptoms of carpal tunnel syndrome (CTS). METHOD An occupational therapist collected data from 46 participants who completed the Functional Status Scale (FSS) and FPST after the participants’ nerve conduction velocity study to test convergent and contrasted-group validity. RESULTS Seventy-four percent of the participants had abnormal nerve conduction studies. Cronbach’s α coefficients for subscale and total scores of the FPST ranged from .96 to .98. Intrarater reliability for six shared items of the FSS and the FPST was supported by high agreement (71%) and a fair κ statistic (.36). Strong to moderate positive relationships were found between the FSS and FPST scores. Functional status differed significantly among severe, mild, and negative CTS severity groups. CONCLUSION The FPST shows adequate psychometric properties as a client-centered screening tool for occupational performance of people referred for symptoms of CTS. PMID:22549598

  9. A study of the relationship between the performance and dependability of a fault-tolerant computer

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.

    1994-01-01

    This thesis studies the relationship between performance and dependability by creating a tool (FTAPE) that integrates a high stress workload generator with fault injection, and by using the tool to evaluate system performance under error conditions. The workloads are comprised of processes which are formed from atomic components that represent CPU, memory, and I/O activity. The fault injector is software-implemented and is capable of injecting faults into any memory-addressable location, including special registers and caches. This tool has been used to study a Tandem Integrity S2 computer. Workloads with varying numbers of processes and varying compositions of CPU, memory, and I/O activity are first characterized in terms of performance. Then faults are injected into these workloads. The results show that as the number of concurrent processes increases, the mean fault latency initially increases due to increased contention for the CPU. However, for even higher numbers of processes (more than 3 processes), the mean latency decreases because long-latency faults are paged out before they can be activated.

  10. Acoustic emission as a screening tool for ceramic matrix composites

    NASA Astrophysics Data System (ADS)

    Ojard, Greg; Goberman, Dan; Holowczak, John

    2017-02-01

    Ceramic matrix composites are composite materials with ceramic fibers in a high temperature matrix of ceramic or glass-ceramic. This emerging class of materials is viewed as enabling for efficiency improvements in many energy conversion systems. The key controlling property of ceramic matrix composites is a relatively weak interface between the matrix and the fiber that aids crack deflection and fiber pullout resulting in greatly increased toughness over monolithic ceramics. United Technologies Research Center has been investigating glass-ceramic composite systems as a tool to understand processing effects on material performance related to the performance of the weak interface. Changes in the interface have been shown to affect the mechanical performance observed in flexural testing and subsequent microstructural investigations have confirmed the performance (or lack thereof) of the interface coating. Recently, the addition of acoustic emission testing during flexural testing has aided the understanding of the characteristics of the interface and its performance. The acoustic emission onset stress changes with strength and toughness and this could be a quality tool in screening the material before further development and use. The results of testing and analysis will be shown and additional material from other ceramic matrix composite systems may be included to show trends.

  11. Supercomputing '91; Proceedings of the 4th Annual Conference on High Performance Computing, Albuquerque, NM, Nov. 18-22, 1991

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Various papers on supercomputing are presented. The general topics addressed include: program analysis/data dependence, memory access, distributed memory code generation, numerical algorithms, supercomputer benchmarks, latency tolerance, parallel programming, applications, processor design, networks, performance tools, mapping and scheduling, characterization affecting performance, parallelism packaging, computing climate change, combinatorial algorithms, hardware and software performance issues, system issues. (No individual items are abstracted in this volume)

  12. High Performance Computing Software Applications for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated order-of-magnitude speed-ups in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  13. FMAj: a tool for high content analysis of muscle dynamics in Drosophila metamorphosis.

    PubMed

    Kuleesha, Yadav; Puah, Wee Choo; Lin, Feng; Wasser, Martin

    2014-01-01

    During metamorphosis in Drosophila melanogaster, larval muscles undergo two different developmental fates; one population is removed by cell death, while the other persistent subset undergoes morphological remodeling and survives to adulthood. Thanks to the ability to perform live imaging of muscle development in transparent pupae and the power of genetics, metamorphosis in Drosophila can be used as a model to study the regulation of skeletal muscle mass. However, time-lapse microscopy generates sizeable image data that require new tools for high throughput image analysis. We performed targeted gene perturbation in muscles and acquired 3D time-series images of muscles in metamorphosis using laser scanning confocal microscopy. To quantify the phenotypic effects of gene perturbations, we designed the Fly Muscle Analysis tool (FMAj) which is based on the ImageJ and MySQL frameworks for image processing and data storage, respectively. The image analysis pipeline of FMAj contains three modules. The first module assists in adding annotations to time-lapse datasets, such as genotypes, experimental parameters and temporal reference points, which are used to compare different datasets. The second module performs segmentation and feature extraction of muscle cells and nuclei. Users can provide annotations to the detected objects, such as muscle identities and anatomical information. The third module performs comparative quantitative analysis of muscle phenotypes. We applied our tool to the phenotypic characterization of two atrophy related genes that were silenced by RNA interference. Reduction of Drosophila Tor (Target of Rapamycin) expression resulted in enhanced atrophy compared to control, while inhibition of the autophagy factor Atg9 caused suppression of atrophy and enlarged muscle fibers of abnormal morphology. FMAj enabled us to monitor the progression of atrophic and hypertrophic phenotypes of individual muscles throughout metamorphosis. We designed a new tool to visualize and quantify morphological changes of muscles in time-lapse images of Drosophila metamorphosis. Our in vivo imaging experiments revealed that evolutionarily conserved genes involved in Tor signalling and autophagy, perform similar functions in regulating muscle mass in mammals and Drosophila. Extending our approach to a genome-wide scale has the potential to identify new genes involved in muscle size regulation.
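
    As a toy illustration of the segmentation and feature-extraction step described for FMAj's second module, the sketch below thresholds a synthetic fluorescence frame and measures labeled regions with scikit-image. FMAj itself is built on ImageJ and MySQL, so this is an analogy in Python, not its code.

```python
# Threshold a frame, label connected "muscle" regions, and measure their size.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def muscle_areas(frame):
    """frame: 2-D grayscale image -> list of (label, area_in_pixels)."""
    mask = frame > threshold_otsu(frame)
    labeled = label(mask)
    return [(region.label, region.area) for region in regionprops(labeled)]

# Toy frame: one bright rectangular "muscle" on a dark background.
frame = np.zeros((100, 100))
frame[20:60, 30:50] = 1.0
print(muscle_areas(frame))
```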

  14. FMAj: a tool for high content analysis of muscle dynamics in Drosophila metamorphosis

    PubMed Central

    2014-01-01

    Background During metamorphosis in Drosophila melanogaster, larval muscles undergo two different developmental fates; one population is removed by cell death, while the other persistent subset undergoes morphological remodeling and survives to adulthood. Thanks to the ability to perform live imaging of muscle development in transparent pupae and the power of genetics, metamorphosis in Drosophila can be used as a model to study the regulation of skeletal muscle mass. However, time-lapse microscopy generates sizeable image data that require new tools for high throughput image analysis. Results We performed targeted gene perturbation in muscles and acquired 3D time-series images of muscles in metamorphosis using laser scanning confocal microscopy. To quantify the phenotypic effects of gene perturbations, we designed the Fly Muscle Analysis tool (FMAj) which is based on the ImageJ and MySQL frameworks for image processing and data storage, respectively. The image analysis pipeline of FMAj contains three modules. The first module assists in adding annotations to time-lapse datasets, such as genotypes, experimental parameters and temporal reference points, which are used to compare different datasets. The second module performs segmentation and feature extraction of muscle cells and nuclei. Users can provide annotations to the detected objects, such as muscle identities and anatomical information. The third module performs comparative quantitative analysis of muscle phenotypes. We applied our tool to the phenotypic characterization of two atrophy related genes that were silenced by RNA interference. Reduction of Drosophila Tor (Target of Rapamycin) expression resulted in enhanced atrophy compared to control, while inhibition of the autophagy factor Atg9 caused suppression of atrophy and enlarged muscle fibers of abnormal morphology. FMAj enabled us to monitor the progression of atrophic and hypertrophic phenotypes of individual muscles throughout metamorphosis. Conclusions We designed a new tool to visualize and quantify morphological changes of muscles in time-lapse images of Drosophila metamorphosis. Our in vivo imaging experiments revealed that evolutionarily conserved genes involved in Tor signalling and autophagy, perform similar functions in regulating muscle mass in mammals and Drosophila. Extending our approach to a genome-wide scale has the potential to identify new genes involved in muscle size regulation. PMID:25521203

  15. Performance evaluation of the Engineering Analysis and Data Systems (EADS) 2

    NASA Technical Reports Server (NTRS)

    Debrunner, Linda S.

    1994-01-01

    The Engineering Analysis and Data System (EADS) II (1) was installed in March 1993 to provide high performance computing for science and engineering at Marshall Space Flight Center (MSFC). EADS II increased the computing capabilities over the existing EADS facility in the areas of throughput and mass storage. EADS II includes a Vector Processor Compute System (VPCS), a Virtual Memory Compute System (VMCS), a Common File System (CFS), and a Common Output System (COS), as well as an Image Processing Station, Mini Supercomputers, and Intelligent Workstations. These facilities are interconnected by a sophisticated network system. This work considers only the performance of the VPCS and the CFS. The VPCS is a Cray YMP. The CFS is implemented on an RS 6000 using the UniTree Mass Storage System. To better meet the science and engineering computing requirements, EADS II must be monitored, its performance analyzed, and appropriate modifications for performance improvement made. Implementing this approach requires tools to assist in performance monitoring and analysis. In Spring 1994, PerfStat 2.0 was purchased to meet these needs for the VPCS and the CFS. PerfStat (2) is a set of tools that can be used to analyze both historical and real-time performance data. Its flexible design allows significant user customization. The user identifies what data is collected, how it is classified, and how it is displayed for evaluation. Both graphical and tabular displays are supported. The capability of the PerfStat tool was evaluated, appropriate modifications to EADS II to optimize throughput and enhance productivity were suggested and implemented, and the effects of these modifications on system performance were observed. In this paper, the PerfStat tool is described, then its use with EADS II is outlined briefly. Next, the evaluation of the VPCS, as well as the modifications made to the system, are described. Finally, conclusions are drawn and recommendations for future work are outlined.

  16. Numerical analysis of thermal drilling technique on titanium sheet metal

    NASA Astrophysics Data System (ADS)

    Kumar, R.; Hynes, N. Rajesh Jesudoss

    2018-05-01

    Thermal drilling is a technique used in drilling of sheet metal for various applications. It involves a conical tool rotating at high speed in order to drill the sheet metal, forming a hole with a bushing below the surface of the sheet metal. This article investigates the finite element analysis of thermal drilling on Ti6Al4V alloy sheet metal. The analysis was carried out by means of the DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Because this technique involves high-temperature deformation, output performance measures that are difficult to obtain by the experimental approach can be successfully predicted by the finite element method. Therefore, the modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution, and temperature of the workpiece.

  17. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.

  18. Excimer laser decoating of chromium titanium aluminium nitride to facilitate re-use of cutting tools

    NASA Astrophysics Data System (ADS)

    Sundar, M.; Whitehead, D.; Mativenga, P. T.; Li, L.; Cooke, K. E.

    2009-11-01

    This work reports on the technical feasibility and establishment of a process window for removing chromium titanium aluminium nitride (CrTiAlN) coating from steel substrates by laser irradiation. CrTiAlN coating has high hardness and oxidation resistance, with applications for use with cutting tools. The motivation for removing such coatings is to facilitate re-use of tooling by enabling regrinding or reshaping of a worn tool and hence promote sustainable material usage. In this work, laser decoating was performed using an excimer laser. The effect of laser fluence, number of pulses, frequency, scanning speed and laser beam overlap on the decoating performance was investigated in detail. The minimum threshold laser fluence for removing the CrTiAlN coating was lower than that of the steel substrate and this factor is beneficial in controlling the decoating process. Successful laser removal of CrTiAlN coating without noticeable damage to the steel substrate was demonstrated.

  19. Using Galaxy to Perform Large-Scale Interactive Data Analyses

    PubMed Central

    Hillman-Jackson, Jennifer; Clements, Dave; Blankenberg, Daniel; Taylor, James; Nekrutenko, Anton

    2014-01-01

    Innovations in biomedical research technologies continue to provide experimental biologists with novel and increasingly large genomic and high-throughput data resources to be analyzed. As creating and obtaining data has become easier, the key decision faced by many researchers is a practical one: where and how should an analysis be performed? Datasets are large, and analysis tool set-up and use is riddled with complexities outside of the scope of core research activities. The authors believe that Galaxy provides a powerful solution that simplifies data acquisition and analysis in an intuitive Web application, granting all researchers access to key informatics tools previously only available to computational specialists working in Unix-based environments. We will demonstrate through a series of biomedically relevant protocols how Galaxy specifically brings together (1) data retrieval from public and private sources, for example, UCSC's Eukaryote and Microbial Genome Browsers, (2) custom tools (wrapped Unix functions, format standardization/conversions, interval operations), and (3) third-party analysis tools. PMID:22700312

  20. Effect of micro-scale texturing on the cutting tool performance

    NASA Astrophysics Data System (ADS)

    Vasumathy, D.; Meena, Anil

    2018-05-01

    The present study is mainly focused on the cutting performance of the micro-scale textured carbide tools while turning AISI 304 austenitic stainless steel under dry cutting environment. The texture on the rake face of the carbide tools was fabricated by laser machining. The cutting performance of the textured tools was further compared with conventional tools in terms of cutting forces, tool wear, machined surface quality and chip curl radius. SEM and EDS analyses have been also performed to better understand the tool surface characteristics. Results show that the grooves help in breaking the tool-chip contact leading to a lesser tool-chip contact area which results in reduced iron (Fe) adhesion to the tool.

  1. An efficient framework for Java data processing systems in HPC environments

    NASA Astrophysics Data System (ADS)

    Fries, Aidan; Castañeda, Javier; Isasi, Yago; Taboada, Guillermo L.; Portell de Mora, Jordi; Sirvent, Raül

    2011-11-01

    Java is a commonly used programming language, although its use in High Performance Computing (HPC) remains relatively low. One of the reasons is a lack of libraries offering specific HPC functions to Java applications. In this paper we present a Java-based framework, called DpcbTools, designed to provide a set of functions that fill this gap. It includes a set of efficient data communication functions based on message-passing, thus providing, when a low latency network such as Myrinet is available, higher throughputs and lower latencies than standard solutions used by Java. DpcbTools also includes routines for the launching, monitoring and management of Java applications on several computing nodes by making use of JMX to communicate with remote Java VMs. The Gaia Data Processing and Analysis Consortium (DPAC) is a real case where scientific data from the ESA Gaia astrometric satellite will be entirely processed using Java. In this paper we describe the main elements of DPAC and its usage of the DpcbTools framework. We also assess the usefulness and performance of DpcbTools through its performance evaluation and the analysis of its impact on some DPAC systems deployed in the MareNostrum supercomputer (Barcelona Supercomputing Center).

  2. Energy-Saving Melting and Revert Reduction Technology (E-SMARRT): Use of Laser Engineered Net Shaping for Rapid Manufacturing of Dies with Protective Coatings and Improved Thermal Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brevick, Jerald R.

    2014-06-13

    In the high pressure die casting process, molten metal is introduced into a die cavity at high pressure and velocity, enabling castings of thin wall section and complex geometry to be obtained. Traditional die materials have been hot work die steels, commonly H13. Manufacture of the dies involves machining the desired geometry from monolithic blocks of annealed tool steel, heat treating to desired hardness and toughness, and final machining, grinding and polishing. The die is fabricated with internal water cooling passages created by drilling. These materials and fabrication methods have been used for many years; however, there are limitations. Tool steels have relatively low thermal conductivity, and as a result, it takes time to remove the heat from the tool steel via the drilled internal water cooling passages. Furthermore, the low thermal conductivity generates large thermal gradients at the die cavity surfaces, which ultimately leads to thermal fatigue cracking on the surfaces of the die steel. The high die surface temperatures also promote the metallurgical bonding of the aluminum casting alloy to the surface of the die steel (soldering). In terms of process efficiency, these tooling limitations reduce the number of die castings that can be made per unit time by increasing the cycle time required for cooling, and by increasing the downtime and cost to replace tooling which has failed either by soldering or by thermal fatigue cracking (heat checking). The objective of this research was to evaluate the feasibility of designing, fabricating, and testing high pressure die casting tooling having properties equivalent to H13 on the surface in contact with molten casting alloy, for high temperature and high velocity molten metal erosion resistance, but with the ability to conduct heat rapidly to interior water cooling passages. A layered bimetallic tool design was selected, and the design was evaluated for thermal and mechanical performance via finite element analysis. H13 was retained as the exterior layer of the tooling, while commercially pure copper was chosen for the interior structure of the tooling. The tooling was fabricated by traditional machining of the copper substrate, and H13 powder was deposited on the copper via the Laser Engineered Net Shaping (LENS) process. The H13 deposition layer was then final machined by traditional methods. Two tooling components were designed and fabricated: a thermal fatigue test specimen, and a core for a commercial aluminum high pressure die casting tool. The bimetallic thermal fatigue specimen demonstrated promising performance during testing, and the test results were used to improve the design and LENS deposition methods for subsequent manufacture of the commercial core. Results of the thermal finite element analysis for the thermal fatigue test specimen indicate that it has the ability to lose heat to the internal water cooling passages, and to external spray cooling, significantly faster than a monolithic H13 thermal fatigue sample. The commercial core is currently in the final stages of fabrication, and will be evaluated in an actual production environment at Shiloh Die Casting. In this research, the feasibility of designing and fabricating copper/H13 bimetallic die casting tooling via LENS processing, for the purpose of improving die casting process efficiency, is demonstrated.

  3. Assessment of competence in video-assisted thoracoscopic surgery lobectomy: A Danish nationwide study.

    PubMed

    Petersen, René Horsleben; Gjeraa, Kirsten; Jensen, Katrine; Møller, Lars Borgbjerg; Hansen, Henrik Jessen; Konge, Lars

    2018-04-18

    Competence in video-assisted thoracoscopic surgery lobectomy has previously been established on the basis of the number of procedures performed, but this approach does not ensure competence. Specific assessment tools, such as the newly developed video-assisted thoracoscopic surgery lobectomy assessment tool, allow for structured and objective assessment of competence. Our aim was to provide validity evidence for the video-assisted thoracoscopic surgery lobectomy assessment tool. Video recordings of 60 video-assisted thoracoscopic surgery lobectomies performed by 18 thoracic surgeons were rated using the video-assisted thoracoscopic surgery lobectomy assessment tool. All 4 centers of thoracic surgery in Denmark participated in the study. Two video-assisted thoracoscopic surgery experts rated the videos. They were blinded to surgeon and center. The total internal consistency reliability, Cronbach's alpha, was 0.93. Inter-rater reliability between the 2 raters was Pearson's r = 0.71 (P < .001). The mean video-assisted thoracoscopic surgery lobectomy assessment tool score for the 10 procedures performed by beginners was 22.1 (standard deviation [SD], 8.6); for the 28 procedures performed by the intermediate surgeons, 31.2 (SD, 4.4); and for the 20 procedures performed by experts, 35.9 (SD, 2.9) (P < .001). Bonferroni post hoc tests showed that experts were significantly better than intermediates (P < .008) and beginners (P < .001). Intermediates' mean scores were significantly better than beginners' (P < .001). The pass/fail standard calculated using the contrasting groups method was 31 points. One of the beginners passed, and 2 experts failed the test. Validity evidence was provided for a newly developed assessment tool for video-assisted thoracoscopic surgery lobectomy (the video-assisted thoracoscopic surgery lobectomy assessment tool) in a clinical setting. The discriminatory ability among expert surgeons, intermediate surgeons, and beginners proved highly significant. The video-assisted thoracoscopic surgery lobectomy assessment tool could be an important aid in the future training and certification of thoracic surgeons. Copyright © 2018 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  4. Global Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnamoorthy, Sriram; Daily, Jeffrey A.; Vishnu, Abhinav

    2015-11-01

    Global Arrays (GA) is a distributed-memory programming model that allows for shared-memory-style programming combined with one-sided communication, to create a set of tools that combine high performance with ease-of-use. GA exposes a relatively straightforward programming abstraction, while supporting fully-distributed data structures, locality of reference, and high-performance communication. GA was originally formulated in the early 1990’s to provide a communication layer for the Northwest Chemistry (NWChem) suite of chemistry modeling codes that was being developed concurrently.
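
    To make the one-sided, shared-memory-style idea concrete without reproducing GA's own API, the hedged sketch below uses mpi4py RMA windows: one rank writes directly into another rank's exposed buffer with no matching receive. Buffer sizes and ranks are illustrative; this shows the communication style GA builds on, not GA itself.

```python
# One-sided communication with mpi4py RMA windows (run with at least 2 ranks).
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each rank exposes a small local buffer as part of a global window.
local = np.zeros(4, dtype='d')
win = MPI.Win.Create(local, comm=comm)

win.Fence()
if rank == 0 and comm.Get_size() > 1:
    # Rank 0 writes directly into rank 1's memory; rank 1 posts no receive.
    win.Put(np.arange(4, dtype='d'), 1)
win.Fence()

if rank == 1:
    print("rank 1 now holds:", local)
win.Free()
```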

  5. Interactions Between Structure and Processing that Control Moisture Uptake in High-Performance Polycyanurates (Briefing Charts)

    DTIC Science & Technology

    2015-03-24

    Briefing charts; distribution is unlimited. Interactions Between Structure and Processing that Control Moisture Uptake in High-Performance Polycyanurates. Presenter: Dr... (Edwards AFB, CA; California State University, Long Beach, CA 90840). Outline: Basic Studies of Moisture Uptake in Cyanate Ester Networks; Background ...; Motivation; SOTA Theories of Moisture Uptake in Thermosetting Networks; New Tools and New Discoveries; Unresolved Issues and Ways to Address Them.

  6. orthAgogue: an agile tool for the rapid prediction of orthology relations.

    PubMed

    Ekseth, Ole Kristian; Kuiper, Martin; Mironov, Vladimir

    2014-03-01

    The comparison of genes and gene products across species depends on high-quality tools to determine the relationships between gene or protein sequences from various species. Although some excellent applications are available and widely used, their performance leaves room for improvement. We developed orthAgogue: a multithreaded C application for high-speed estimation of homology relations in massive datasets, operated via a flexible and easy command-line interface. The orthAgogue software is distributed under the GNU license. The source code and binaries compiled for Linux are available at https://code.google.com/p/orthagogue/.

  7. An HTML Tool for Production of Interactive Stereoscopic Compositions.

    PubMed

    Chistyakov, Alexey; Soto, Maria Teresa; Martí, Enric; Carrabina, Jordi

    2016-12-01

    The benefits of stereoscopic vision in medical applications were appreciated and have been thoroughly studied for more than a century. The usage of stereoscopic displays has a proven positive impact on performance in various medical tasks. At the same time, the market of 3D-enabled technologies is blooming. New high resolution stereo cameras, TVs, projectors, monitors, and head mounted displays are becoming available. This equipment, complemented with a corresponding application program interface (API), could be relatively easily implemented in a system. Such complexes could open new possibilities for medical applications exploiting stereoscopic depth. This work proposes a tool for the production of interactive stereoscopic graphical user interfaces, which could represent a software layer for web-based medical systems facilitating the stereoscopic effect. The tool's operation mode and the results of the subjective and objective performance tests conducted are then presented.

  8. Impact and Penetration Simulations for Composite Wing-like Structures

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Low-speed impact causing either no damage or only minimal damage, and high-speed impact causing severe laminate damage and possible penetration of the structure, were to be considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The analytical tools for impact and penetration simulations were assessed with regard to accuracy, modeling, and damage modeling, as well as robustness, efficiency, and usage in a wing design environment. Following the qualitative assessment, selected quantitative evaluations will be performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.

  9. Cognitive ergonomics of operational tools

    NASA Astrophysics Data System (ADS)

    Lüdeke, A.

    2012-10-01

    Control systems have become increasingly more powerful over the past decades. The availability of high data throughput and sophisticated graphical interactions has opened a variety of new possibilities. But has this helped to provide intuitive, easy to use applications to simplify the operation of modern large scale accelerator facilities? We will discuss what makes an application useful to operation and what is necessary to make a tool easy to use. We will show that even the implementation of a small number of simple application design rules can help to create ergonomic operational tools. The author is convinced that such tools do indeed help to achieve higher beam availability and better beam performance at accelerator facilities.

  10. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
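
    As a schematic of the kind of detection-and-filtering step such a pipeline performs, the sketch below keeps only variant calls whose read depth and variant allele frequency clear user-set thresholds. The records, genes, positions, and cut-offs are all made up for illustration and are not the ICO tool's actual defaults.

```python
# Keep a variant call only if depth and variant allele frequency (VAF)
# exceed configurable thresholds.
def filter_variants(variants, min_depth=30, min_vaf=0.15):
    kept = []
    for v in variants:
        vaf = v["alt_reads"] / v["depth"] if v["depth"] else 0.0
        if v["depth"] >= min_depth and vaf >= min_vaf:
            kept.append({**v, "vaf": round(vaf, 3)})
    return kept

calls = [
    {"gene": "GENE_A", "pos": 101, "depth": 180, "alt_reads": 85},
    {"gene": "GENE_B", "pos": 202, "depth": 22,  "alt_reads": 11},  # low depth
    {"gene": "GENE_C", "pos": 303, "depth": 200, "alt_reads": 6},   # likely noise
]
print(filter_variants(calls))
```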

  11. GAC: Gene Associations with Clinical, a web based application

    PubMed Central

    Zhang, Xinyan; Rupji, Manali; Kowalski, Jeanne

    2018-01-01

    We present GAC, a Shiny R-based tool for interactive visualization of clinical associations based on high-dimensional data. The tool provides a web-based suite to perform supervised principal component analysis (SuperPC), an approach that combines high-dimensional data, such as gene expression, with clinical data to infer clinical associations. We extended the approach to address binary outcomes, in addition to continuous and time-to-event data, in our package, thereby increasing the use and flexibility of SuperPC. Additionally, the tool provides an interactive visualization for summarizing results, based on a forest plot, for both binary and time-to-event data. In summary, the GAC suite of tools provides a one-stop shop for conducting statistical analyses to identify and visualize the association between a clinical outcome of interest and high-dimensional data types, such as genomic data. Our GAC package has been implemented in R and is available via http://shinygispa.winship.emory.edu/GAC/. The developmental repository is available at https://github.com/manalirupji/GAC. PMID:29263780
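
    The SuperPC approach summarized above follows a simple sequence: score each feature by its univariate association with the outcome, keep the top-scoring features, take their first principal component, and relate that component to the clinical outcome. A rough sketch for a binary outcome is shown below; the synthetic data, feature cutoff, and use of scikit-learn are illustrative assumptions rather than the GAC implementation (which is written in R):

    ```python
    # Rough sketch of a SuperPC-style analysis for a binary clinical outcome.
    # Thresholds, data, and library choices are illustrative assumptions.
    import numpy as np
    from scipy import stats
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 1000))                       # 200 samples x 1000 genes (synthetic)
    y = (X[:, 0] + rng.normal(0, 1, 200) > 0).astype(int)  # outcome loosely tied to gene 0

    # 1. Score each gene by its univariate association with the outcome (t-test).
    t_scores = np.array([stats.ttest_ind(X[y == 1, j], X[y == 0, j]).statistic
                         for j in range(X.shape[1])])

    # 2. Keep the top-scoring genes (cutoff chosen arbitrarily here).
    keep = np.argsort(np.abs(t_scores))[-50:]

    # 3. First principal component of the reduced matrix = "supervised PC".
    pc1 = PCA(n_components=1).fit_transform(X[:, keep])

    # 4. Relate the supervised PC to the clinical outcome.
    model = LogisticRegression().fit(pc1, y)
    print("association coefficient:", model.coef_.ravel()[0])
    ```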

  12. Progress in development of coated indexable cemented carbide inserts for machining of iron based work piece materials

    NASA Astrophysics Data System (ADS)

    Czettl, C.; Pohler, M.

    2016-03-01

    Increasing demands on the material properties of iron-based work piece materials, e.g. for the turbine industry, complicate the machining process and reduce the lifetime of cutting tools. Therefore, improved tool solutions, adapted to the requirements of the desired application, have to be developed. In particular, the interplay of macro- and micro-geometry, substrate material, coating, and post-treatment processes is crucial for the durability of modern high performance tool solutions. Improved and novel analytical methods allow a detailed understanding of the material properties responsible for the wear behaviour of the tools and support the knowledge-based development of tailored cutting materials for selected applications. One important factor for such a solution is the proper choice of coating material, which can be synthesized by physical or chemical vapor deposition techniques. Within this work an overview of state-of-the-art coated carbide grades is presented, and application examples are shown to demonstrate their high efficiency. Machining processes for a material range from cast iron and low carbon steels to high-alloyed steels are covered.

  13. A Systematic Review of Tools Used to Assess Team Leadership in Health Care Action Teams.

    PubMed

    Rosenman, Elizabeth D; Ilgen, Jonathan S; Shandro, Jamie R; Harper, Amy L; Fernandez, Rosemarie

    2015-10-01

    To summarize the characteristics of tools used to assess leadership in health care action (HCA) teams. HCA teams are interdisciplinary teams performing complex, critical tasks under high-pressure conditions. The authors conducted a systematic review of the PubMed/MEDLINE, CINAHL, ERIC, EMBASE, PsycINFO, and Web of Science databases, key journals, and review articles published through March 2012 for English-language articles that applied leadership assessment tools to HCA teams in all specialties. Pairs of reviewers assessed identified articles for inclusion and exclusion criteria and abstracted data on study characteristics, tool characteristics, and validity evidence. Of the 9,913 abstracts screened, 83 studies were included. They described 61 team leadership assessment tools. Forty-nine tools (80%) provided behaviors, skills, or characteristics to define leadership. Forty-four tools (72%) assessed leadership as one component of a larger assessment, 13 tools (21%) identified leadership as the primary focus of the assessment, and 4 (7%) assessed leadership style. Fifty-three studies (64%) assessed leadership at the team level; 29 (35%) did so at the individual level. Assessments of simulated (n = 55) and live (n = 30) patient care events were performed. Validity evidence included content validity (n = 75), internal structure (n = 61), relationship to other variables (n = 44), and response process (n = 15). Leadership assessment tools applied to HCA teams are heterogeneous in content and application. Comparisons between tools are limited by study variability. A systematic approach to team leadership tool development, evaluation, and implementation will strengthen understanding of this important competency.

  14. Reliable and valid assessment of Lichtenstein hernia repair skills.

    PubMed

    Carlsen, C G; Lindorff-Larsen, K; Funch-Jensen, P; Lund, L; Charles, P; Konge, L

    2014-08-01

    Lichtenstein hernia repair is a common surgical procedure and one of the first procedures performed by a surgical trainee. However, formal assessment tools developed for this procedure are few and sparsely validated. The aim of this study was to determine the reliability and validity of an assessment tool designed to measure surgical skills in Lichtenstein hernia repair. Key issues were identified through a focus group interview. On this basis, an assessment tool with eight items was designed. Ten surgeons and surgical trainees (four experts, three intermediates, and three novices) were video recorded while performing Lichtenstein hernia repair. The videos were blindly and individually assessed by three raters (surgical consultants) using the assessment tool. Based on these assessments, validity and reliability were explored. The internal consistency of the items was high (Cronbach's alpha = 0.97). The inter-rater reliability was very good, with an intra-class correlation coefficient (ICC) = 0.93. Generalizability analysis showed a coefficient above 0.8 even with one rater. The coefficient improved to 0.92 if three raters were used. One-way analysis of variance found a significant difference between the three groups, which indicates construct validity, p < 0.001. Lichtenstein hernia repair skills can be assessed blindly by a single rater in a reliable and valid fashion with the new procedure-specific assessment tool. We recommend this tool for future assessment of trainees performing Lichtenstein hernia repair to ensure that the objectives of competency-based surgical training are met.
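
    The reliability figures quoted above rest on standard computations; for instance, Cronbach's alpha for a ratings-by-items score matrix follows the usual textbook formula. The sketch below uses synthetic ratings and is a generic implementation, not the authors' analysis:

    ```python
    # Generic Cronbach's alpha computation (illustrative; synthetic ratings).
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """scores: assessments (rows) x items (columns) matrix of ratings."""
        n_items = scores.shape[1]
        item_variances = scores.var(axis=0, ddof=1)      # variance of each item
        total_variance = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
        return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

    # 10 recorded procedures rated on 8 items (random numbers, for illustration only).
    rng = np.random.default_rng(1)
    base = rng.normal(size=(10, 1))
    ratings = base + 0.3 * rng.normal(size=(10, 8))      # correlated items -> high alpha
    print(round(cronbach_alpha(ratings), 2))
    ```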

  15. HRLSim: a high performance spiking neural network simulator for GPGPU clusters.

    PubMed

    Minkovich, Kirill; Thibeault, Corey M; O'Brien, Michael John; Nogin, Aleksey; Cho, Youngkwan; Srinivasa, Narayan

    2014-02-01

    Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRL Spiking Simulator (HRLSim). This simulator is suitable for implementation on a cluster of general purpose graphical processing units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.
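
    HRLSim itself targets GPGPU clusters, but the class of model it executes can be illustrated with a handful of leaky integrate-and-fire neurons stepped forward in NumPy. The parameter values and the simple Euler update below are illustrative choices, not HRLSim's numerics:

    ```python
    # Tiny leaky integrate-and-fire network sketch (not HRLSim code; all
    # parameters are illustrative). Membrane potentials use a simple Euler step.
    import numpy as np

    n_neurons, n_steps, dt = 100, 1000, 0.1            # 100 neurons, 100 ms at 0.1 ms steps
    tau, v_rest, v_thresh, v_reset = 10.0, -65.0, -50.0, -65.0

    rng = np.random.default_rng(0)
    weights = rng.normal(0.0, 0.5, size=(n_neurons, n_neurons))   # recurrent synapses
    v = np.full(n_neurons, v_rest)
    spike_counts = np.zeros(n_neurons, dtype=int)

    for _ in range(n_steps):
        drive = rng.normal(16.0, 2.0, size=n_neurons)             # noisy external input
        spiked = v >= v_thresh
        spike_counts += spiked
        v[spiked] = v_reset                                       # reset after a spike
        recurrent = weights @ spiked.astype(float)                # input from spiking neighbours
        v += dt * (-(v - v_rest) + drive + recurrent) / tau       # leaky integration

    print("mean firing rate (Hz):", spike_counts.mean() / (n_steps * dt / 1000.0))
    ```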

  16. Improvement of Self-regulated Learning in Mathematics through a Hypermedia Application: Differences based on Academic Performance and Previous Knowledge.

    PubMed

    Cueli, Marisol; Rodríguez, Celestino; Areces, Débora; García, Trinidad; González-Castro, Paloma

    2017-12-04

    Self-regulation by the student is crucial in learning Mathematics through hypermedia applications and is an even greater challenge in these IT environments. Two aims are formulated. First, to analyze the effectiveness of a hypermedia tool in improving perceived knowledge of self-regulatory strategies and the perceived use of the planning, execution, and assessment strategy among students with low, medium, and high levels of academic performance. Second, to analyze the effectiveness of the hypermedia tool in improving perceived use of the strategy for planning, monitoring, and evaluating among students with low, medium, or high perceived knowledge. Participants were 624 students (aged 10-13), classified into a treatment group (TG; 391) and a comparative group (CG; 233). They completed a questionnaire on perceived knowledge (Perceived Knowledge of Self-Regulatory Strategies) and another on perceived use of the strategy for planning, performing, and evaluating (Inventory of Self-regulatory Learning Processes). Univariate covariance analyses (ANCOVAs) and Student's t-tests were used. ANCOVA results were not statistically significant. However, the linear contrast indicated a significant improvement in perceived knowledge of strategies among the TG with low, medium, and high academic performance (p ≤ .001). Results are discussed in the light of past and future research.

  17. An automated protocol for performance benchmarking a widefield fluorescence microscope.

    PubMed

    Halter, Michael; Bier, Elianna; DeRose, Paul C; Cooksey, Gregory A; Choquette, Steven J; Plant, Anne L; Elliott, John T

    2014-11-01

    Widefield fluorescence microscopy is a widely used tool for visually assessing biological samples and for quantifying cell responses. Despite its widespread use in high content analysis and other imaging applications, few published methods exist for evaluating and benchmarking the analytical performance of a microscope. Easy-to-use benchmarking methods would facilitate the use of fluorescence imaging as a quantitative analytical tool in research applications, and would aid the determination of instrumental method validation for commercial product development applications. We describe and evaluate an automated method to characterize a fluorescence imaging system's performance by benchmarking the detection threshold, saturation, and linear dynamic range against a reference material. The benchmarking procedure is demonstrated using two different materials as the reference material, uranyl-ion-doped glass and Schott 475 GG filter glass. Both are suitable candidate reference materials that are homogeneously fluorescent and highly photostable, and the Schott 475 GG filter glass is currently commercially available. In addition to benchmarking the analytical performance, we also demonstrate that the reference materials provide for accurate day-to-day intensity calibration. Published 2014 Wiley Periodicals Inc. This article is a US government work and, as such, is in the public domain in the United States of America.
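
    The three benchmarked quantities, detection threshold, saturation, and linear dynamic range, can in principle be estimated from an intensity-versus-exposure series plus dark frames. The sketch below uses a synthetic response curve and arbitrary cutoff definitions for illustration; it is not the published protocol:

    ```python
    # Illustrative estimation of detection threshold, saturation and linear range
    # from an exposure series (synthetic data; not the published protocol).
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic exposure series: linear response that saturates at 4000 counts,
    # plus ~20 counts of read noise. Values are illustrative, not instrument data.
    exposure = np.linspace(0, 1000, 200)                     # ms
    measured = np.clip(5.0 * exposure, 0, 4000) + rng.normal(0, 20, exposure.size)
    dark = rng.normal(0, 20, 100)                            # dark (zero-exposure) frames

    noise_floor = dark.std(ddof=1)
    detection_threshold = 3 * noise_floor                    # assumed 3-sigma criterion
    saturation_level = measured.max()

    # Linear dynamic range: usable span between threshold and ~90% of saturation.
    linear = (measured > detection_threshold) & (measured < 0.9 * saturation_level)
    dynamic_range = measured[linear].max() / detection_threshold

    print(f"threshold ~{detection_threshold:.0f} counts, "
          f"saturation ~{saturation_level:.0f} counts, "
          f"dynamic range ~{dynamic_range:.0f}x")
    ```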

  18. Sports teams as superorganisms: implications of sociobiological models of behaviour for research and practice in team sports performance analysis.

    PubMed

    Duarte, Ricardo; Araújo, Duarte; Correia, Vanda; Davids, Keith

    2012-08-01

    Significant criticisms have emerged of the way that collective behaviours in team sports have traditionally been evaluated. A major recommendation has been for future research and practice to focus on the interpersonal relationships developed between team players during performance. Most research has typically investigated team game performance in subunits (attack or defence), rather than considering the interactions of performers within the whole team. In this paper, we offer the view that team performance analysis could benefit from the adoption of biological models used to explain how repeated interactions between grouping individuals scale to emergent social collective behaviours. We highlight the advantages of conceptualizing sports teams as functionally integrated 'superorganisms' and discuss innovative measurement tools which might be used to capture the superorganismic properties of sports teams. These tools are suitable for revealing the idiosyncratic collective behaviours underlying the cooperative and competitive tendencies of different sports teams, particularly their coordination of labour and the most frequent channels of communication and patterns of interaction between team players. The principles and tools presented here can serve as the basis for novel approaches and applications of performance analysis devoted to understanding sports teams as cohesive, functioning, high-order organisms exhibiting their own peculiar behavioural patterns.

  19. 2-D Circulation Control Airfoil Benchmark Experiments Intended for CFD Code Validation

    NASA Technical Reports Server (NTRS)

    Englar, Robert J.; Jones, Gregory S.; Allan, Brian G.; Lin, Johb C.

    2009-01-01

    A current NASA Research Announcement (NRA) project being conducted by Georgia Tech Research Institute (GTRI) personnel and NASA collaborators includes the development of Circulation Control (CC) blown airfoils to improve subsonic aircraft high-lift and cruise performance. The emphasis of this program is the development of CC active flow control concepts for high-lift augmentation, drag control, and cruise efficiency. The project includes work by NASA research engineers, and the CFD validation and flow physics experimental research are part of NASA's systematic approach to developing design and optimization tools for CC applications to fixed-wing aircraft. The design space for CESTOL type aircraft is focusing on geometries that depend on advanced flow control technologies that include Circulation Control aerodynamics. The ability to consistently predict advanced aircraft performance requires improvements in design tools to include these advanced concepts. Validation of these tools will be based on experimental methods applied to complex flows that go beyond conventional aircraft modeling techniques. This paper focuses on recent and ongoing benchmark high-lift experiments and CFD efforts intended to provide 2-D CFD validation data sets related to NASA's Cruise Efficient Short Take Off and Landing (CESTOL) study. Both the experimental data and related CFD predictions are discussed.

  20. Warp-X: A new exascale computing platform for beam–plasma simulations

    DOE PAGES

    Vay, J. -L.; Almgren, A.; Bell, J.; ...

    2018-01-31

    Turning the current experimental plasma accelerator state-of-the-art from a promising technology into mainstream scientific tools depends critically on high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales. As part of the U.S. Department of Energy's Exascale Computing Project, a team from Lawrence Berkeley National Laboratory, in collaboration with teams from SLAC National Accelerator Laboratory and Lawrence Livermore National Laboratory, is developing a new plasma accelerator simulation tool that will harness the power of future exascale supercomputers for high-performance modeling of plasma accelerators. We present the various components of the codes, such as the new Particle-In-Cell Scalable Application Resource (PICSAR) and the redesigned adaptive mesh refinement library AMReX, which are combined with redesigned elements of the Warp code in the new WarpX software. Lastly, the code structure, status, early examples of applications and plans are discussed.

  1. A simple tool for tubing modification to improve spiral high-speed counter-current chromatography for protein purification

    PubMed Central

    Ito, Yoichiro; Ma, Xiaofeng; Clary, Robert

    2016-01-01

    A simple tool is introduced which can modify the shape of tubing to enhance the partition efficiency in high-speed countercurrent chromatography. It consists of a pair of interlocking identical gears, each coaxially holding a pressing wheel to intermittently compress plastic tubing in 0 – 10 mm length at every 1 cm interval. The performance of the processed tubing is examined in protein separation with 1.6 mm ID PTFE tubing intermittently pressed in 3 mm and 10 mm width both at 10 mm intervals at various flow rates and revolution speeds. A series of experiments was performed with a polymer phase system composed of polyethylene glycol and dibasic potassium phosphate each at 12.5% (w/w) in deionized water using three protein samples. Overall results clearly demonstrate that the compressed tubing can yield substantially higher peak resolution than the non-processed tubing. The simple tubing modifier is very useful for separation of proteins with high-speed countercurrent chromatography. PMID:27818942

  2. Warp-X: A new exascale computing platform for beam–plasma simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vay, J. -L.; Almgren, A.; Bell, J.

    Turning the current experimental plasma accelerator state-of-the-art from a promising technology into mainstream scientific tools depends critically on high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales. As part of the U.S. Department of Energy's Exascale Computing Project, a team from Lawrence Berkeley National Laboratory, in collaboration with teams from SLAC National Accelerator Laboratory and Lawrence Livermore National Laboratory, is developing a new plasma accelerator simulation tool that will harness the power of future exascale supercomputers for high-performance modeling of plasma accelerators. We present the various components of the codes, such as the new Particle-In-Cell Scalable Application Resource (PICSAR) and the redesigned adaptive mesh refinement library AMReX, which are combined with redesigned elements of the Warp code in the new WarpX software. Lastly, the code structure, status, early examples of applications and plans are discussed.

  3. A simple tool for tubing modification to improve spiral high-speed counter-current chromatography for protein purification.

    PubMed

    Ito, Yoichiro; Ma, Xiaofeng; Clary, Robert

    2016-01-01

    A simple tool is introduced which can modify the shape of tubing to enhance the partition efficiency in high-speed countercurrent chromatography. It consists of a pair of interlocking identical gears, each coaxially holding a pressing wheel to intermittently compress plastic tubing in 0 - 10 mm length at every 1 cm interval. The performance of the processed tubing is examined in protein separation with 1.6 mm ID PTFE tubing intermittently pressed in 3 mm and 10 mm width both at 10 mm intervals at various flow rates and revolution speeds. A series of experiments was performed with a polymer phase system composed of polyethylene glycol and dibasic potassium phosphate each at 12.5% (w/w) in deionized water using three protein samples. Overall results clearly demonstrate that the compressed tubing can yield substantially higher peak resolution than the non-processed tubing. The simple tubing modifier is very useful for separation of proteins with high-speed countercurrent chromatography.

  4. bioalcidae, samjs and vcffilterjs: object-oriented formatters and filters for bioinformatics files.

    PubMed

    Lindenbaum, Pierre; Redon, Richard

    2018-04-01

    Reformatting and filtering bioinformatics files are common tasks for bioinformaticians. Standard Linux tools and specific programs are usually used to perform such tasks, but there is still a gap between using these tools and the programming interface of some existing libraries. In this study, we developed a set of tools, namely bioalcidae, samjs and vcffilterjs, that reformat or filter files using a JavaScript engine or a pure Java expression, taking advantage of the Java API for high-throughput sequencing data (htsjdk). The tools are available at https://github.com/lindenb/jvarkit (contact: pierre.lindenbaum@univ-nantes.fr).
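
    The central idea, evaluating a user-supplied expression against every record of a bioinformatics file, can be mimicked in a few lines. The sketch below evaluates a Python expression over parsed VCF fields purely for illustration; it is not the jvarkit tools' JavaScript/Java expression syntax, and eval of untrusted expressions would need sandboxing in practice:

    ```python
    # Illustration of expression-based record filtering (conceptually similar to
    # vcffilterjs, but this is NOT the jvarkit syntax; a Python expression is
    # evaluated per record instead of a JavaScript/Java one).

    def filter_vcf(lines, expression):
        """Yield VCF lines for which the user expression evaluates to True."""
        for line in lines:
            if line.startswith("#"):           # pass headers through untouched
                yield line
                continue
            chrom, pos, _id, ref, alt, qual = line.split("\t")[:6]
            record = {"chrom": chrom, "pos": int(pos), "ref": ref,
                      "alt": alt, "qual": float(qual)}
            if eval(expression, {}, record):   # caution: trusted expressions only
                yield line

    vcf = ["#CHROM\tPOS\tID\tREF\tALT\tQUAL",
           "chr1\t12345\t.\tA\tT\t88.0",
           "chr1\t22222\t.\tG\tC\t12.5"]
    for kept in filter_vcf(vcf, "qual >= 30 and alt != ref"):
        print(kept)
    ```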

  5. Modeling and simulation of continuous wave velocity radar based on third-order DPLL

    NASA Astrophysics Data System (ADS)

    Di, Yan; Zhu, Chen; Hong, Ma

    2015-02-01

    The second-order digital phase-locked loop (DPLL) widely used in traditional continuous-wave (CW) velocity radar performs poorly under highly dynamic conditions; using a third-order DPLL can improve performance. Firstly, the echo signal model of CW radar is given. Secondly, theoretical derivations of the tracking performance under different velocity conditions are given. Finally, a simulation model of CW radar is established using the Simulink tool. The tracking performance of the two kinds of DPLL under different acceleration and jerk conditions is studied with this model. The results show that the third-order DPLL has better performance under highly dynamic conditions. This model provides a platform for further research on CW radar.
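
    The reason loop order matters can be seen in a generic discrete-time phase-tracking loop: each extra integrator in the loop filter removes the steady-state error for one more derivative of the input phase, so a third-order loop can follow an accelerating (frequency-ramping) echo with vanishing error. The sketch below is a textbook-style DPLL with arbitrary illustrative gains, not the authors' Simulink model:

    ```python
    # Generic third-order DPLL tracking a frequency ramp (accelerating target).
    # Gains and signal parameters are illustrative; this is not the paper's model.
    import numpy as np

    n, dt = 5000, 1e-3
    t = np.arange(n) * dt
    # Reference phase of an "accelerating" echo: frequency ramps linearly with time.
    ref_phase = 2 * np.pi * (50.0 * t + 0.5 * 40.0 * t**2)   # f0 = 50 Hz, 40 Hz/s ramp

    k1, k2, k3 = 0.3, 0.02, 0.001    # proportional / integral / double-integral gains
    nco_phase = 0.0
    acc1 = acc2 = 0.0                # loop-filter integrator states
    err = np.zeros(n)

    for i in range(n):
        e = np.angle(np.exp(1j * (ref_phase[i] - nco_phase)))  # wrapped phase error
        err[i] = e
        acc1 += k2 * e                    # first integrator
        acc2 += k3 * acc1                 # second integrator (third-order term)
        nco_freq = k1 * e + acc1 + acc2   # loop-filter output drives the NCO
        nco_phase += nco_freq

    print("final phase error (rad):", abs(err[-1]))
    ```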

  6. Cognitive Readiness Assessment and Reporting: An Open Source Mobile Framework for Operational Decision Support and Performance Improvement

    ERIC Educational Resources Information Center

    Heric, Matthew; Carter, Jenn

    2011-01-01

    Cognitive readiness (CR) and performance for operational time-critical environments are continuing points of focus for military and academic communities. In response to this need, we designed an open source interactive CR assessment application as a highly adaptive and efficient open source testing administration and analysis tool. It is capable…

  7. Performance of Fourth-Grade Students in the 2012 NAEP Computer-Based Writing Pilot Assessment: Scores, Text Length, and Use of Editing Tools. Working Paper Series. NCES 2015-119

    ERIC Educational Resources Information Center

    White, Sheida; Kim, Young Yee; Chen, Jing; Liu, Fei

    2015-01-01

    This study examined whether or not fourth-graders could fully demonstrate their writing skills on the computer and factors associated with their performance on the National Assessment of Educational Progress (NAEP) computer-based writing assessment. The results suggest that high-performing fourth-graders (those who scored in the upper 20 percent…

  8. SiGe BiCMOS manufacturing platform for mmWave applications

    NASA Astrophysics Data System (ADS)

    Kar-Roy, Arjun; Howard, David; Preisler, Edward; Racanelli, Marco; Chaudhry, Samir; Blaschke, Volker

    2010-10-01

    TowerJazz offers high-volume manufacturable commercial SiGe BiCMOS technology platforms to address the mmWave market. In this paper, first, the SiGe BiCMOS process technology platforms such as SBC18 and SBC13 are described. These manufacturing platforms integrate a 200 GHz fT/fMAX SiGe NPN with deep trench isolation into 0.18 μm and 0.13 μm node CMOS processes, along with high-density 5.6 fF/μm2 stacked MIM capacitors, high-value polysilicon resistors, high-Q metal resistors, lateral PNP transistors, and triple-well isolation using a deep n-well for mixed-signal integration, and multiple varactors and compact high-Q inductors for RF needs. Second, design enablement tools that maximize performance and lower costs and time to market, such as scalable PSP and HICUM models, statistical and Xsigma models, reliability modeling tools, process control model tools, an inductor toolbox and transmission line models, are described. Finally, demonstrations in silicon for mmWave applications in the areas of optical networking, mobile broadband, phased array radar, collision avoidance radar and W-band imaging are listed.

  9. UTOPIA-User-Friendly Tools for Operating Informatics Applications.

    PubMed

    Pettifer, S R; Sinnott, J R; Attwood, T K

    2004-01-01

    Bioinformaticians routinely analyse vast amounts of information held both in large remote databases and in flat data files hosted on local machines. The contemporary toolkit available for this purpose consists of an ad hoc collection of data manipulation tools, scripting languages and visualization systems; these must often be combined in complex and bespoke ways, the result frequently being an unwieldy artefact capable of one specific task, which cannot easily be exploited or extended by other practitioners. Owing to the sizes of current databases and the scale of the analyses necessary, routine bioinformatics tasks are often automated, but many still require the unique experience and intuition of human researchers: this requires tools that support real-time interaction with complex datasets. Many existing tools have poor user interfaces and limited real-time performance when applied to realistically large datasets; much of the user's cognitive capacity is therefore focused on controlling the tool rather than on performing the research. The UTOPIA project is addressing some of these issues by building reusable software components that can be combined to make useful applications in the field of bioinformatics. Expertise in the fields of human computer interaction, high-performance rendering, and distributed systems is being guided by bioinformaticians and end-user biologists to create a toolkit that is both architecturally sound from a computing point of view, and directly addresses end-user and application-developer requirements.

  10. Planning That Matters: Helping Schools Engage in Collaborative, Strategic Problem Solving. Policy Brief

    ERIC Educational Resources Information Center

    Jerald, Craig

    2005-01-01

    Earlier this year, the Prichard Committee for Academic Excellence released a report highlighting practices in Kentucky's high-performing, high-poverty schools. Researchers collected information using the same audit tool that the Kentucky Department of Education uses to diagnose problems in schools identified for improvement, then compared those…

  11. GPURFSCREEN: a GPU based virtual screening tool using random forest classifier.

    PubMed

    Jayaraj, P B; Ajay, Mathias K; Nufail, M; Gopakumar, G; Jaleel, U C A

    2016-01-01

    In-silico methods are an integral part of the modern drug discovery paradigm. Virtual screening, an in-silico method, is used to refine data models and reduce the chemical space on which wet-lab experiments need to be performed. Virtual screening of a ligand data model requires large-scale computations, making it a highly time-consuming task. This process can be sped up by implementing parallelized algorithms on a Graphical Processing Unit (GPU). Random Forest is a robust classification algorithm that can be employed in virtual screening. A ligand-based virtual screening tool (GPURFSCREEN) that uses random forests on GPU systems has been proposed and evaluated in this paper. This tool produces optimized results at a lower execution time for large bioassay data sets. The quality of the results produced by our tool on the GPU is the same as that obtained in a regular serial environment. Considering the magnitude of data to be screened, the parallelized virtual screening has a significantly lower running time at high throughput. The proposed parallel tool outperforms its serial counterpart by successfully screening billions of molecules in the training and prediction phases.
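
    The classification step itself is the standard random-forest workflow; a CPU-bound sketch on synthetic fingerprint-like features is shown below. It stands in for the concept only, using scikit-learn rather than the GPU implementation described in the paper:

    ```python
    # CPU sketch of random-forest-based ligand classification on synthetic
    # fingerprint-style features (illustrative; not the GPURFSCREEN GPU code).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_ligands, n_bits = 5000, 512
    X = rng.integers(0, 2, size=(n_ligands, n_bits))        # binary fingerprints
    # Synthetic "activity" label loosely tied to a handful of bits.
    y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, n_ligands) > 3).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                        random_state=0)
    forest = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
    forest.fit(X_train, y_train)
    scores = forest.predict_proba(X_test)[:, 1]
    print("screening AUC:", round(roc_auc_score(y_test, scores), 3))
    ```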

  12. Supporting Scientific Analysis within Collaborative Problem Solving Environments

    NASA Technical Reports Server (NTRS)

    Watson, Velvin R.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    Collaborative problem solving environments for scientists should contain the analysis tools the scientists require in addition to the remote collaboration tools used for general communication. Unfortunately, most scientific analysis tools have been designed for a "stand-alone mode" and cannot be easily modified to work well in a collaborative environment. This paper addresses the questions, "What features are desired in a scientific analysis tool contained within a collaborative environment?", "What are the tool design criteria needed to provide these features?", and "What support is required from the architecture to support these design criteria?" First, the features of scientific analysis tools that are important for effective analysis in collaborative environments are listed. Next, several design criteria for developing analysis tools that will provide these features are presented. Then requirements for the architecture to support these design criteria are listed. Some proposed architectures for collaborative problem solving environments are reviewed and their capabilities to support the specified design criteria are discussed. A deficiency in the most popular architecture for remote application sharing, the ITU T.120 architecture, prevents it from supporting highly interactive, dynamic, high resolution graphics. To illustrate that the specified design criteria can provide a highly effective analysis tool within a collaborative problem solving environment, a scientific analysis tool that contains the specified design criteria has been integrated into a collaborative environment and tested for effectiveness. The tests were conducted in collaborations between remote sites in the US and between remote sites on different continents. The tests showed that the tool (a tool for the visual analysis of computer simulations of physics) was highly effective for both synchronous and asynchronous collaborative analyses. The important features provided by the tool (and made possible by the specified design criteria) are: 1. The tool provides highly interactive, dynamic, high resolution, 3D graphics. 2. All remote scientists can view the same dynamic, high resolution, 3D scenes of the analysis as the analysis is being conducted. 3. The responsiveness of the tool is nearly identical to the responsiveness of the tool in a stand-alone mode. 4. The scientists can transfer control of the analysis between themselves. 5. Any analysis session or segment of an analysis session, whether done individually or collaboratively, can be recorded and posted on the Web for other scientists or students to download and play in either a collaborative or individual mode. 6. The scientist or student who downloaded the session can, individually or collaboratively, modify or extend the session with his/her own "what if" analysis of the data and post his/her version of the analysis back onto the Web. 7. The peak network bandwidth used in the collaborative sessions is only 1K bit/second even though the scientists at all sites are viewing high resolution (1280 x 1024 pixels), dynamic, 3D scenes of the analysis. The links between the specified design criteria and these performance features are presented.

  13. Process tool monitoring and matching using interferometry technique

    NASA Astrophysics Data System (ADS)

    Anberg, Doug; Owen, David M.; Mileham, Jeffrey; Lee, Byoung-Ho; Bouche, Eric

    2016-03-01

    The semiconductor industry makes dramatic device technology changes over short time periods. As the semiconductor industry advances toward the 10 nm device node, more precise management and control of processing tools has become a significant manufacturing challenge. Some processes require multiple tool sets and some tools have multiple chambers for mass production. Tool and chamber matching has become a critical consideration for meeting today's manufacturing requirements. Additionally, process tool and chamber conditions have to be monitored to ensure uniform process performance across the tool and chamber fleet. There are many parameters for managing and monitoring tools and chambers. Particle defect monitoring is a well-known and established example where defect inspection tools can directly detect particles on the wafer surface. However, leading edge processes are driving the need to also monitor invisible defects, i.e. stress, contamination, etc., because some device failures cannot be directly correlated with traditional visualized defect maps or other known sources. Some failure maps show the same signatures as stress or contamination maps, which implies a correlation to device performance or yield. In this paper we present process tool monitoring and matching using an interferometry technique. There are many types of interferometry techniques used for various process monitoring applications. We use a Coherent Gradient Sensing (CGS) interferometer which is self-referencing and enables high throughput measurements. Using this technique, we can quickly measure the topography of an entire wafer surface and obtain stress and displacement data from the topography measurement. For improved tool and chamber matching and reduced device failure, wafer stress measurements can be implemented as a regular tool or chamber monitoring test for either unpatterned or patterned wafers, providing a good criterion for improved process stability.

  14. Screening tools to identify patients with complex health needs at risk of high use of health care services: A scoping review

    PubMed Central

    Chouinard, Maud-Christine; Diadiou, Fatoumata; Dufour, Isabelle

    2017-01-01

    Background Many people with chronic conditions have complex health needs often due to multiple chronic conditions, psychiatric comorbidities, psychosocial issues, or a combination of these factors. They are at high risk of frequent use of healthcare services. To offer these patients interventions adapted to their needs, it is crucial to be able to identify them early. Objective The aim of this study was to find all existing screening tools that identify patients with complex health needs at risk of frequent use of healthcare services, and to highlight their principal characteristics. Our purpose was to find a short, valid screening tool to identify adult patients of all ages. Methods A scoping review was performed on articles published between 1985 and July 2016, retrieved through a comprehensive search of the Scopus and CINAHL databases, following the methodological framework developed by Arksey and O’Malley (2005), and completed by Levac et al. (2010). Results Of the 3,818 articles identified, 30 were included, presenting 14 different screening tools. Seven tools were self-reported. Five targeted adult patients, and nine geriatric patients. Two tools were designed for specific populations. Four can be completed in 15 minutes or less. Most screening tools target elderly persons. The INTERMED self-assessment (IM-SA) targets adults of all ages and can be completed in less than 15 minutes. Conclusion Future research could evaluate its usefulness as a screening tool for identifying patients with complex needs at risk of becoming high users of healthcare services. PMID:29190658

  15. Assessment tools for unrecognized myocardial infarction: a cross-sectional analysis of the REasons for geographic and racial differences in stroke population

    PubMed Central

    2013-01-01

    Background Routine electrocardiograms (ECGs) are not recommended for asymptomatic patients because the potential harms are thought to outweigh any benefits. Assessment tools to identify high risk individuals may improve the harm versus benefit profile of screening ECGs. In particular, people with unrecognized myocardial infarction (UMI) have elevated risk for cardiovascular events and death. Methods Using logistic regression, we developed a basic assessment tool among 16,653 participants in the REasons for Geographic and Racial Differences in Stroke (REGARDS) study using demographics, self-reported medical history, blood pressure, and body mass index and an expanded assessment tool using information on 51 potential variables. UMI was defined as electrocardiogram evidence of myocardial infarction without a self-reported history (n = 740). Results The basic assessment tool had a c-statistic of 0.638 (95% confidence interval 0.617 - 0.659) and included age, race, smoking status, body mass index, systolic blood pressure, and self-reported history of transient ischemic attack, deep vein thrombosis, falls, diabetes, and hypertension. A predicted probability of UMI > 3% provided a sensitivity of 80% and a specificity of 30%. The expanded assessment tool had a c-statistic of 0.654 (95% confidence interval 0.634-0.674). Because of the poor performance of these assessment tools, external validation was not pursued. Conclusions Despite examining a large number of potential correlates of UMI, the assessment tools did not provide a high level of discrimination. These data suggest defining groups with high prevalence of UMI for targeted screening will be difficult. PMID:23530553
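
    The modeling described, a logistic regression whose discrimination is summarized by a c-statistic and whose predicted probabilities are thresholded at 3% to report sensitivity and specificity, can be sketched generically as follows. The data are synthetic and the covariates and coefficients are invented for illustration; this is not the REGARDS analysis:

    ```python
    # Generic sketch of building a risk tool and reading off sensitivity/specificity
    # at a probability cutoff (synthetic data; not the REGARDS analysis).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 20000
    age = rng.normal(64, 9, n)
    sbp = rng.normal(127, 16, n)
    diabetes = rng.integers(0, 2, n)
    X = np.column_stack([age, sbp, diabetes])

    # Synthetic low-prevalence outcome loosely driven by the covariates.
    logit = -8.0 + 0.05 * age + 0.01 * sbp + 0.5 * diabetes
    y = rng.random(n) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    p = model.predict_proba(X)[:, 1]
    print("c-statistic:", round(roc_auc_score(y, p), 3))

    cutoff = 0.03                                  # 3% predicted-probability threshold
    flagged = p > cutoff
    sensitivity = (flagged & y).sum() / y.sum()
    specificity = (~flagged & ~y).sum() / (~y).sum()
    print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
    ```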

  16. A practical assessment of physician biopsychosocial performance.

    PubMed

    Margalit, Alon Pa; Glick, Shimon M; Benbassat, Jochanan; Cohen, Ayala; Margolis, Carmi Z

    2007-10-01

    A biopsychosocial approach to care seems to improve patient satisfaction and health outcomes. Nevertheless, this approach is not widely practiced, possibly because its precepts have not been translated into observable skills. To identify the skill components of a biopsychosocial consultation and develop a tool for their evaluation. We approached three e-mail discussion groups of family physicians and pooled their responses to the question "what types of observed physician behavior would characterize a biopsychosocial consultation?" We received 35 responses describing 37 types of behavior, all of which seemed to cluster around one of three aspects: patient-centered interview; system-centered and family-centered approach to care; or problem-solving orientation. Using these categories, we developed a nine-item evaluation tool. We used the evaluation tool to score videotaped encounters of patients with two types of doctors: family physicians who were identified by peer ratings to have a highly biopsychosocial orientation (n = 9) or a highly biomedical approach (n = 4); and 44 general practitioners, before and after they had participated in a program that taught a biopsychosocial approach to care. The evaluation tool was found to demonstrate high reliability (alpha = 0.90) and acceptable interobserver variability. The average scores of the physicians with a highly biopsychosocial orientation were significantly higher than those of physicians with a highly biomedical approach. There were significant differences between the scores of the teaching-program participants before and after the program. A biopsychosocial approach to patient care can be characterized using a valid and easy-to-apply evaluation tool.

  17. External validation and comparison of three prediction tools for risk of osteoporotic fractures using data from population based electronic health records: retrospective cohort study

    PubMed Central

    Cohen-Stavi, Chandra; Leventer-Roberts, Maya; Balicer, Ran D

    2017-01-01

    Objective To directly compare the performance and externally validate the three most studied prediction tools for osteoporotic fractures—QFracture, FRAX, and Garvan—using data from electronic health records. Design Retrospective cohort study. Setting Payer provider healthcare organisation in Israel. Participants 1 054 815 members aged 50 to 90 years for comparison between tools and cohorts of different age ranges, corresponding to those in each tools’ development study, for tool specific external validation. Main outcome measure First diagnosis of a major osteoporotic fracture (for QFracture and FRAX tools) and hip fractures (for all three tools) recorded in electronic health records from 2010 to 2014. Observed fracture rates were compared to probabilities predicted retrospectively as of 2010. Results The observed five year hip fracture rate was 2.7% and the rate for major osteoporotic fractures was 7.7%. The areas under the receiver operating curve (AUC) for hip fracture prediction were 82.7% for QFracture, 81.5% for FRAX, and 77.8% for Garvan. For major osteoporotic fractures, AUCs were 71.2% for QFracture and 71.4% for FRAX. All the tools underestimated the fracture risk, but the average observed to predicted ratios and the calibration slopes of FRAX were closest to 1. Tool specific validation analyses yielded hip fracture prediction AUCs of 88.0% for QFracture (among those aged 30-100 years), 81.5% for FRAX (50-90 years), and 71.2% for Garvan (60-95 years). Conclusions Both QFracture and FRAX had high discriminatory power for hip fracture prediction, with QFracture performing slightly better. This performance gap was more pronounced in previous studies, likely because of broader age inclusion criteria for QFracture validations. The simpler FRAX performed almost as well as QFracture for hip fracture prediction, and may have advantages if some of the input data required for QFracture are not available. However, both tools require calibration before implementation. PMID:28104610
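
    The three validation statistics reported, discrimination (AUC), the observed-to-predicted ratio, and the calibration slope, are generic quantities computable from any set of predicted probabilities and observed outcomes. The sketch below uses synthetic predictions for illustration and is not the study's data or code:

    ```python
    # Generic external-validation metrics: AUC, observed/expected ratio, and
    # calibration slope (synthetic predictions; not the study's data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 50000
    p_pred = np.clip(rng.beta(1, 30, n), 1e-4, 1 - 1e-4)   # predicted 5-year risks
    y = rng.random(n) < np.clip(1.4 * p_pred, 0, 1)        # outcomes ~40% more frequent than predicted

    print("AUC:", round(roc_auc_score(y, p_pred), 3))
    print("observed/expected ratio:", round(y.mean() / p_pred.mean(), 2))

    # Calibration slope: coefficient of the linear predictor (logit of predicted
    # risk) in a logistic regression refit against the observed outcomes.
    linear_predictor = np.log(p_pred / (1 - p_pred)).reshape(-1, 1)
    slope = LogisticRegression(max_iter=1000).fit(linear_predictor, y).coef_[0, 0]
    print("calibration slope:", round(slope, 2))
    ```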

  18. Registration performance on EUV masks using high-resolution registration metrology

    NASA Astrophysics Data System (ADS)

    Steinert, Steffen; Solowan, Hans-Michael; Park, Jinback; Han, Hakseung; Beyer, Dirk; Scherübl, Thomas

    2016-10-01

    Next-generation lithography based on EUV continues to move forward to high-volume manufacturing. Given the technical challenges and the throughput concerns a hybrid approach with 193 nm immersion lithography is expected, at least in the initial state. Due to the increasing complexity at smaller nodes a multitude of different masks, both DUV (193 nm) and EUV (13.5 nm) reticles, will then be required in the lithography process-flow. The individual registration of each mask and the resulting overlay error are of crucial importance in order to ensure proper functionality of the chips. While registration and overlay metrology on DUV masks has been the standard for decades, this has yet to be demonstrated on EUV masks. Past generations of mask registration tools were not necessarily limited in their tool stability, but in their resolution capabilities. The scope of this work is an image placement investigation of high-end EUV masks together with a registration and resolution performance qualification. For this we employ a new generation registration metrology system embedded in a production environment for full-spec EUV masks. This paper presents excellent registration performance not only on standard overlay markers but also on more sophisticated e-beam calibration patterns.

  19. High-Fidelity Multidisciplinary Design Optimization of Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    Martins, Joaquim R. R. A.; Kenway, Gaetan K. W.; Burdette, David; Jonsson, Eirikur; Kennedy, Graeme J.

    2017-01-01

    To evaluate new airframe technologies we need design tools based on high-fidelity models that consider multidisciplinary interactions early in the design process. The overarching goal of this NRA is to develop tools that enable high-fidelity multidisciplinary design optimization of aircraft configurations, and to apply these tools to the design of high aspect ratio flexible wings. We develop a geometry engine that is capable of quickly generating conventional and unconventional aircraft configurations including the internal structure. This geometry engine features adjoint derivative computation for efficient gradient-based optimization. We also added overset capability to a computational fluid dynamics solver, complete with an adjoint implementation and semiautomatic mesh generation. We also developed an approach to constraining buffet and started the development of an approach for constraining flutter. On the applications side, we developed a new common high-fidelity model for aeroelastic studies of high aspect ratio wings. We performed optimal design trade-offs between fuel burn and aircraft weight for metal, conventional composite, and carbon nanotube composite wings. We also assessed a continuous morphing trailing edge technology applied to high aspect ratio wings. This research resulted in the publication of 26 manuscripts so far, and the developed methodologies were used in two other NRAs.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Lizhen; Yang, Ying; Tyburska-Puschel, Beata

    The mission of the Nuclear Energy Enabling Technologies (NEET) program is to develop crosscutting technologies for nuclear energy applications. Advanced structural materials with superior performance at elevated temperatures are always desired for nuclear reactors, which can improve reactor economics, safety margins, and design flexibility. They benefit not only new reactors, including advanced light water reactors (LWRs) and fast reactors such as the sodium-cooled fast reactor (SFR) that is primarily designed for management of high-level wastes, but also life extension of the existing fleet when component exchange is needed. Developing and utilizing the modern materials science tools (experimental, theoretical, and computational tools) is an important path to more efficient alloy development and process optimization. Ferritic-martensitic (FM) steels are important structural materials for nuclear reactors due to their advantages over other applicable materials like austenitic stainless steels, notably their resistance to void swelling, low thermal expansion coefficients, and higher thermal conductivity. However, traditional FM steels exhibit a noticeable yield strength reduction at elevated temperatures above ~500°C, which limits their applications in advanced nuclear reactors which target operating temperatures at 650°C or higher. Although oxide-dispersion-strengthened (ODS) ferritic steels have shown excellent high-temperature performance, their extremely high cost, limited size and fabricability of products, as well as the great difficulty with welding and joining, have limited or precluded their commercial applications. Zirconium has shown many benefits to Fe-base alloys such as grain refinement, improved phase stability, and reduced radiation-induced segregation. The ultimate goal of this project is, with the aid of computational modeling tools, to accelerate the development of a new generation of Zr-bearing ferritic alloys to be fabricated using conventional steelmaking practices, which have excellent radiation resistance and enhanced high-temperature creep performance greater than Grade 91.

  1. Nanocomposites for Machining Tools

    PubMed Central

    Loginov, Pavel; Mishnaevsky, Leon; Levashov, Evgeny

    2017-01-01

    Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials. A promising way to improve the performance characteristics of these materials is to design new nanocomposites based on them. The application of micromechanical modeling during the elaboration of composite materials for machining tools can reduce the financial and time costs for development of new tools, with enhanced performance. This article reviews the main groups of nanocomposites for machining tools and their performance. PMID:29027926

  2. Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale

    PubMed Central

    Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv

    2015-01-01

    X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner. PMID:26169570

  3. Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale

    NASA Astrophysics Data System (ADS)

    Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv

    2015-07-01

    X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner.
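
    Wave-optics simulators of this kind typically rely on angular-spectrum (Fresnel) propagation of a complex wavefield between object and detector. A minimal free-space propagation step is sketched below; the grid size, wavelength, phase object, and propagation distance are arbitrary illustrative values, not the authors' framework:

    ```python
    # Minimal angular-spectrum free-space propagation of a complex wavefield
    # (illustrative values; not the authors' XPCI simulation framework).
    import numpy as np

    n, pixel = 512, 1e-6                      # 512x512 grid, 1 micron pixels
    wavelength = 6e-11                        # ~20 keV X-rays (illustrative)
    distance = 1.0                            # propagation distance in metres

    # Object: a weakly phase-shifting disc (pure phase object).
    yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] * pixel
    disc = (xx**2 + yy**2) < (100e-6) ** 2
    field = np.exp(1j * 0.5 * disc)           # 0.5 rad phase shift inside the disc

    # Angular-spectrum transfer function H = exp(i k z sqrt(1 - (lam fx)^2 - (lam fy)^2)).
    fx = np.fft.fftfreq(n, d=pixel)
    FX, FY = np.meshgrid(fx, fx)
    k = 2 * np.pi / wavelength
    arg = 1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * k * distance * np.sqrt(np.maximum(arg, 0)))

    propagated = np.fft.ifft2(np.fft.fft2(field) * H)
    intensity = np.abs(propagated) ** 2       # edge enhancement appears at the disc boundary
    print("intensity range:", intensity.min().round(3), "-", intensity.max().round(3))
    ```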

  4. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analyses rapidly. Many of today's early-career engineers are very proficient in the use of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  5. Validation of the self-assessment teamwork tool (SATT) in a cohort of nursing and medical students.

    PubMed

    Roper, Lucinda; Shulruf, Boaz; Jorm, Christine; Currie, Jane; Gordon, Christopher J

    2018-02-09

    Poor teamwork has been implicated in medical error and teamwork training has been shown to improve patient care. Simulation is an effective educational method for teamwork training. Post-simulation reflection aims to promote learning and we have previously developed a self-assessment teamwork tool (SATT) for health students to measure teamwork performance. This study aimed to evaluate the psychometric properties of a revised self-assessment teamwork tool. The tool was tested in 257 medical and nursing students after their participation in one of several mass casualty simulations. Using exploratory and confirmatory factor analysis, the revised self-assessment teamwork tool was shown to have strong construct validity, high reliability, and the construct demonstrated invariance across groups (Medicine & Nursing). The modified SATT was shown to be a reliable and valid student self-assessment tool. The SATT is a quick and practical method of guiding students' reflection on important teamwork skills.

  6. Performance evaluation of MPEG internet video coding

    NASA Astrophysics Data System (ADS)

    Luo, Jiajia; Wang, Ronggang; Fan, Kui; Wang, Zhenyu; Li, Ge; Wang, Wenmin

    2016-09-01

    Internet Video Coding (IVC) has been developed in MPEG by combining well-known existing technology elements and new coding tools with royalty-free declarations. In June 2015, IVC project was approved as ISO/IEC 14496-33 (MPEG- 4 Internet Video Coding). It is believed that this standard can be highly beneficial for video services in the Internet domain. This paper evaluates the objective and subjective performances of IVC by comparing it against Web Video Coding (WVC), Video Coding for Browsers (VCB) and AVC High Profile. Experimental results show that IVC's compression performance is approximately equal to that of the AVC High Profile for typical operational settings, both for streaming and low-delay applications, and is better than WVC and VCB.

  7. Scalability Analysis of Gleipnir: A Memory Tracing and Profiling Tool, on Titan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janjusic, Tommy; Kartsaklis, Christos; Wang, Dali

    2013-01-01

    Application performance is hindered by a variety of factors, but most notably by the well-known CPU-memory speed gap (also known as the memory wall). Understanding an application's memory behavior is key when trying to optimize performance. Understanding application performance properties is facilitated by various performance profiling tools. The scope of profiling tools varies in complexity, ease of deployment, profiling performance, and the detail of profiled information. Specifically, using profiling tools for performance analysis is a common task when optimizing and understanding scientific applications on complex and large-scale systems such as Cray's XK7. This paper describes the performance characteristics of using Gleipnir, a memory tracing tool, on the Titan Cray XK7 system when instrumenting large applications such as the Community Earth System Model. Gleipnir is a memory tracing tool built as a plug-in for the Valgrind instrumentation framework. The goal of Gleipnir is to provide fine-grained trace information. The generated traces are a stream of executed memory transactions mapped to internal structures per process, thread, function, and finally the data structure or variable. Our focus was to expose tool performance characteristics when using Gleipnir in combination with external tools such as a cache simulator, Gl CSim, to characterize the tool's overall performance. In this paper we describe our experience with deploying Gleipnir on the Titan Cray XK7 system, report on the tool's ease of use, and analyze run-time performance characteristics under various workloads. While all performance aspects are important, we mainly focus on I/O characteristics analysis due to the emphasis on the tool's output, which consists of trace files. Moreover, the tool is dependent on the run-time system to provide the necessary infrastructure to expose low-level system detail; therefore, we also discuss theoretical benefits that could be achieved if such modules were present.

  8. Cold machining of high density tungsten and other materials

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1969-01-01

    Cold machining process, which uses a sub-zero refrigerated cutting fluid, is used for machining refractory or reactive metals and alloys. Special carbide tools for turning and drilling these alloys further improve the cutting performance.

  9. Automation Bias: Decision Making and Performance in High-Tech Cockpits

    NASA Technical Reports Server (NTRS)

    Mosier, Kathleen L.; Skitka, Linda J.; Heers, Susan; Burdick, Mark; Rosekind, Mark R. (Technical Monitor)

    1997-01-01

    Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits, and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate "automation bias," a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation "events," or opportunities for automation-related omission and commission errors. Pilots who perceived themselves as "accountable" for their performance and strategies of interaction with the automation were more likely to double-check automated functioning against other cues, and less likely to commit errors. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.

  10. ScanRanker: Quality Assessment of Tandem Mass Spectra via Sequence Tagging

    PubMed Central

    Ma, Ze-Qiang; Chambers, Matthew C.; Ham, Amy-Joan L.; Cheek, Kristin L.; Whitwell, Corbin W.; Aerni, Hans-Rudolf; Schilling, Birgit; Miller, Aaron W.; Caprioli, Richard M.; Tabb, David L.

    2011-01-01

    In shotgun proteomics, protein identification by tandem mass spectrometry relies on bioinformatics tools. Despite recent improvements in identification algorithms, a significant number of high quality spectra remain unidentified for various reasons. Here we present ScanRanker, an open-source tool that evaluates the quality of tandem mass spectra via sequence tagging with reliable performance in data from different instruments. The superior performance of ScanRanker enables it not only to find unassigned high quality spectra that evade identification through database search, but also to select spectra for de novo sequencing and cross-linking analysis. In addition, we demonstrate that the distribution of ScanRanker scores predicts the richness of identifiable spectra among multiple LC-MS/MS runs in an experiment, and ScanRanker scores assist the process of peptide assignment validation to increase confident spectrum identifications. The source code and executable versions of ScanRanker are available from http://fenchurch.mc.vanderbilt.edu. PMID:21520941

  11. Carbon and metal-carbon implantations into tool steels for improved tribological performance

    NASA Astrophysics Data System (ADS)

    Hirvonen, J.-P.; Harskamp, F.; Torri, P.; Willers, H.; Fusari, A.; Gibson, N.; Haupt, J.

    1997-05-01

    The high-fluence implantation of carbon and dual implantations of metal-metalloid pairs into steels with different microstructures are briefly reviewed. A previously unexamined system, the implantation of Si and C into two kinds of tool steels, M3 and D2, has been studied in terms of microstructure and tribological performance. In both cases ion implantation transforms the surface into an amorphous layer. However, the tribological behavior of these two materials differs remarkably: in the case of ion-implanted M3 a reduction of wear in a steel pin is observed even at high pin loads, whereas in the case of ion-implanted D2 the beneficial effects of ion implantation were limited to the lowest pin load. The importance of an initial phase at the onset of sliding is emphasized and a number of peculiarities observed in ion-implanted M3 steel are discussed.

  12. Nuclear Tools For Oilfield Logging-While-Drilling Applications

    NASA Astrophysics Data System (ADS)

    Reijonen, Jani

    2011-06-01

    Schlumberger is an international oilfield service company with nearly 80,000 employees of 140 nationalities, operating globally in 80 countries. As a market leader in oilfield services, Schlumberger has developed a suite of technologies to assess the downhole environment, including, among others, electromagnetic, seismic, chemical, and nuclear measurements. In the past 10 years there has been a radical shift in the oilfield service industry from traditional wireline measurements to logging-while-drilling (LWD) analysis. For LWD measurements, the analysis is performed and the instruments are operated while the borehole is being drilled. The high temperature, high shock, and extreme vibration environment of LWD imposes stringent requirements for the devices used in these applications. This has a significant impact on the design of the components and subcomponents of a downhole tool. Another significant change in the past few years for nuclear-based oilwell logging tools is the desire to replace the sealed radioisotope sources with active, electronic ones. These active radiation sources provide great benefits compared to the isotopic sources, ranging from handling and safety to nonproliferation and well contamination issues. The challenge is to develop electronic generators that have a high degree of reliability for the entire lifetime of a downhole tool. LWD tool testing and operations are highlighted with particular emphasis on electronic radiation sources and nuclear detectors for the downhole environment.

  13. The NASA high power carbon dioxide laser: A versatile tool for laser applications

    NASA Technical Reports Server (NTRS)

    Lancashire, R. B.; Alger, D. L.; Manista, E. J.; Slaby, J. G.; Dunning, J. W.; Stubbs, R. M.

    1976-01-01

    A closed-cycle, continuous wave, carbon dioxide high power laser has been designed and fabricated to support research for the identification and evaluation of possible high power laser applications. The device is designed to generate up to 70 kW of laser power in annular-shaped beams from 1 to 9 cm in diameter. Electric discharge, either self-sustained or electron-beam sustained, is used for excitation. This laser facility provides a versatile tool on which research can be performed to advance the state-of-the-art technology of high power CO2 lasers in such areas as electric excitation, laser chemistry, and quality of output beams. The facility provides a well defined, continuous wave beam for various application experiments, such as propulsion, power conversion, and materials processing.

  14. U.S. Team Green Building Challenge 2002

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2002-09-01

    Flier about the U.S. Team and its projects participating in the International Green Building Challenge. Along with many other countries, the United States accepted the Green Building Challenge (GBC), an international effort to evaluate and improve the performance of buildings worldwide. GBC started out in 1996 as a competition to determine which country had the greenest buildings; it evolved into a cooperative process among the countries to measure the performance of green buildings. Although the auto industry can easily measure efficiency in terms of miles per gallon, the buildings industry has no standard way to quantify energy and environmental performance. The Green Building Challenge participants hope that better tools for measuring the energy and environmental performance of buildings will be an outcome of their efforts and that these tools will lead to higher and better performance levels in buildings around the world. The ultimate goal is to design, construct, and operate buildings that contribute to global sustainability by conserving and/or regenerating natural resources and minimizing nonrenewable energy use. The United States' Green Building Challenge Team '02 selected five buildings from around the country to serve as case studies; each of the five U.S. building designs (as well as all international case studies) was assessed using an in-depth evaluation tool, called the Green Building Assessment Tool (GBTool). The GBTool was specifically created and refined by international teams for the GBC efforts. The goal of this collaborative effort is to improve this evaluation software tool so that it can be used globally, while taking into account regional and national conditions. The GBTool was used by the U.S. Team to assess and evaluate the energy and environmental performance of these five buildings: (1) Retail (in operation): BigHorn Home Improvement Center, Silverthorne, Colorado; (2) Office (in operation), Philip Merrill Environmental; (3) School (in construction), Clearview Elementary School, Hanover, Pennsylvania; (4) Multi-family residential (in construction), Twenty River Terrace, Battery Park City, New York; and (5) Office/lab (in design), National Oceanic and Atmospheric Administration, Honolulu, Hawaii. These projects were selected not only because they were good examples of high-performance buildings and had interested owners/design team members, but also because building data was available as inputs to test the software tool. Both the tool and the process have been repeatedly refined and enhanced since the first Green Building Challenge event in 1998; participating countries are continuously providing feedback to further improve the tool and global process for the greatest positive effect.

  15. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    NASA Astrophysics Data System (ADS)

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate, and fix design problems. This work presents Uranus, a software tool for the simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded software operators and provides the necessary support to read and display image sequences as well as video files. The user can employ the previously compiled soft-operators in a high-level processing chain and code his or her own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected to a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  16. Influence of Wake Models on Calculated Tiltrotor Aerodynamics

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne

    2001-01-01

    The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.

  17. Methods for Evaluating the Performance and Human Stress-Factors of Percussive Riveting

    NASA Astrophysics Data System (ADS)

    Ahn, Jonathan Y.

    The aerospace industry automates portions of its manufacturing and assembly processes. However, mechanics remain vital to production, especially in areas where automated machines cannot fit or have yet to match the quality of human craftsmanship. One such task is percussive riveting. Because percussive riveting is associated with a high risk of injury, these tools must be certified prior to release. The major contribution of this thesis is the development of a test bench capable of percussive riveting for ergonomic evaluation purposes. The major issues investigated are: (i) automating the tool evaluation method so that it is repeatable; (ii) demonstrating the use of displacement and force sensors; and (iii) correlating the performance and risk exposure of percussive tools. A test bench equipped with servomotors and pneumatic cylinders to control the xyz-position of a rivet gun and bucking bar simultaneously is used to explore this evaluation approach.

  18. Status of Low Thrust Work at JSC

    NASA Technical Reports Server (NTRS)

    Condon, Gerald L.

    2004-01-01

    High performance low thrust (solar electric, nuclear electric, variable specific impulse magnetoplasma rocket) propulsion offers a significant benefit to NASA missions beyond low Earth orbit. As NASA (e.g., Prometheus Project) endeavors to develop these propulsion systems and associated power supplies, it becomes necessary to develop a refined trajectory design capability that will allow engineers to develop future robotic and human mission designs that take advantage of this new technology. This ongoing work addresses the development of a trajectory design and optimization tool for assessing low-thrust (and other) trajectories. This work aims to advance the state of the art, enable future NASA missions, enable science drivers, and enhance education. This presentation provides a summary of the low-thrust-related JSC activities under the ISP program and, specifically, provides a look at a new release of a multi-gravity, multi-spacecraft trajectory optimization tool (Copernicus) along with analysis performed using this tool over the past year.

  19. Clinical peer review program self-evaluation for US hospitals.

    PubMed

    Edwards, Marc T

    2010-01-01

    Prior research has shown wide variation in clinical peer review program structure, process, governance, and perceived effectiveness. This study sought to validate the utility of a Peer Review Program Self-Evaluation Tool as a potential guide to physician and hospital leaders seeking greater program value. Data from 330 hospitals show that the total score from the self-evaluation tool is strongly associated with perceived quality impact. Organizational culture also plays a significant role. When controlling for these factors, there was no evidence of benefit from a multispecialty review process. Physicians do not generally use reliable methods to measure clinical performance. A high rate of change since 2007 has not produced much improvement. The Peer Review Program Self-Evaluation Tool reliably differentiates hospitals along a continuum of perceived program performance. The full potential of peer review as a process to improve the quality and safety of care has yet to be realized.

  20. Investigating bone chip formation in craniotomy.

    PubMed

    Huiyu, He; Chengyong, Wang; Yue, Zhang; Yanbin, Zheng; Linlin, Xu; Guoneng, Xie; Danna, Zhao; Bin, Chen; Haoan, Chen

    2017-10-01

    In a craniotomy, the milling cutter is one of the most important cutting tools. The operating performance, tool durability and cutting damage to patients are influenced by the tool's sharpness, intensity and structure, whereas the cutting characteristics rely on interactions between the tool and the skull. In this study, an orthogonal cutting experiment during a craniotomy of fresh pig skulls was performed to investigate chip formation on the side cutting and face cutting of the skull using a high-speed camera. The cutting forces with different combinations of cutting parameters, such as the rake angle, clearance angle, depth of cut and cutting speed, were measured. The skull bone microstructure and cutting damage were observed by scanning electron microscopy. Cutting models for different cutting approaches and various depths of cut were constructed and analyzed. The study demonstrated that the effects of shearing, tension and extrusion occur during chip formation. Various chip types, such as unit chips, splintering chips and continuous chips, were generated. Continuous pieces of chips, which are desirable for easy removal from the field of operation, were formed at greater depths of cut and tool rake angles greater than 10°. Cutting damage could be relieved, with a faster recovery, with clearance angles greater than 20°.

  1. Demonstration of Cost-Effective, High-Performance Computing at Performance and Reliability Levels Equivalent to a 1994 Vector Supercomputer

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2000-01-01

    The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU- and memory-intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector supercomputer at less than 25 percent of the C90 cost. In fact, the cost comparison between a Cray C90 supercomputer and Sun workstations showed that the number of distributed networked workstations equivalent to a C90 costs approximately 8 percent of the C90.

  2. External Validation of a Decision Tool To Guide Post-Operative Management of Patients with Secondary Peritonitis.

    PubMed

    Atema, Jasper J; Ram, Kim; Schultz, Marcus J; Boermeester, Marja A

    Timely identification of patients in need of an intervention for abdominal sepsis after initial surgical management of secondary peritonitis is vital but complex. The aim of this study was to validate a decision tool for this purpose and to evaluate its potential to guide post-operative management. A prospective cohort study was conducted on consecutive adult patients undergoing surgery for secondary peritonitis in a single hospital. Assessments using the decision tool, based on one intra-operative and five post-operative variables, were performed on the second and third post-operative days and when the patients' clinical status deteriorated. Scores were compared with the clinical reference standard of persistent sepsis based on the clinical course or findings at imaging or surgery. Additionally, the potential of the decision tool to guide management in terms of diagnostic imaging in three previously defined score categories (low, intermediate, and high) was evaluated. A total of 161 assessments were performed in 69 patients. The majority of cases of secondary peritonitis (68%) were caused by perforation of the gastrointestinal tract. Post-operative persistent sepsis occurred in 28 patients. The discriminative capacity of the decision tool score was fair (area under the curve of the receiver operating characteristic = 0.79). The incidence rate differed significantly between the three score categories (p < 0.001). The negative predictive value of a decision tool score categorized as low probability was 89% (95% confidence interval [CI] 82-94) and 65% (95% CI 47-79) for an intermediate score. Diagnostic imaging was performed more frequently when there was an intermediate score than when the score was categorized as low (46% vs. 24%; p < 0.001). In patients operated on for secondary peritonitis, the decision tool score predicts with fair accuracy whether persistent sepsis is present.
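
    The reported negative predictive values follow directly from the counts of persistent-sepsis cases within each score category. As a minimal sketch (the abstract does not give the decision tool's variables, weights, or score thresholds, so the category labels are simply taken at face value), a category-wise NPV could be computed like this:

```python
def negative_predictive_value(assessments, category):
    """Compute NPV for one decision-tool score category.

    `assessments` is a list of (category, persistent_sepsis) pairs, where
    `persistent_sepsis` is True/False per the clinical reference standard.
    Category labels ('low', 'intermediate', 'high') follow the abstract;
    how raw scores map to categories is not specified there.
    """
    in_cat = [sepsis for cat, sepsis in assessments if cat == category]
    if not in_cat:
        return None
    true_negatives = sum(1 for sepsis in in_cat if not sepsis)
    return true_negatives / len(in_cat)

# Toy data (not the study's): 100 'low' assessments, 11 with persistent sepsis.
toy = [("low", False)] * 89 + [("low", True)] * 11 + [("intermediate", True)] * 5
print(negative_predictive_value(toy, "low"))  # 0.89
```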

  3. Putting Kids on the Pathway to College: How Is Your School Doing? The College Pathways Tools

    ERIC Educational Resources Information Center

    Annenberg Institute for School Reform at Brown University (NJ1), 2010

    2010-01-01

    The College Pathways series grew out of the findings in "Beating the Odds," a study of thirteen high-performing New York City high schools by Carol Ascher and Cindy Maguire for the Annenberg Institute for School Reform. Each of the schools admitted ninth-graders with high poverty rates and far-below-average reading and math scores but produced…

  4. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  5. A survey on annotation tools for the biomedical literature.

    PubMed

    Neves, Mariana; Leser, Ulf

    2014-03-01

    New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms, and also for better understanding, by means of examples, the information being sought. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation has led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts and should generate an easy-to-parse output format. Today, a range of tools which implement some of these functionalities are available. Here, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experiences whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfactory manner, but also that no tool can be considered a true comprehensive solution.

  6. Nested Interrupt Analysis of Low Cost and High Performance Embedded Systems Using GSPN Framework

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    Interrupt service routines are a key technology for embedded systems. In this paper, we introduce the standard approach of using Generalized Stochastic Petri Nets (GSPNs) as a high-level model for generating Continuous-Time Markov Chains (CTMCs) and then use Markov Reward Models (MRMs) to compute performance measures for embedded systems. This framework is employed to analyze two low-cost, high-performance embedded controllers, ARM7 and Cortex-M3. Cortex-M3 is designed with a tail-chaining mechanism to improve on the performance of ARM7 when nested interrupts occur on an embedded controller. The Platform Independent Petri net Editor 2 (PIPE2) tool is used to model and evaluate the controllers in terms of power consumption and interrupt overhead. The numerical results show that, in terms of both power consumption and interrupt overhead, Cortex-M3 performs better than ARM7.
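
    The GSPN workflow in this record ultimately reduces to solving the underlying CTMC for its steady-state distribution and weighting each state with a reward (e.g., power draw, or time spent servicing interrupts). A minimal, self-contained sketch of that final step is shown below; the two-state generator matrix is an invented toy model, not the ARM7/Cortex-M3 model from the paper.

```python
import numpy as np

def steady_state_reward(Q, rewards):
    """Expected reward of a CTMC in steady state.

    Q: generator matrix (rows sum to zero); rewards: reward per state.
    Solves pi Q = 0 with sum(pi) = 1, then returns sum(pi_i * r_i).
    """
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])   # append the normalization constraint
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(pi @ rewards)

# Toy model: state 0 = executing a task, state 1 = servicing an interrupt.
Q = np.array([[-2.0,   2.0],    # interrupts arrive at rate 2/s
              [50.0, -50.0]])   # interrupt service completes at rate 50/s
rewards = np.array([0.0, 1.0])  # reward 1 while handling an interrupt
print(steady_state_reward(Q, rewards))  # fraction of time spent in ISRs
```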

  7. Left centro-parieto-temporal response to tool-gesture incongruity: an ERP study.

    PubMed

    Chang, Yi-Tzu; Chen, Hsiang-Yu; Huang, Yuan-Chieh; Shih, Wan-Yu; Chan, Hsiao-Lung; Wu, Ping-Yi; Meng, Ling-Fu; Chen, Chen-Chi; Wang, Ching-I

    2018-03-13

    Action semantics have been investigated in relation to context violation but remain less examined in relation to the meaning of gestures. In the present study, we examined tool-gesture incongruity using event-related potentials (ERPs) and hypothesized that the component N400, a neural index which has been widely used for both linguistic and action semantic congruence, is significant for conditions of incongruence. Twenty participants performed a tool-gesture judgment task, in which they were asked to judge whether the tool-gesture pairs were correct or incorrect for conveying the functional expression of the tools. Online electroencephalograms and behavioral performance (accuracy rate and reaction time) were recorded. The ERP analysis showed a left centro-parieto-temporal N300 effect (220-360 ms) for the correct condition. However, the expected N400 (400-550 ms) could not be differentiated between correct/incorrect conditions. After 700 ms, a prominent late negative complex for the correct condition was also found in the left centro-parieto-temporal area. The neurophysiological findings indicated that the left centro-parieto-temporal area is the predominant region contributing to neural processing of tool-gesture incongruity in right-handers. The temporal dynamics of tool-gesture incongruity are: (1) processing is first enhanced for recognizable tool-gesture usage patterns, and (2) a secondary reanalysis is then required for further examination of the highly complicated visual structures of gestures and tools. The evidence from the tool-gesture incongruity indicated altered brain activities attributable to the N400 in relation to lexical and action semantics. The online interaction between gesture and tool processing provided minimal context violation or anticipation effect, which may explain the missing N400.

  8. Impact of tool wear on cross wedge rolling process stability and on product quality

    NASA Astrophysics Data System (ADS)

    Gutierrez, Catalina; Langlois, Laurent; Baudouin, Cyrille; Bigot, Régis; Fremeaux, Eric

    2017-10-01

    Cross wedge rolling (CWR) is a metal forming process used in the automotive industry. One of its applications is in the manufacturing of connecting rods. CWR transforms a cylindrical billet into a complex axisymmetric shape with an accurate distribution of material. This preform is then forged into shape in a forging die. To improve CWR tool lifecycle and product quality, it is essential to understand tool wear evolution and the physical phenomena that change in the CWR process as the tool geometry evolves with wear. Numerical simulations are necessary to understand CWR tool wear behavior; nevertheless, if the simulations are performed with the CAD geometry of the tool, the results are limited. To address this difficulty, two numerical simulations with FORGE® were performed using the real geometry of the tools (both upper and lower rolls) at two different states: (1) before the start of the lifecycle and (2) at the end of the lifecycle. The tools were measured in 3D with the ATOS Triple Scan by GOM® using optical 3D measuring techniques. The result was a high-resolution point cloud of the entire geometry of each tool. Each 3D point cloud was digitized and converted into STL format, which served as input for the 3D simulations. The two simulations were compared, and the product defects obtained in simulation were compared to the main defects of products found industrially. The two main defects are: (a) surface defects on the preform that are not corrected in the die forging operation; and (b) a bent (no longer straight) preform, with two possible impacts: the robot cannot grab it to take it to the forging stage, or an unfilled section results in the forging operation.

  9. High volume nanoscale roll-based imprinting using jet and flash imprint lithography

    NASA Astrophysics Data System (ADS)

    Ahn, Se Hyun; Miller, Mike; Yang, Shuqiang; Ganapathisubramanian, Maha; Menezes, Marlon; Singh, Vik; Choi, Jin; Xu, Frank; LaBrake, Dwayne; Resnick, Douglas J.; Sreenivasan, S. V.

    2013-09-01

    Extremely large-area roll-to-roll (R2R) manufacturing on flexible substrates is ubiquitous for applications such as paper and plastic processing. It combines the benefits of high speed and inexpensive substrates to deliver a commodity product at low cost. The challenge is to extend this approach to the realm of nanopatterning and realize similar benefits. In order to achieve low-cost nanopatterning, it is imperative to move toward high-speed imprinting, less complex tools, near zero waste of consumables, and low-cost substrates. We have developed a roll-based J-FIL process and applied it to a technology demonstrator tool, the LithoFlex 100, to fabricate large-area flexible bilayer wire-grid polarizers (WGPs) and high-performance WGPs on rigid glass substrates. Extinction ratios of better than 10,000 are obtained for the glass-based WGPs. Two simulation packages are also employed to understand the effects of pitch, aluminum thickness, and pattern defectivity on the optical performance of the WGP devices. It is determined that the WGPs can be influenced by both clear and opaque defects in the gratings; however, the defect densities are relaxed relative to the requirements of a high-density semiconductor device.

  10. A Catalog of Performance Objectives, Performance Conditions, and Performance Guides for Machine Tool Operations.

    ERIC Educational Resources Information Center

    Stadt, Ronald; And Others

    This catalog provides performance objectives, tasks, standards, and performance guides associated with current occupational information relating to the job content of machinists, specifically tool grinder operators, production lathe operators, and production screw machine operators. The catalog is comprised of 262 performance objectives, tool and…

  11. [Assessment of a supervision grid being used in the laboratories of cutaneous leishmaniasis in Morocco].

    PubMed

    El Mansouri, Bouchra; Amarir, Fatima; Hajli, Yamina; Fellah, Hajiba; Sebti, Faiza; Delouane, Bouchra; Sadak, Abderrahim; Adlaoui, El Bachir; Rhajaoui, Mohammed

    2017-01-01

    The aim of our study was to assess a standardized supervisory grid as a new supervision tool for laboratories performing leishmaniasis diagnosis. We conducted a pilot trial to evaluate the ongoing performance of seven provincial laboratories, in four provinces in Morocco, over a period of two years, between 2006 and 2014. This study details the situation in the provincial laboratories before and after the implementation of the supervisory grid. A total of twenty-one grids were analyzed. In 2006, the results clearly showed poor laboratory performance: need for training (41.6%), staff performing skin biopsy (25%), shortage of materials and reagents (65%), and non-compliant documentation and local management (85%). Several corrective actions were carried out by the National Reference Laboratory (LNRL) of Leishmaniasis during the study period. In 2014, the LNRL recorded a clear improvement in the laboratories' performance. The need for training, the quality of the biopsy, and the supply of tools and reagents were addressed, and effective coordination was established between the LNRL and the provincial laboratories. This trial shows the effectiveness of the grid as a high-quality supervisory tool and as a cornerstone for making progress in leishmaniasis control programs.

  12. Evaluation of the King-Devick test as a concussion screening tool in high school football players.

    PubMed

    Seidman, Daniel H; Burlingame, Jennifer; Yousif, Lina R; Donahue, Xinh P; Krier, Joshua; Rayes, Lydia J; Young, Rachel; Lilla, Muareen; Mazurek, Rochelle; Hittle, Kristie; McCloskey, Charles; Misra, Saroj; Shaw, Michael K

    2015-09-15

    Concussion is the most common type of traumatic brain injury, and results from impact or impulsive forces to the head, neck or face. Due to the variability and subtlety of symptoms, concussions may go unrecognized or be ignored, especially with the pressure placed on athletes to return to competition. The King-Devick (KD) test, an oculomotor test originally designed for reading evaluation, was recently validated as a concussion screening tool in collegiate athletes. A prospective study was performed using high school football players in an attempt to study the KD as a concussion screening tool in this younger population. 343 athletes from four local high school football teams were recruited to participate. These athletes were given baseline KD tests prior to competition. Individual demographic information was collected on the subjects. Standard team protocol was employed to determine if a concussion had occurred during competition. Immediately after diagnosis, the KD test was re-administered to the concussed athlete for comparison to baseline. Post-season testing was also performed in non-concussed individuals. Of the 343 athletes, nine were diagnosed with concussions. In all concussed players, cumulative read times for the KD test were significantly increased (p<0.001). Post-season testing of non-concussed athletes revealed minimal change in read times relative to baseline. Univariate analysis revealed that history of concussion was the only demographic factor predictive of concussion in this cohort. The KD test is an accurate and easily administered sideline screening tool for concussion in adolescent football players. Copyright © 2015 Elsevier B.V. All rights reserved.
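
    The core comparison in this record is within-subject: each concussed athlete's post-injury cumulative read time against their own baseline. A minimal sketch of such a paired comparison is shown below; the numbers are invented, and whether the original study used a paired t-test or a nonparametric equivalent is not stated in the abstract.

```python
from scipy import stats

# Invented cumulative King-Devick read times in seconds (not study data).
baseline    = [41.2, 38.7, 45.0, 39.9, 43.5, 40.1, 44.8, 37.6, 42.3]
post_injury = [47.9, 44.1, 52.3, 45.0, 49.8, 46.2, 50.7, 43.8, 48.9]

# Paired test: are post-injury read times longer than baseline?
result = stats.ttest_rel(post_injury, baseline)
one_sided_p = result.pvalue / 2 if result.statistic > 0 else 1 - result.pvalue / 2
print(f"t = {result.statistic:.2f}, one-sided p = {one_sided_p:.4g}")
```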

  13. Metal Matrix Composite LOX Turbopump Housing via Novel Tool-less Net-Shape Pressure Infiltration Casting Technology

    NASA Technical Reports Server (NTRS)

    Shah, Sandeep; Lee, Jonathan; Bhat, Biliyar; Wells, Doug; Gregg, Wayne; Marsh, Matthew; Genge, Gary; Forbes, John; Salvi, Alex; Cornie, James A.

    2003-01-01

    Metal matrix composites for propulsion components offer high performance and affordability, resulting in low weight and cost. The following sections in this viewgraph presentation describe the pressure infiltration casting of a metal matrix composite LOX turbopump housing: 1) Baseline Pump Design and Stress Analysis; 2) Tool-less Advanced Pressure Infiltration Casting Process; 3) Preform Splicing and Joining for Large Components such as Pump Housing; 4) Fullscale Pump Housing Redesign.

  14. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderer, Antoni; Yang, Xiaolei; Angelidis, Dionysios

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  15. Development of a Systems Engineering Competency Model Tool for the Aviation and Missile Research, Development, And Engineering Center (AMRDEC)

    DTIC Science & Technology

    2017-06-01

    The Naval Postgraduate School has developed a competency model for the systems engineering profession and is implementing a tool to support high-stakes human resource functions for the U.S. Army. A systems engineering career competency model (SECCM), recently developed by the Navy and verified by the Office of Personnel Management (OPM), defines the critical competencies for successful performance as a systems engineer at each general schedule

  16. High Frequency Scattering Code in a Distributed Processing Environment

    DTIC Science & Technology

    1991-06-01

    The use of automated analysis tools is indicated. One tool developed by Pacific-Sierra Research Corporation and marketed by Intel Corporation for ... XQ: EXECUTE CODE; EN: END CODE. This input deck differs from that in the manual because the "PP" option is disabled in the modified code.

  17. 3D detectors with high space and time resolution

    NASA Astrophysics Data System (ADS)

    Loi, A.

    2018-01-01

    For future high-luminosity LHC experiments it will be important to develop new detector systems with increased space and time resolution, as well as better radiation hardness, in order to operate in a high-luminosity environment. 3D silicon detectors are a possible technology that could deliver such performance. This work explores possible pixel geometries by designing and simulating different solutions, using Sentaurus Technology Computer Aided Design (TCAD) as the design and simulation tool, and analysing their performance. A key factor during the selection was the generated electric field and the carrier velocity inside the active area of the pixel.

  18. Patient-specific simulation in carotid artery stenting.

    PubMed

    Willaert, Willem; Aggarwal, Rajesh; Bicknell, Colin; Hamady, Mo; Darzi, Ara; Vermassen, Frank; Cheshire, Nicholas

    2010-12-01

    Patient-specific virtual reality (VR) simulation is a technologic advancement that allows planning and practice of the carotid artery stenting (CAS) procedure before it is performed on the patient. The initial findings are reported, using this novel VR technique as a tool to optimize technical and nontechnical aspects of this complex endovascular procedure. In the angiography suite, the same interventional team performed the VR rehearsal and the actual CAS on the patient. All proceedings were recorded to allow for video analysis of team, technical, and nontechnical skills. Analysis of both procedures showed identical use of endovascular tools, similar access strategy, and a high degree of similarity between the angiography images. The total procedure time (24.04 vs 60.44 minutes), fluoroscopy time (11.19 vs 21.04 minutes), and cannulation of the common carotid artery (1.35 vs 9.34) took considerably longer in reality. An extensive questionnaire revealed that all team members found that the rehearsal increased the subjective sense of teamwork (4/5), communication (4/5), and patient safety (4/5). A VR procedure rehearsal is a practical and feasible preparatory tool for CAS and shows a high correlation with the real procedure. It has the potential to enhance the technical, nontechnical, and team performance. Further research is needed to evaluate if this technology can lead to improved outcomes for patients. Copyright © 2010 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.

  19. Use of third-party aircraft performance tools in the development of the Aviation Environmental Design Tool (AEDT).

    DOT National Transportation Integrated Search

    2011-07-01

    This report documents work done to enhance terminal area aircraft performance modeling in the Federal : Aviation Administration's Aviation Environmental Design Tool. A commercially available aircraft : performance software tool was used to develop da...

  20. The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.

    PubMed

    Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker

    2016-01-01

    The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.

  1. Strategy Guideline: Quality Management in Existing Homes; Cantilever Floor Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taggart, J.; Sikora, J.; Wiehagen, J.

    2011-12-01

    This guideline is designed to highlight the QA process that can be applied to any residential building retrofit activity. The cantilevered floor retrofit detailed in this guideline is included only to provide an actual retrofit example to better illustrate the QA activities being presented. The goal of an existing-home high-performance remodeling quality management system (HPR-QMS) is to establish practices and processes that can be used throughout any remodeling project. The research presented in this document provides a comparison of a selected retrofit activity as typically done versus that same retrofit activity approached from an integrated high performance remodeling and quality management perspective. It highlights some key quality management tools and approaches that can be adopted incrementally by a high performance remodeler for this or any high performance retrofit. This example is intended as a template and establishes a methodology that can be used to develop a portfolio of high performance remodeling strategies.

  2. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    PubMed

    Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
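
    JMS sits between a web front-end and the cluster's resource manager, translating workflow stages into batch jobs. The abstract does not expose JMS's own API, so the sketch below only illustrates the general pattern of wrapping a batch scheduler from Python, under the assumption that a Torque/PBS-style qsub command is available on the cluster; it is not the JMS interface itself.

```python
import subprocess
import tempfile

def submit_batch_job(script_body, walltime="01:00:00", cores=4):
    """Submit a shell script to a Torque/PBS-style scheduler via qsub.

    This is a generic illustration of the resource-manager layer that a
    system like JMS automates; it is not the JMS API.
    """
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as fh:
        fh.write("#!/bin/bash\n")
        fh.write(f"#PBS -l walltime={walltime},nodes=1:ppn={cores}\n")
        fh.write(script_body + "\n")
        script_path = fh.name
    result = subprocess.run(["qsub", script_path],
                            capture_output=True, text=True, check=True)
    return result.stdout.strip()  # job identifier reported by the scheduler

# Example usage (requires a live scheduler, so it is left commented out):
# job_id = submit_batch_job("python run_stage.py --input data.fasta")
```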

  3. Can All Doctors Be Like This? Seven Stories of Communication Transformation Told by Physicians Rated Highest by Patients.

    PubMed

    Janisse, Tom; Tallman, Karen

    2017-01-01

    The top predictors of patient satisfaction with clinical visits are the quality of the physician-patient relationship and the communications contributing to their relationship. How do physicians improve their communication, and what effect does it have on them? This article presents the verbatim stories of seven high-performing physicians describing their transformative change in the areas of communication, connection, and well-being. Data for this study are based on interviews from a previous study in which a 6-question set was posed, in semistructured 60-minute interviews, to 77 of the highest-performing Permanente Medical Group physicians in 4 Regions on the "Art of Medicine" patient survey. Transformation stories emerged spontaneously during the interviews, and so it was an incidental finding when some physicians identified that they were not always high performing in their communication with patients. Seven different modes of transformation in communication were described by these physicians: a listening tool, an awareness course, finding new meaning in clinical practice, a technologic tool, a sudden insight, a mentor observation, and a physician-as-patient experience. These stories illustrate how communication skills can be learned through various activities and experiences that transform physicians into highly successful communicators. All modes result in a change of state, a new way of seeing and of being; they are not just a new tool or a new practice but a change in state of mind. This state resulted in a marked change of behavior and a substantial improvement in communication and relationships.

  4. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing

    PubMed Central

    Brown, David K.; Penkler, David L.; Musyoka, Thommas M.; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS. PMID:26280450

  5. Pressure variation of developed lapping tool on surface roughness

    NASA Astrophysics Data System (ADS)

    Hussain, A. K.; Lee, K. Q.; Aung, L. M.; Abu, A.; Tan, L. K.; Kang, H. S.

    2018-01-01

    Improving surface roughness is always one of the major concerns in the development of the lapping process, as high-precision machining is in great demand in manufacturing. This paper aims to investigate the performance of a newly designed lapping tool in terms of surface roughness. Polypropylene is used as the lapping tool head. The lapping tool is tested at different pressures to identify the optimum working pressure for the lapping process. The theoretical surface roughness is also calculated using the Vickers hardness. The present study shows that polypropylene is able to produce a smooth, good-quality surface finish. The optimum lapping pressure in the present study is found to be 45 MPa. By comparing the theoretical and experimental values, the present study shows that the newly designed lapping tool is capable of producing a finer surface roughness.
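
    The abstract mentions a theoretical roughness estimate derived from the Vickers hardness but does not give the formula. The sketch below is therefore only a hedged illustration of one possible route: use the standard Vickers relation HV = 1.8544 F / d^2 (F in kgf, d the indent diagonal in mm) and the indenter geometry (depth roughly d / 7) to estimate how deeply an abrasive grain indents the surface under a given load. Treating that depth as the roughness scale is an assumption for illustration, not necessarily the authors' model.

```python
import math

def vickers_indentation_depth(load_kgf, hv):
    """Depth of a Vickers indentation for a given load and hardness.

    HV = 1.8544 * F / d**2 (F in kgf, d = indent diagonal in mm); the
    Vickers pyramid geometry gives depth = d / (2*sqrt(2)*tan(68 deg)),
    i.e. roughly d / 7.  Interpreting this depth as a roughness scale
    for lapping is an assumption, not the cited paper's stated formula.
    """
    d = math.sqrt(1.8544 * load_kgf / hv)  # indent diagonal in mm
    depth_mm = d / (2.0 * math.sqrt(2.0) * math.tan(math.radians(68.0)))
    return depth_mm * 1000.0               # micrometres

# Toy numbers: 0.01 kgf per abrasive grain on a workpiece of 250 HV.
print(round(vickers_indentation_depth(0.01, 250.0), 3), "um")
```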

  6. 1999 NASA High-Speed Research Program Aerodynamic Performance Workshop. Volume 2; High Lift

    NASA Technical Reports Server (NTRS)

    Hahne, David E. (Editor)

    1999-01-01

    NASA's High-Speed Research Program sponsored the 1999 Aerodynamic Performance Technical Review on February 8-12, 1999 in Anaheim, California. The review was designed to bring together NASA and industry High-Speed Civil Transport (HSCT) Aerodynamic Performance technology development participants in the areas of Configuration Aerodynamics (transonic and supersonic cruise drag prediction and minimization), High Lift, and Flight Controls. The review objectives were to (1) report the progress and status of HSCT aerodynamic performance technology development; (2) disseminate this technology within the appropriate technical communities; and (3) promote synergy among the scientists and engineers working on HSCT aerodynamics. In particular, single and midpoint optimized HSCT configurations, HSCT high-lift system performance predictions, and HSCT simulation results were presented, along with executive summaries for all the Aerodynamic Performance technology areas. The HSR Aerodynamic Performance Technical Review was held simultaneously with the annual review of the following airframe technology areas: Materials and Structures, Environmental Impact, Flight Deck, and Technology Integration. Thus, a fourth objective of the Review was to promote synergy between the Aerodynamic Performance technology area and the other technology areas of the HSR Program. This Volume 2/Part 2 publication covers the tools and methods development session.

  7. A Study on Developing "An Attitude Scale for Project and Performance Tasks for Turkish Language Teaching Course"

    ERIC Educational Resources Information Center

    Demir, Tazegul

    2013-01-01

    The main purpose of this study is to demonstrate students' attitudes towards project and performance tasks in Turkish lessons and to develop a reliable and valid measurement tool. A total of 461 junior high school students participated in this study. In this study, items were first prepared and specialists were consulted (content…

  8. Validation and learning in the Procedicus KSA virtual reality surgical simulator.

    PubMed

    Ström, P; Kjellin, A; Hedman, L; Johnson, E; Wredmark, T; Felländer-Tsai, L

    2003-02-01

    Advanced simulator training within medicine is a rapidly growing field. Virtual reality simulators are being introduced as cost-saving educational tools, which also lead to increased patient safety. Fifteen medical students were included in the study. For 10 medical students performance was monitored, before and after 1 h of training, in two endoscopic simulators (the Procedicus KSA with haptic feedback and anatomical graphics and the established MIST simulator without this haptic feedback and graphics). Five medical students performed 50 tests in the Procedicus KSA in order to analyze learning curves. One of these five medical students performed multiple training sessions during 2 weeks and performed more than 300 tests. There was a significant improvement after 1 h of training regarding time, movement economy, and total score. The results in the two simulators were highly correlated. Our results show that the use of surgical simulators as a pedagogical tool in medical student training is encouraging. It shows rapid learning curves and our suggestion is to introduce endoscopic simulator training in undergraduate medical education during the course in surgery when motivation is high and before the development of "negative stereotypes" and incorrect practices.

  9. Vega-Constellation Tools to Analyze Hyperspectral Images

    NASA Astrophysics Data System (ADS)

    Savorskiy, V.; Loupian, E.; Balashov, I.; Kashnitskii, A.; Konstantinova, A.; Tolpin, V.; Uvarov, I.; Kuznetsov, O.; Maklakov, S.; Panova, O.; Savchenko, E.

    2016-06-01

    Creating high-performance means of managing massive hyperspectral data (HSD) arrays is a pressing challenge, particularly when dealing with disparate information resources. To address this problem, the present work develops tools for working with HSD in a distributed information infrastructure, i.e., primarily for use in remote access mode. The main feature of the presented approach is the development of remotely accessible services that allow users both to conduct search and retrieval procedures on HSD sets and to analyze and process HSD remotely. These services were implemented within the VEGA-Constellation family of information systems, which were extended with tools oriented toward supporting studies of certain classes of natural objects through their HSD. The developed tools provide capabilities for analyzing objects such as vegetation canopies (forest and agricultural), open soils, forest fires, and areas of thermal anomalies. The developed software tools were successfully tested on Hyperion data sets.

  10. Structural considerations for fabrication and mounting of the AXAF HRMA optics

    NASA Technical Reports Server (NTRS)

    Cohen, Lester M.; Cernoch, Larry; Mathews, Gary; Stallcup, Michael

    1990-01-01

    A methodology is described which minimizes optics distortion in the fabrication, metrology, and launch configuration phases. The significance of finite element modeling and breadboard testing is described with respect to performance analyses of support structures and material effects in NASA's AXAF X-ray optics. The paper outlines the requirements for AXAF performance, optical fabrication, metrology, and glass support fixtures, as well as the specifications for mirror sensitivity and the high-resolution mirror assembly. Analytical modeling of the tools is shown to coincide with grinding and polishing experiments, and is useful for designing large-area polishing and grinding tools. Metrological subcomponents that have undergone initial testing show evidence of meeting force requirements.

  11. Drilling High Precision Holes in Ti6Al4V Using Rotary Ultrasonic Machining and Uncertainties Underlying Cutting Force, Tool Wear, and Production Inaccuracies.

    PubMed

    Chowdhury, M A K; Sharif Ullah, A M M; Anwar, Saqib

    2017-09-12

    Ti6Al4V alloys are difficult-to-cut materials that have extensive applications in the automotive and aerospace industries. A great deal of effort has been made to develop and improve the machining of Ti6Al4V alloys. This paper presents an experimental study that systematically analyzes the effects of the machining conditions (ultrasonic power, feed rate, spindle speed, and tool diameter) on the performance parameters (cutting force, tool wear, overcut error, and cylindricity error) while drilling high-precision holes in workpieces made of Ti6Al4V alloys using rotary ultrasonic machining (RUM). Numerical results were obtained by conducting experiments following a design-of-experiments procedure. The effects of the machining conditions on each performance parameter have been determined by constructing a set of possibility distributions (i.e., trapezoidal fuzzy numbers) from the experimental data. A possibility distribution is a probability-distribution-neutral representation of uncertainty and is effective in quantifying the uncertainty underlying physical quantities when only a limited number of data points is available, which is the case here. Lastly, the optimal machining conditions have been identified using these possibility distributions.
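
    The abstract describes constructing trapezoidal fuzzy numbers (possibility distributions) from small sets of experimental measurements. The sketch below shows one common construction, taking the data extremes as the support and an inner quantile band as the core; whether the authors used exactly this construction is not stated in the abstract, so treat it as an illustrative assumption.

```python
def trapezoidal_from_data(samples, core_fraction=0.5):
    """Build a trapezoidal fuzzy number (a, b, c, d) from measurements.

    a, d: support bounds (min and max of the data);
    b, c: core bounds, here chosen as an inner quantile band.  The exact
    construction used in the cited study is not specified in the abstract.
    """
    xs = sorted(samples)
    a, d = xs[0], xs[-1]
    lo = (1.0 - core_fraction) / 2.0
    b = xs[int(lo * (len(xs) - 1))]
    c = xs[int((1.0 - lo) * (len(xs) - 1))]
    return a, b, c, d

def membership(x, trap):
    """Membership grade of x in a trapezoidal fuzzy number (a, b, c, d)."""
    a, b, c, d = trap
    if x < a or x > d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a) if b > a else 1.0
    return (d - x) / (d - c) if d > c else 1.0

cutting_force = [118.0, 122.5, 125.1, 127.8, 131.4, 135.9]  # toy data (N)
trap = trapezoidal_from_data(cutting_force)
print(trap, membership(126.0, trap))
```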

  12. Development of an interprofessional lean facilitator assessment scale.

    PubMed

    Bravo-Sanchez, Cindy; Dorazio, Vincent; Denmark, Robert; Heuer, Albert J; Parrott, J Scott

    2018-05-01

    High reliability is important for optimising quality and safety in healthcare organisations. Reliability efforts include interprofessional collaborative practice (IPCP) and Lean quality/process improvement strategies, which require skilful facilitation. Currently, no validated Lean facilitator assessment tool for interprofessional collaboration exists. This article describes the development and pilot evaluation of such a tool; the Interprofessional Lean Facilitator Assessment Scale (ILFAS), which measures both technical and 'soft' skills, which have not been measured in other instruments. The ILFAS was developed using methodologies and principles from Lean/Shingo, IPCP, metacognition research and Bloom's Taxonomy of Learning Domains. A panel of experts confirmed the initial face validity of the instrument. Researchers independently assessed five facilitators, during six Lean sessions. Analysis included quantitative evaluation of rater agreement. Overall inter-rater agreement of the assessment of facilitator performance was high (92%), and discrepancies in the agreement statistics were analysed. Face and content validity were further established, and usability was evaluated, through primary stakeholder post-pilot feedback, uncovering minor concerns, leading to tool revision. The ILFAS appears comprehensive in the assessment of facilitator knowledge, skills, abilities, and may be useful in the discrimination between facilitators of different skill levels. Further study is needed to explore instrument performance and validity.
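
    The pilot evaluation in this record hinges on inter-rater agreement (reported as 92% overall). The sketch below computes simple percent agreement, plus Cohen's kappa as a common chance-corrected companion statistic; the abstract reports only the agreement percentage, and the ratings below are invented for illustration.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two raters give the same rating."""
    matches = sum(1 for x, y in zip(rater_a, rater_b) if x == y)
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on categorical ratings."""
    n = len(rater_a)
    p_obs = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_exp = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

a = [3, 4, 4, 2, 3, 4, 3, 3, 4, 2]  # toy ratings, not the study's data
b = [3, 4, 3, 2, 3, 4, 3, 3, 4, 2]
print(percent_agreement(a, b), cohens_kappa(a, b))
```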

  13. The Nanoelectric Modeling Tool (NEMO) and Its Expansion to High Performance Parallel Computing

    NASA Technical Reports Server (NTRS)

    Klimeck, G.; Bowen, C.; Boykin, T.; Oyafuso, F.; Salazar-Lazaro, C.; Stoica, A.; Cwik, T.

    1998-01-01

    Material variations on an atomic scale enable the quantum mechanical functionality of devices such as resonant tunneling diodes (RTDs), quantum well infrared photodetectors (QWIPs), quantum well lasers, and heterostructure field effect transistors (HFETs).

  14. Integrated corridor management modeling results report : Dallas, Minneapolis, and San Diego.

    DOT National Transportation Integrated Search

    2012-02-01

    This executive summary documents the analysis methodologies, tools, and performance measures used to analyze Integrated Corridor Management (ICM) strategies; and presents high-level results for the successful implementation of ICM at three Stage 2 Pi...

  15. The Effect of Color Choice on Learner Interpretation of a Cosmology Visualization

    ERIC Educational Resources Information Center

    Buck, Zoe

    2013-01-01

    As we turn more and more to high-end computing to understand the Universe at cosmological scales, dynamic visualizations of simulations will take on a vital role as perceptual and cognitive tools. In collaboration with the Adler Planetarium and University of California High-Performance AstroComputing Center (UC-HiPACC), I am interested in better…

  16. Development and validation of a web-based questionnaire for surveying the health and working conditions of high-performance marine craft populations.

    PubMed

    de Alwis, Manudul Pahansen; Lo Martire, Riccardo; Äng, Björn O; Garme, Karl

    2016-06-20

    High-performance marine craft crews are susceptible to various adverse health conditions caused by multiple interactive factors. However, there are limited epidemiological data available for assessment of working conditions at sea. Although questionnaire surveys are widely used for identifying exposures, outcomes and associated risks with high accuracy levels, until now no validated epidemiological tool has existed for surveying occupational health and performance in these populations. The aim was to develop and validate a web-based questionnaire for epidemiological assessment of occupational and individual risk exposure pertinent to musculoskeletal health conditions and performance in high-performance marine craft populations. A questionnaire for investigating the association between work-related exposure, performance and health was initially developed by a consensus panel under four subdomains, viz. demography, lifestyle, work exposure and health, and systematically validated by expert raters for content relevance and simplicity in three consecutive stages, each iteratively followed by a consensus panel revision. The item content validity index (I-CVI) was determined as the proportion of experts giving a rating of 3 or 4. The scale content validity index (S-CVI/Ave) was computed by averaging the I-CVIs for the assessment of the questionnaire as a tool. Finally, the questionnaire was pilot tested. The S-CVI/Ave increased from 0.89 to 0.96 for relevance and from 0.76 to 0.94 for simplicity, resulting in 36 items in the final questionnaire. The pilot test confirmed the feasibility of the questionnaire. The present study shows that the web-based questionnaire fulfils previously published validity acceptance criteria and is therefore considered valid and feasible for the empirical surveying of epidemiological aspects among high-performance marine craft crews and similar populations.

  17. A Comparative Study of Interval Management Control Law Capabilities

    NASA Technical Reports Server (NTRS)

    Barmore, Bryan E.; Smith, Colin L.; Palmer, Susan O.; Abbott, Terence S.

    2012-01-01

    This paper presents a new tool designed to allow for rapid development and testing of different control algorithms for airborne spacing. This tool, the Interval Management Modeling and Spacing Tool (IM MAST), is a fast-time, low-fidelity tool created to model the approach of aircraft to a runway, with a focus on their interactions with each other. Errors can be induced between pairs of aircraft by varying initial positions, winds, speed profiles, and altitude profiles. Results to date show that only a few of the algorithms tested had poor behavior in the arrival and approach environment. The majority of the algorithms showed only minimal variation in performance under the test conditions. Trajectory-based algorithms showed high susceptibility to wind forecast errors, while performing marginally better than the other algorithms under other conditions. Trajectory-based algorithms have a sizable advantage, however, of being able to perform relative spacing operations between aircraft on different arrival routes and flight profiles without employing ghosting methods. This comes at the cost of substantially increased complexity, however. Additionally, it was shown that earlier initiation of relative spacing operations provided more time for corrections to be made without any significant problems in the spacing operation itself. Initiating spacing farther out, however, would require more of the aircraft to begin spacing before they merge onto a common route.

  18. Highly scalable parallel processing of extracellular recordings of Multielectrode Arrays.

    PubMed

    Gehring, Tiago V; Vasilaki, Eleni; Giugliano, Michele

    2015-01-01

    Technological advances of Multielectrode Arrays (MEAs) used for multisite, parallel electrophysiological recordings lead to an ever increasing amount of raw data being generated. Arrays with hundreds up to a few thousand electrodes are slowly seeing widespread use, and the expectation is that more sophisticated arrays will become available in the near future. In order to process the large data volumes resulting from MEA recordings, there is a pressing need for new software tools able to process many data channels in parallel. Here we present a new tool for processing MEA data recordings that makes use of new programming paradigms and recent technology developments to unleash the power of modern highly parallel hardware, such as multi-core CPUs with vector instruction sets or GPGPUs. Our tool builds on and complements existing MEA data analysis packages. It shows high scalability and can be used to speed up some performance-critical pre-processing steps, such as data filtering and spike detection, helping to make the analysis of larger data sets tractable.
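
    To illustrate the channel-parallel pre-processing pattern described above, the following sketch (a generic illustration, not the authors' package) filters each channel and detects spikes by threshold crossing, with one worker process per group of channels.

      # Generic illustration of channel-parallel MEA pre-processing (not the authors'
      # package): band-limit each channel with a simple moving-average high-pass and
      # detect spikes by threshold crossing, distributing channels across processes.
      import numpy as np
      from multiprocessing import Pool

      def detect_spikes(channel, thresh_sd=5.0):
          baseline = np.convolve(channel, np.ones(25) / 25, mode="same")
          hp = channel - baseline                               # crude high-pass
          thresh = thresh_sd * np.median(np.abs(hp)) / 0.6745   # robust noise estimate
          # indices where the signal crosses below the negative threshold
          return np.flatnonzero((hp[1:] < -thresh) & (hp[:-1] >= -thresh))

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          data = rng.normal(0, 10, size=(64, 200_000))          # 64 synthetic channels
          with Pool() as pool:
              spikes_per_channel = pool.map(detect_spikes, list(data))
          print([len(s) for s in spikes_per_channel[:4]])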

  19. Determination of high-strength materials diamond grinding rational modes

    NASA Astrophysics Data System (ADS)

    Arkhipov, P. V.; Lobanov, D. V.; Rychkov, D. A.; Yanyushkin, A. S.

    2018-03-01

    The analysis of methods for the abrasive processing of high-strength materials is carried out. This analysis made it possible to determine the necessary directions and prospects for the development of combined shaping methods. The need to use metal-bonded diamond abrasive tools in combination with a different kind of energy is noted, in order to improve processing efficiency and reduce the complexity of operations. A complex of experimental research is performed to reveal the relative importance of the mechanical and electrical components of the cutting regimes for the cutting ability of diamond tools, and to reduce the specific consumption of the abrasive wheel as one of the important economic indicators of the process. It is established that combined diamond grinding with simultaneous continuous correction of the abrasive wheel increases the cutting ability of metal-bonded diamond abrasive tools when processing high-strength materials by an average of 30% compared to conventional diamond grinding. Particular recommendations on the choice of technological factors are developed depending on specific production problems.

  20. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data are collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked, so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics such as clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.
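
    For readers unfamiliar with this kind of view, the sketch below (a generic Python illustration with synthetic data, not SEURAT itself, which is an interactive tool) produces the clustered-heatmap display that such visual-analytics packages typically link to their other graphics.

      # Generic illustration (not SEURAT itself): hierarchical clustering of a
      # gene-expression matrix and a clustered heatmap of the reordered data.
      import numpy as np
      import matplotlib.pyplot as plt
      from scipy.cluster.hierarchy import linkage, leaves_list

      rng = np.random.default_rng(1)
      expr = rng.normal(size=(50, 12))                 # 50 genes x 12 samples (synthetic)

      gene_order = leaves_list(linkage(expr, method="average"))
      sample_order = leaves_list(linkage(expr.T, method="average"))

      plt.imshow(expr[np.ix_(gene_order, sample_order)], aspect="auto", cmap="RdBu_r")
      plt.xlabel("samples (clustered)")
      plt.ylabel("genes (clustered)")
      plt.colorbar(label="expression")
      plt.savefig("heatmap.png")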

  1. Key Topics for High-Lift Research: A Joint Wind Tunnel/Flight Test Approach

    NASA Technical Reports Server (NTRS)

    Fisher, David; Thomas, Flint O.; Nelson, Robert C.

    1996-01-01

    Future high-lift systems must achieve improved aerodynamic performance with simpler designs that involve fewer elements and reduced maintenance costs. To expeditiously achieve this, reliable CFD design tools are required. The development of useful CFD-based design tools for high lift systems requires increased attention to unresolved flow physics issues. The complex flow field over any multi-element airfoil may be broken down into certain generic component flows which are termed high-lift building block flows. In this report a broad spectrum of key flow field physics issues relevant to the design of improved high lift systems are considered. It is demonstrated that in-flight experiments utilizing the NASA Dryden Flight Test Fixture (which is essentially an instrumented ventral fin) carried on an F-15B support aircraft can provide a novel and cost effective method by which both Reynolds and Mach number effects associated with specific high lift building block flows can be investigated. These in-flight high lift building block flow experiments are most effective when performed in conjunction with coordinated ground based wind tunnel experiments in low speed facilities. For illustrative purposes three specific examples of in-flight high lift building block flow experiments capable of yielding a high payoff are described. The report concludes with a description of a joint wind tunnel/flight test approach to high lift aerodynamics research.

  2. Moisture Performance of High-R Wall Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shah, Nay B.; Kochkin, Vladimir

    High-performance homes offer improved comfort, lower utility bills, and assured durability. The next generation of building enclosures is a key step toward achieving high-performance goals through decreasing energy load demand and enabling advanced space-conditioning systems. Yet the adoption of high-R enclosures and particularly high-R walls has been a slow-growing trend because mainstream builders are hesitant to make the transition. In a survey of builders on this topic, one of the challenges identified is an industry-wide concern about the long-term moisture performance of energy-efficient walls. This study takes a step toward addressing this concern through direct monitoring of the moisture performance of high-R walls in occupied homes in several climate zones. In addition, the robustness of the design and modeling tools for selecting high-R wall solutions is evaluated using the monitored data from the field. The information and knowledge gained through this research will provide an objective basis for decision-making so that builders can implement advanced designs with confidence.

  3. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling, and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example application for structural design of a conventional aircraft and a high altitude long endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  4. High density plasmas and new diagnostics: An overview (invited).

    PubMed

    Celona, L; Gammino, S; Mascali, D

    2016-02-01

    One of the limiting factors for the full understanding of Electron Cyclotron Resonance Ion Source (ECRIS) fundamental mechanisms is the small number of diagnostic tools so far available for such compact machines. Microwave-to-plasma coupling optimisation, new methods of density overboost provided by plasma wave generation, and magnetostatic field tailoring for generating a proper electron energy distribution function, suitable for optimal ion beam formation, require diagnostic tools spanning the entire electromagnetic spectrum, from microwave interferometry to X-ray spectroscopy; these methods are going to be implemented, including high resolution and spatially resolved X-ray spectroscopy made by quasi-optical methods (pin-hole cameras). The ion confinement optimisation also requires complete control of cold electron displacement, which can be performed by optical emission spectroscopy. Several diagnostic tools have been recently developed at INFN-LNS, including "volume-integrated" X-ray spectroscopy in the low energy domain (2-30 keV, by using silicon drift detectors) or the high energy regime (>30 keV, by using high purity germanium detectors). For the direct detection of the spatially resolved spectral distribution of X-rays produced by the electronic motion, a "pin-hole camera" has been developed, also drawing on previous experience in the ECRIS field. The paper will give an overview of the INFN-LNS strategy in terms of new microwave-to-plasma coupling schemes and advanced diagnostics supporting the design of new ion sources and the optimization of the performance of existing ones, with the goal of a microwave-absorption oriented design of future machines.

  5. Technology development plan: Geotechnical survey systems for OTEC (Ocean Thermal Energy Conversion) cold water pipes

    NASA Astrophysics Data System (ADS)

    Valent, Philip J.; Riggins, Michael

    1989-04-01

    An overview is given of current and developing technologies and techniques for performing geotechnical investigations for siting and designing Cold Water Pipes (CWP) for shelf-resting Ocean Thermal Energy Conversion (OTEC) power plants. The geotechnical in situ tools used to measure the required parameters and the equipment/systems used to deploy these tools are identified. The capabilities of these geotechnical tools and deployment systems are compared to the data requirements for the CWP foundation/anchor design, and shortfalls are identified. For the last phase of geotechnical data gathering for design, a drillship will be required to perform soil boring work, to obtain required high quality sediment samples for laboratory dynamic testing, and to perform deep penetration in situ tests. To remedy shortfalls and to reduce the future OTEC CWP geotechnical survey costs, it is recommended that a seafloor resting machine be developed to advance the friction cone penetrometer, and also probably a pressuremeter, to provide geotechnical parameters to shallow subseafloor penetrations on slopes of 35 deg and in water depths to 1300 m.

  6. A model of motor performance during surface penetration: from physics to voluntary control.

    PubMed

    Klatzky, Roberta L; Gershon, Pnina; Shivaprabhu, Vikas; Lee, Randy; Wu, Bing; Stetten, George; Swendsen, Robert H

    2013-10-01

    The act of puncturing a surface with a hand-held tool is a ubiquitous but complex motor behavior that requires precise force control to avoid potentially severe consequences. We present a detailed model of puncture over a time course of approximately 1,000 ms, which is fit to kinematic data from individual punctures, obtained via a simulation with high-fidelity force feedback. The model describes puncture as proceeding from purely physically determined interactions between the surface and tool, through decline of force due to biomechanical viscosity, to cortically mediated voluntary control. When fit to the data, it yields parameters for the inertial mass of the tool/person coupling, time characteristic of force decline, onset of active braking, stopping time and distance, and late oscillatory behavior, all of which the analysis relates to physical variables manipulated in the simulation. While the present data characterize distinct phases of motor performance in a group of healthy young adults, the approach could potentially be extended to quantify the performance of individuals from other populations, e.g., with sensory-motor impairments. Applications to surgical force control devices are also considered.

  7. Performance index: An expeditious tool to screen for improved drought resistance in the Lathyrus genus.

    PubMed

    Silvestre, Susana; Araújo, Susana de Sousa; Vaz Patto, Maria Carlota; Marques da Silva, Jorge

    2014-07-01

    Some species of the Lathyrus genus are among the most promising crops for marginal lands, with high resilience to drought, flood, and fungal diseases, combined with high yields and seed nutritional value. However, lack of knowledge on the mechanisms underlying its outstanding performance, and of methodologies to identify elite genotypes, has hampered its proper use in breeding. The chlorophyll a fast fluorescence transient (JIP test) was used to evaluate water deficit (WD) resistance in the Lathyrus genus. Our results reveal unaltered photochemical values for all studied genotypes, showing resistance to mild WD. Under severe WD, two Lathyrus sativus genotypes showed remarkable resilience, maintaining photochemical efficiency, contrary to the other genotypes studied. The performance index (PIABS) is the best parameter to screen genotypes with improved performance and grain production under WD. Moreover, we found that JIP indices are good indicators of genotypic grain production under WD. The quantum yield of electron transport (ϕEo) and the efficiency with which trapped excitons can move electrons further than QA (ψ0) were revealed as important traits related to improved photosynthetic performance and should be exploited in future Lathyrus germplasm improvements. The JIP test herein described proved to be an expeditious tool to screen for and identify elite genotypes with improved drought resistance.
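
    For reference, the absorption-based performance index used in the JIP test is commonly written as follows (standard JIP-test notation from the literature; the record above does not spell out the definition, so treat this as background rather than the authors' exact formula):

      \[
        \mathrm{PI_{ABS}} \;=\; \frac{\mathrm{RC}}{\mathrm{ABS}} \cdot
        \frac{\varphi_{Po}}{1-\varphi_{Po}} \cdot
        \frac{\psi_{Eo}}{1-\psi_{Eo}}
      \]

    where RC/ABS is the density of active reaction centres per absorbed photon flux, φPo is the maximum quantum yield of primary photochemistry, and ψEo is the efficiency with which a trapped exciton moves an electron beyond QA.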

  8. SU-F-T-459: ArcCHECK Machine QA : Highly Efficient Quality Assurance Tool for VMAT, SRS & SBRT Linear Accelerator Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhatre, V; Patwe, P; Dandekar, P

    Purpose: Quality assurance (QA) of complex linear accelerators is critical and highly time consuming. The ArcCHECK Machine QA tool is used to test geometric and delivery aspects of the linear accelerator. In this study we evaluated the performance of this tool. Methods: The Machine QA feature allows the user to perform quality assurance tests using the ArcCHECK phantom. The following tests were performed: 1) Gantry Speed, 2) Gantry Rotation, 3) Gantry Angle, 4) MLC/Collimator QA, 5) Beam Profile Flatness & Symmetry. Data were collected on a TrueBeam STx machine for 6 MV over a period of one year. The Gantry QA test allows the user to view errors in gantry angle and rotation and to assess how accurately the gantry moves around the isocentre. The MLC/Collimator QA tool is used to analyze and locate the differences between the leaf bank and jaw positions of the linac. The flatness and symmetry test quantifies beam flatness and symmetry in the IEC-y and x directions. The Gantry and Flatness/Symmetry tests can be performed for static and dynamic delivery. Results: The gantry speed was 3.9 deg/sec with a maximum speed deviation of around 0.3 deg/sec. The gantry isocentre was 0.9 mm for arc delivery and 0.4 mm for static delivery. The maximum positive and negative percent differences were 1.9% and -0.25%, and the maximum positive and negative distance differences were 0.4 mm and -0.3 mm for MLC/Collimator QA. The flatness for arc delivery was 1.8%, and the symmetry was 0.8% for Y and 1.8% for X. The flatness for gantry 0°, 270°, 90°, and 180° was 1.75%, 1.9%, 1.8%, and 1.6% respectively, and the symmetry for X and Y was 0.8%, 0.6% for 0°; 0.6%, 0.7% for 270°; 0.6%, 1% for 90°; and 0.6%, 0.7% for 180°. Conclusion: ArcCHECK Machine QA is a useful tool for QA of modern linear accelerators as it tests both geometric and delivery aspects. This is very important for VMAT, SRS, and SBRT treatments.
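
    Flatness and symmetry definitions vary between protocols; the sketch below shows one common variant computed from a measured beam profile, purely as an illustration and not necessarily the formulation used by the ArcCHECK software.

      # One common way to compute beam flatness and symmetry from a profile
      # (illustrative variant; protocol definitions differ).
      import numpy as np

      def flatness_symmetry(profile, field_fraction=0.8):
          """profile: dose samples across the field, assumed centred on the beam axis."""
          n = len(profile)
          k = int(n * field_fraction / 2)
          centre = n // 2
          core = np.asarray(profile[centre - k:centre + k + 1], dtype=float)
          flatness = 100.0 * (core.max() - core.min()) / (core.max() + core.min())
          left, right = core[:k], core[:-k - 1:-1]      # mirror-image halves
          symmetry = 100.0 * np.max(np.abs(left - right)) / core[k]
          return flatness, symmetry

      profile = 100 - 0.002 * np.arange(-40, 41) ** 2    # synthetic, slightly rounded profile
      print(flatness_symmetry(profile))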

  9. Tribological performances of new steel grades for hot stamping tools

    NASA Astrophysics Data System (ADS)

    Medea, F.; Venturato, G.; Ghiotti, A.; Bruschi, S.

    2017-09-01

    In recent years, the use of High Strength Steels (HSS) for structural parts in car body-in-white manufacturing has rapidly increased thanks to their favourable strength-to-weight ratio and stiffness, which allow a reduction of fuel consumption to meet the new, more restrictive regulations for CO2 emissions control. A survey of the technical and scientific literature shows a large interest in the development of different coatings for the blanks, from the traditional Al-Si up to new Zn-based coatings, and in the analysis of hard PVD and CVD coatings and plasma nitriding applied to the tools. By contrast, fewer investigations have focused on the development and testing of new tool steel grades capable of improving the wear resistance and the thermal properties that are required for in-die quenching during forming. On this basis, the paper deals with the analysis and comparison of the tribological performance, in terms of wear, friction and heat transfer, of new tool steel grades for high-temperature applications, characterized by a higher thermal conductivity than the commonly used tools. Testing equipment and procedures, as well as the measurement analyses used to evaluate the friction coefficient, wear and heat transfer phenomena, are presented. Emphasis is given to the physical simulation techniques that were specifically developed to reproduce the thermal and mechanical cycles on the metal sheets and dies as in industrial practice. The reference industrial process is the direct hot stamping of the 22MnB5 HSS coated with the common Al-Si coating for automotive applications.

  10. Spinoff 2011

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Topics include: Bioreactors Drive Advances in Tissue Engineering; Tooling Techniques Enhance Medical Imaging; Ventilator Technologies Sustain Critically Injured Patients; Protein Innovations Advance Drug Treatments, Skin Care; Mass Analyzers Facilitate Research on Addiction; Frameworks Coordinate Scientific Data Management; Cameras Improve Navigation for Pilots, Drivers; Integrated Design Tools Reduce Risk, Cost; Advisory Systems Save Time, Fuel for Airlines; Modeling Programs Increase Aircraft Design Safety; Fly-by-Wire Systems Enable Safer, More Efficient Flight; Modified Fittings Enhance Industrial Safety; Simulation Tools Model Icing for Aircraft Design; Information Systems Coordinate Emergency Management; Imaging Systems Provide Maps for U.S. Soldiers; High-Pressure Systems Suppress Fires in Seconds; Alloy-Enhanced Fans Maintain Fresh Air in Tunnels; Control Algorithms Charge Batteries Faster; Software Programs Derive Measurements from Photographs; Retrofits Convert Gas Vehicles into Hybrids; NASA Missions Inspire Online Video Games; Monitors Track Vital Signs for Fitness and Safety; Thermal Components Boost Performance of HVAC Systems; World Wind Tools Reveal Environmental Change; Analyzers Measure Greenhouse Gasses, Airborne Pollutants; Remediation Technologies Eliminate Contaminants; Receivers Gather Data for Climate, Weather Prediction; Coating Processes Boost Performance of Solar Cells; Analyzers Provide Water Security in Space and on Earth; Catalyst Substrates Remove Contaminants, Produce Fuel; Rocket Engine Innovations Advance Clean Energy; Technologies Render Views of Earth for Virtual Navigation; Content Platforms Meet Data Storage, Retrieval Needs; Tools Ensure Reliability of Critical Software; Electronic Handbooks Simplify Process Management; Software Innovations Speed Scientific Computing; Controller Chips Preserve Microprocessor Function; Nanotube Production Devices Expand Research Capabilities; Custom Machines Advance Composite Manufacturing; Polyimide Foams Offer Superior Insulation; Beam Steering Devices Reduce Payload Weight; Models Support Energy-Saving Microwave Technologies; Materials Advance Chemical Propulsion Technology; and High-Temperature Coatings Offer Energy Savings.

  11. Alkemio: association of chemicals with biomedical topics by text and data mining

    PubMed Central

    Gijón-Correas, José A.; Andrade-Navarro, Miguel A.; Fontaine, Jean F.

    2014-01-01

    The PubMed® database of biomedical citations allows the retrieval of scientific articles studying the function of chemicals in biology and medicine. Mining millions of available citations to search reported associations between chemicals and topics of interest would require substantial human time. We have implemented the Alkemio text mining web tool and SOAP web service to help in this task. The tool uses biomedical articles discussing chemicals (including drugs), predicts their relatedness to the query topic with a naïve Bayesian classifier and ranks all chemicals by P-values computed from random simulations. Benchmarks on seven human pathways showed good retrieval performance (areas under the receiver operating characteristic curves ranged from 73.6 to 94.5%). Comparison with existing tools to retrieve chemicals associated to eight diseases showed the higher precision and recall of Alkemio when considering the top 10 candidate chemicals. Alkemio is a high performing web tool ranking chemicals for any biomedical topics and it is free to non-commercial users. Availability: http://cbdm.mdc-berlin.de/∼medlineranker/cms/alkemio. PMID:24838570
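
    The ranking step described above (turning classifier scores into P-values via random simulations) can be illustrated with a small permutation-style sketch; the scores and chemicals below are made up and this is not Alkemio's implementation.

      # Illustrative sketch (not Alkemio's code): convert classifier scores for
      # chemicals into empirical P-values against a null distribution of scores
      # obtained from random simulations, then rank by P-value.
      import numpy as np

      def empirical_p_values(scores, null_scores):
          """P-value = fraction of null scores at least as large as the observed score."""
          null_scores = np.sort(null_scores)
          n = len(null_scores)
          ranks = np.searchsorted(null_scores, scores, side="left")
          return (n - ranks + 1) / (n + 1)              # add-one to avoid P = 0

      rng = np.random.default_rng(0)
      observed = {"aspirin": 3.1, "caffeine": 0.4, "ibuprofen": 2.2}   # hypothetical scores
      null = rng.normal(0, 1, size=10_000)                             # scores from random simulations
      pvals = {c: empirical_p_values(np.array([s]), null)[0] for c, s in observed.items()}
      print(sorted(pvals.items(), key=lambda kv: kv[1]))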

  12. Integrated modeling of advanced optical systems

    NASA Astrophysics Data System (ADS)

    Briggs, Hugh C.; Needels, Laura; Levine, B. Martin

    1993-02-01

    This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.

  13. Automation bias: decision making and performance in high-tech cockpits.

    PubMed

    Mosier, K L; Skitka, L J; Heers, S; Burdick, M

    1997-01-01

    Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that those pilots who reported an internalized perception of "accountability" for their performance and strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues and less likely to commit errors than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.

  14. Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools

    NASA Technical Reports Server (NTRS)

    Orr, Stanley A.; Narducci, Robert P.

    2009-01-01

    A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.

  15. Predicting the Consequences of Workload Management Strategies with Human Performance Modeling

    NASA Technical Reports Server (NTRS)

    Mitchell, Diane Kuhl; Samma, Charneta

    2011-01-01

    Human performance modelers at the US Army Research Laboratory have developed an approach for establishing Soldier high-workload conditions that can be used in analyses of proposed system designs. Their technique includes three key components. To implement the approach in an experiment, the researcher would create two experimental conditions: a baseline and a design alternative. Next, they would identify a scenario in which the test participants perform all their representative concurrent interactions with the system. This scenario should include any events that would trigger a different set of goals for the human operators. They would collect workload values during both the control and alternative design conditions to see if the alternative increased workload and decreased performance. They have successfully implemented this approach for military vehicle designs using the human performance modeling tool IMPRINT. Although ARL researchers use IMPRINT to implement their approach, it can be applied to any workload analysis. Researchers using other modeling and simulation tools, or conducting experiments or field tests, can use the same approach.

  16. A high power ion thruster for deep space missions

    NASA Astrophysics Data System (ADS)

    Polk, James E.; Goebel, Dan M.; Snyder, John S.; Schneider, Analyn C.; Johnson, Lee K.; Sengupta, Anita

    2012-07-01

    The Nuclear Electric Xenon Ion System ion thruster was developed for potential outer planet robotic missions using nuclear electric propulsion (NEP). This engine was designed to operate at power levels ranging from 13 to 28 kW at specific impulses of 6000-8500 s and for burn times of up to 10 years. State-of-the-art performance and life assessment tools were used to design the thruster, which featured 57-cm-diameter carbon-carbon composite grids operating at voltages of 3.5-6.5 kV. Preliminary validation of the thruster performance was accomplished with a laboratory model thruster, while in parallel, a flight-like development model (DM) thruster was completed and two DM thrusters fabricated. The first thruster completed full performance testing and a 2000-h wear test. The second successfully completed vibration tests at the full protoflight levels defined for this NEP program and then passed performance validation testing. The thruster design, performance, and the experimental validation of the design tools are discussed in this paper.

  17. A high power ion thruster for deep space missions.

    PubMed

    Polk, James E; Goebel, Dan M; Snyder, John S; Schneider, Analyn C; Johnson, Lee K; Sengupta, Anita

    2012-07-01

    The Nuclear Electric Xenon Ion System ion thruster was developed for potential outer planet robotic missions using nuclear electric propulsion (NEP). This engine was designed to operate at power levels ranging from 13 to 28 kW at specific impulses of 6000-8500 s and for burn times of up to 10 years. State-of-the-art performance and life assessment tools were used to design the thruster, which featured 57-cm-diameter carbon-carbon composite grids operating at voltages of 3.5-6.5 kV. Preliminary validation of the thruster performance was accomplished with a laboratory model thruster, while in parallel, a flight-like development model (DM) thruster was completed and two DM thrusters fabricated. The first thruster completed full performance testing and a 2000-h wear test. The second successfully completed vibration tests at the full protoflight levels defined for this NEP program and then passed performance validation testing. The thruster design, performance, and the experimental validation of the design tools are discussed in this paper.

  18. A Computational Framework for Efficient Low Temperature Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

    2016-10-01

    Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems involving LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested based on numerical results assessing the accuracy and efficiency of benchmarks for problems in microdischarge devices. A numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

  19. ENVI-PV: An Interactive Web Client for Multi-Criteria Life Cycle Assessment of Photovoltaic Systems Worldwide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Lopez, Paula; Gschwind, Benoit; Blanc, Philippe

    Solar photovoltaics (PV) is the second largest source of new capacity among renewable energies. The worldwide capacity reached 135 GW in 2013 and is estimated to increase to 1721 GW in 2030 and 4674 GW in 2050, according to a prospective high-renewable scenario. To achieve this production level while minimizing environmental impacts, decision makers must have access to environmental performance data that reflect their high spatial variability accurately. We propose ENVI-PV (http://viewer.webservice-energy.org/project_iea), a new interactive tool that provides maps and screening-level data, based on weighted average supply chains, for the environmental performance of common PV technologies. Environmental impacts of PV systems are evaluated according to a life cycle assessment approach. ENVI-PV was developed using a state-of-the-art interoperable and open standard Web Service framework from the Open Geospatial Consortium (OGC). It combines the latest life cycle inventories, published in 2015 by the International Energy Agency (IEA) under the Photovoltaic Power Systems Program (PVPS) Task 12, and some inventories previously published in the Ecoinvent v2.2 database, with solar irradiation estimates computed from the worldwide NASA SSE database. ENVI-PV is the first tool to propose worldwide coverage of the environmental performance of PV systems using a multi-criteria assessment. The user can compare the PV environmental performance to the environmental footprint of country electricity mixes. ENVI-PV is designed as an interactive environmental tool to generate PV technological options and evaluate their performance in different spatial and techno-economic contexts. Its potential applications are illustrated in this paper with several examples.

  20. Blinded evaluation of interrater reliability of an operative competency assessment tool for direct laryngoscopy and rigid bronchoscopy.

    PubMed

    Ishman, Stacey L; Benke, James R; Johnson, Kaalan Erik; Zur, Karen B; Jacobs, Ian N; Thorne, Marc C; Brown, David J; Lin, Sandra Y; Bhatti, Nasir; Deutsch, Ellen S

    2012-10-01

    OBJECTIVES To confirm interrater reliability using blinded evaluation of a skills-assessment instrument to assess the surgical performance of resident and fellow trainees performing pediatric direct laryngoscopy and rigid bronchoscopy in simulated models. DESIGN Prospective, paired, blinded observational validation study. SUBJECTS Paired observers from multiple institutions simultaneously evaluated residents and fellows who were performing surgery in an animal laboratory or using high-fidelity manikins. The evaluators had no previous affiliation with the residents and fellows and did not know their year of training. INTERVENTIONS One- and 2-page versions of an objective structured assessment of technical skills (OSATS) assessment instrument, composed of global and task-specific surgical items, were used to evaluate surgical performance. RESULTS Fifty-two evaluations were completed by 17 attending evaluators. The instrument agreement for the 2-page assessment was 71.4% when measured as a binary variable (ie, competent vs not competent) (κ = 0.38; P = .08). Evaluation as a continuous variable revealed 42.9% agreement (κ = 0.18; P = .14). The intraclass correlation was 0.53, considered substantial/good interrater reliability (69% reliable). For the 1-page instrument, agreement was 77.4% when measured as a binary variable (κ = 0.53, P = .0015). Agreement when evaluated as a continuous measure was 71.0% (κ = 0.54, P < .001). The intraclass correlation was 0.73, considered high interrater reliability (85% reliable). CONCLUSIONS The OSATS assessment instrument is an effective tool for evaluating surgical performance among trainees, with acceptable interrater reliability in a simulator setting. Reliability was good for both the 1- and 2-page OSATS checklists, and both serve as excellent tools to provide immediate formative feedback on operational competency.
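
    The agreement statistics reported here (percent agreement and Cohen's kappa) follow standard definitions; the sketch below computes both for two raters using made-up competent/not-competent ratings.

      # Sketch of the reported agreement statistics for two raters scoring the same
      # performances (illustrative ratings; not the study data).
      from collections import Counter

      def percent_agreement(r1, r2):
          return sum(a == b for a, b in zip(r1, r2)) / len(r1)

      def cohens_kappa(r1, r2):
          n = len(r1)
          p_o = percent_agreement(r1, r2)
          c1, c2 = Counter(r1), Counter(r2)
          categories = set(r1) | set(r2)
          p_e = sum((c1[c] / n) * (c2[c] / n) for c in categories)   # chance agreement
          return (p_o - p_e) / (1 - p_e)

      rater_a = ["competent", "competent", "not", "competent", "not", "competent"]
      rater_b = ["competent", "not",       "not", "competent", "not", "competent"]
      print(percent_agreement(rater_a, rater_b), round(cohens_kappa(rater_a, rater_b), 2))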

  1. A quality assurance device for measuring afterloader performance and transit dose for nasobiliary high-dose-rate brachytherapy.

    PubMed

    Deufel, Christopher L; Mullins, John P; Zakhary, Mark J

    2018-05-17

    Nasobiliary high-dose-rate (HDR) brachytherapy has emerged as an effective tool to boost the radiation dose for patients with unresectable perihilar cholangiocarcinoma. This work describes a quality assurance (QA) tool for measuring the HDR afterloader's performance, including the transit dose, when the source wire travels through a tortuous nasobiliary catheter path. The nasobiliary QA device was designed to mimic the anatomical path of a nasobiliary catheter, including the nasal, stomach, duodenum, and bile duct loops. Two of these loops, the duodenum and bile duct loops, have adjustable radii of curvature, resulting in the ability to maximize stress on the source wire in transit. The device was used to measure the performance over time of the HDR afterloader and the differences between intraluminal catheter lots. An upper limit on the transit dose was also measured using radiochromic film and compared with a simple theoretical model. The QA device was capable of detecting performance variations among nasobiliary catheter lots and following radioactive source replacement. The transit dose from a nasobiliary treatment increased by up to one order of magnitude when the source wire encountered higher than normal friction. Three distinct travel speeds of the source wire were observed: 5.2, 17.4, and 54.7 cm/s. The maximum transit dose was 0.3 Gy at a radial distance of 5 mm from a 40.3 kU 192Ir source. The source wire encounters substantially greater friction when it navigates through the nasobiliary brachytherapy catheter. A QA tool that mimics the nasal, stomach, duodenum, and bile duct loops may be used to evaluate the transit dose and the afterloader's performance over time.

  2. Advancements in Large-Scale Data/Metadata Management for Scientific Data.

    NASA Astrophysics Data System (ADS)

    Guntupally, K.; Devarakonda, R.; Palanisamy, G.; Frame, M. T.

    2017-12-01

    Scientific data often come with complex and diverse metadata which are critical for data discovery and users. The Online Metadata Editor (OME) tool, which was developed by an Oak Ridge National Laboratory team, effectively manages diverse scientific datasets across several federal data centers, such as DOE's Atmospheric Radiation Measurement (ARM) Data Center and USGS's Core Science Analytics, Synthesis, and Libraries (CSAS&L) project. This presentation will focus mainly on recent developments and future strategies for refining the OME tool within these centers. The ARM OME is a standards-based tool (https://www.archive.arm.gov/armome) that allows scientists to create and maintain metadata about their data products. The tool has been improved with new workflows that help metadata coordinators and submitting investigators submit and review their data more efficiently. The ARM Data Center's newly upgraded Data Discovery Tool (http://www.archive.arm.gov/discovery) uses rich metadata generated by the OME to enable search and discovery of thousands of datasets, while also providing a citation generator and modern order-delivery techniques like Globus (using GridFTP), Dropbox and THREDDS. The Data Discovery Tool also supports incremental indexing, which allows users to find new data as and when they are added. The USGS CSAS&L search catalog employs a custom version of the OME (https://www1.usgs.gov/csas/ome), which has been upgraded with high-level Federal Geographic Data Committee (FGDC) validations and the ability to reserve and mint Digital Object Identifiers (DOIs). The USGS's Science Data Catalog (SDC) (https://data.usgs.gov/datacatalog) allows users to discover a myriad of science data holdings through a web portal. Recent major upgrades to the SDC and ARM Data Discovery Tool include improved harvesting performance and migration to new search software, such as Apache Solr 6.0, for serving up data/metadata to scientific communities. Our presentation will highlight future enhancements of these tools that enable users to retrieve search results quickly, along with parallelizing the retrieval process from online and high-performance storage systems. In addition, these improvements will support additional metadata formats like the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) bundle data.
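
    As a rough illustration of the indexing/serving step such discovery tools rely on, the sketch below pushes a few metadata records into an Apache Solr core with the pysolr client and runs a keyword query. The URL, core name, and record fields are assumptions for illustration; this is not the ARM or USGS production code.

      # Hypothetical sketch: incremental indexing of metadata records into Solr
      # and a simple search, using the pysolr client.
      import pysolr

      solr = pysolr.Solr("http://localhost:8983/solr/metadata")   # assumed URL and core

      records = [
          {"id": "doi:10.0000/example-1", "title": "Synthetic LES bundle", "keywords": ["LASSO", "ARM"]},
          {"id": "doi:10.0000/example-2", "title": "Synthetic radar dataset", "keywords": ["ARM"]},
      ]
      solr.add(records)        # incremental indexing: only new/changed records are sent
      solr.commit()

      for hit in solr.search("keywords:LASSO", rows=10):
          print(hit["id"], hit["title"])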

  3. Toward High-Performance Communications Interfaces for Science Problem Solving

    NASA Astrophysics Data System (ADS)

    Oviatt, Sharon L.; Cohen, Adrienne O.

    2010-12-01

    From a theoretical viewpoint, educational interfaces that facilitate communicative actions involving representations central to a domain can maximize students' effort associated with constructing new schemas. In addition, interfaces that minimize working memory demands due to the interface per se, for example by mimicking existing non-digital work practice, can preserve students' attentional focus on their learning task. In this research, we asked the question: what type of interface input capabilities provides the best support for science problem solving in both low- and high-performing students? High school students' ability to solve a diverse range of biology problems was compared over longitudinal sessions while they used: (1) hardcopy paper and pencil, (2) a digital paper and pen interface, (3) a pen tablet interface, and (4) a graphical tablet interface. Post-test evaluations revealed that time to solve problems, meta-cognitive control, solution correctness, and memory all were significantly enhanced when using the digital pen and paper interface, compared with the tablet interfaces. The tangible pen and paper interface also was the only alternative that significantly facilitated skill acquisition in low-performing students. Paradoxically, all students nonetheless believed that the tablet interfaces provided the best support for their performance, revealing a lack of self-awareness about how to use computational tools to best advantage. Implications are discussed for how pen interfaces can be optimized for future educational purposes, and for establishing technology fluency curricula to improve students' awareness of the impact of digital tools on their performance.

  4. Airport Noise Tech Challenge Overview

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2011-01-01

    The Supersonics Project, operating under the NASA Aeronautics Mission Directorate's Fundamental Aero Program, has been organized around the Technical Challenges that have historically precluded commercial supersonic flight. One of these Challenges is making aircraft that are capable of such high aerodynamic performance quiet enough around airports that they will not be objectionable. It is recognized that a successful civilian supersonic aircraft will be a system where many new technologies come together, and for this to happen not only will new low-noise propulsion concepts be required, but also new engineering tools that predict the noise of the aircraft as these technologies are combined and compromised with the rest of the aircraft design. These are the two main objectives of the Airport Noise Tech Challenge. As a Project in the Fundamental Aero Program, we work at a relatively low level of technology readiness. However, we have high-level milestones which force us to integrate our efforts to impact systems-level activities. To keep the low-level work tied to delivering engineering tools and low-noise concepts, we have structured our milestones around development of the concepts and organized our activities around developing and applying our engineering tools to these concepts. The final deliverables in these milestones are noise prediction modules validated against the best embodiment of each concept. These will then be used in cross-disciplinary exercises to demonstrate the viability of aircraft designs to meet all the Technical Challenges. Some of the concepts being developed are: Fan Flow Diverters, Multi-jet Shielding, High-Aspect Ratio Embedded Nozzles, Plasma Actuated Instability Manipulation, Highly Variable Cycle Mixer-Ejectors, and Inverted Velocity Profiles. These concepts are being developed for reduced jet noise along with the design tools which describe how they perform when used in various aircraft configurations. Several key upcoming events are highlighted, including tests of the Highly Variable Cycle Mixer-Ejectors and Inverted Velocity Profiles. Other key events are milestones to be delivered within the next calendar year.

  5. Hierarchical Testing with Automated Document Generation for Amanzi, ASCEM's Subsurface Flow and Reactive Transport Simulator

    NASA Astrophysics Data System (ADS)

    Moulton, J. D.; Steefel, C. I.; Yabusaki, S.; Castleton, K.; Scheibe, T. D.; Keating, E. H.; Freedman, V. L.

    2013-12-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments use a graded and iterative approach, beginning with simplified, highly abstracted models and adding geometric and geologic complexity as understanding is gained. To build confidence in this assessment capability, extensive testing of the underlying tools is needed. Since the tools themselves, such as the subsurface flow and reactive-transport simulator Amanzi, are under active development, testing must be both hierarchical and highly automated. In this presentation we show how we have met these requirements by leveraging the Python-based open-source documentation system Sphinx together with several other open-source tools. Sphinx builds on the reStructuredText tool docutils, with important extensions that include high-quality formatting of equations and integrated plotting through matplotlib. This allows the documentation, as well as the input files for tests, benchmark and tutorial problems, to be maintained with the source code under a version control system. In addition, it enables developers to build documentation in several different formats (e.g., html and pdf) from a single source. We will highlight these features and discuss important benefits of this approach for Amanzi. In addition, we'll show that some of ASCEM's other tools, such as the sampling provided by the Uncertainty Quantification toolset, are naturally leveraged to enable more comprehensive testing. Finally, we will highlight the integration of this hierarchical testing and documentation framework with our build system and tools (CMake, CTest, and CDash).
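
    To make the Sphinx-based setup concrete, the snippet below sketches a minimal conf.py of the kind described (equations plus build-time matplotlib plots, HTML and PDF from one source tree). The project name, file names, and theme are assumptions for illustration, not Amanzi's actual configuration.

      # Illustrative Sphinx configuration (conf.py); not Amanzi's actual file.
      project = "Amanzi documentation (example)"
      extensions = [
          "sphinx.ext.mathjax",                    # high-quality equation rendering in HTML
          "matplotlib.sphinxext.plot_directive",   # embed plots generated at build time
      ]
      source_suffix = ".rst"
      master_doc = "index"
      html_theme = "alabaster"
      latex_documents = [  # enables `make latexpdf` alongside `make html`
          (master_doc, "amanzi_example.tex", "Amanzi Example Docs", "ASCEM", "manual"),
      ]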

  6. Sub-Second Parallel State Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Rice, Mark J.; Glaesemann, Kurt R.

    This report describes the performance of the Pacific Northwest National Laboratory (PNNL) sub-second parallel state estimation (PSE) tool using utility data from the Bonneville Power Administration (BPA) and discusses the benefits of the fast computational speed for power system applications. The test data were provided by BPA. They are two days' worth of hourly snapshots that include power system data and measurement sets in a commercial tool format. These data are extracted from the commercial tool and fed into the PSE tool. With the help of advanced solvers, the PSE tool is able to solve each BPA hourly state estimation problem within one second, which is more than 10 times faster than today's commercial tool. This improved computational performance can help increase the reliability value of state estimation in many aspects: (1) the shorter the time required for execution of state estimation, the more time remains for operators to take appropriate actions and/or to apply automatic or manual corrective control actions, which increases the chances of arresting or mitigating the impact of cascading failures; (2) the SE can be executed multiple times within the time allowance, so the robustness of SE can be enhanced by repeating the execution of the SE with adaptive adjustments, including removing bad data and/or adjusting different initial conditions to compute a better estimate within the same time as a traditional state estimator's single estimate. There are other benefits of the sub-second SE, such as that the PSE results can potentially be used in local and/or wide-area automatic corrective control actions that are currently dependent on raw measurements, to minimize the impact of bad measurements, and it provides opportunities to enhance power grid reliability and efficiency. PSE can also enable other advanced tools that rely on SE outputs and could be used to further improve operators' actions and automated controls to mitigate the effects of severe events on the grid. The power grid continues to grow, and the number of measurements is increasing at an accelerated rate due to the variety of smart grid devices being introduced. A parallel state estimation implementation will have better performance than traditional, sequential state estimation by utilizing the power of high performance computing (HPC). This increased performance positions parallel state estimators as valuable tools for operating the increasingly more complex power grid.
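
    For context, the core computation a state estimator must repeat each cycle is a weighted-least-squares fit of the system state to the measurement set. The sketch below shows the textbook Gauss-Newton form with a toy linear example; it is a generic illustration, not PNNL's parallel implementation.

      # Generic weighted-least-squares state estimation iteration (textbook form):
      # minimize (z - h(x))' R^-1 (z - h(x)) by Gauss-Newton.
      import numpy as np

      def wls_state_estimate(z, h, H, R_inv, x0, tol=1e-6, max_iter=20):
          """z: measurements, h(x): measurement model, H(x): Jacobian, R_inv: weights."""
          x = x0.copy()
          for _ in range(max_iter):
              r = z - h(x)                         # residuals
              Hx = H(x)
              G = Hx.T @ R_inv @ Hx                # gain matrix
              dx = np.linalg.solve(G, Hx.T @ R_inv @ r)
              x += dx
              if np.max(np.abs(dx)) < tol:
                  break
          return x

      # Toy example: estimate two states from three noisy linear measurements
      A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
      x_true = np.array([1.0, 0.5])
      z = A @ x_true + np.array([0.01, -0.02, 0.005])
      est = wls_state_estimate(z, lambda x: A @ x, lambda x: A, np.eye(3), np.zeros(2))
      print(est)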

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lherbier, Louis, W.; Novotnak, David, J.; Herling, Darrell, R.

    Hot forming processes such as forging, die casting and glass forming require tooling that is subjected to high temperatures during the manufacturing of components. Current tooling is adversely affected by prolonged exposure at high temperatures. Initial studies were conducted to determine the root cause of tool failures in a number of applications. Results show that tool failures vary and depend on the operating environment under which the tools are used. Major root-cause failures include (1) thermal softening, (2) fatigue and (3) tool erosion, all of which are affected by process boundary conditions such as lubrication, cooling, process speed, etc. While thermal management is a key to addressing tooling failures, it was clear that new tooling materials with superior high temperature strength could provide improved manufacturing efficiencies. These efficiencies are based on the use of functionally graded materials (FGM), a new subset of hybrid tools with customizable properties that can be fabricated using advanced powder metallurgy manufacturing technologies. Modeling studies of the various hot forming processes helped identify the effect of key variables such as stress, temperature and cooling rate, and aided in the selection of tooling materials for specific applications. To address the problem of high temperature strength, several advanced powder metallurgy nickel and cobalt based alloys were selected for evaluation. These materials were manufactured into tooling using two relatively new consolidation processes. One process involved laser powder deposition (LPD) and the second involved a solid state dynamic powder consolidation (SSDPC) process. These processes made possible functionally graded materials that resulted in shaped tooling that was monolithic, bi-metallic or substrate coated. Manufacturing of tooling with these processes was determined to be robust and consistent for a variety of materials. Prototype and production testing of FGM tooling showed the benefits of the nickel and cobalt based powder metallurgy alloys in a number of the applications evaluated. Improvements in tool life ranged from three to twenty or more times that of currently used tooling. Improvements were most dramatic where tool softening and deformation were the major cause of tool failures in hot/warm forging applications. Significant improvement was also noted in the erosion of aluminum die casting tooling. Cost and energy savings can be realized as a result of increased tooling life, increased productivity and a reduction in scrap because of improved dimensional control. Although LPD and SSDPC tooling usually have higher acquisition costs, net tooling costs per component produced drop dramatically with superior tool performance. Less energy is used to manufacture the tooling because fewer tools are required and less recycling of used tools is needed for the hot forming process. Energy is saved during the component manufacturing cycle because more parts can be produced in shorter periods of time. Energy is also saved by minimizing heating furnace idling time because of less downtime for tooling changes.

  8. Scalable Node Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drotar, Alexander P.; Quinn, Erin E.; Sutherland, Landon D.

    2012-07-30

    The project description is: (1) build a high performance computer; and (2) create a tool to monitor node applications in the Component Based Tool Framework (CBTF) using code from the Lightweight Data Metric Service (LDMS). The importance of this project is that: (1) there is a need for a scalable, parallel tool to monitor nodes on clusters; and (2) new LDMS plugins need to be easily added to the tool. CBTF stands for Component Based Tool Framework. It is scalable, adjusts to different topologies automatically, and uses the MRNet (Multicast/Reduction Network) mechanism for information transport. CBTF is flexible and general enough to be used for any tool that needs to perform a task on many nodes, and its components are reusable and easily added to a new tool. There are three levels of CBTF: (1) the frontend node, which interacts with users; (2) filter nodes, which filter or concatenate information from backend nodes; and (3) backend nodes, where the actual work of the tool is done. LDMS stands for Lightweight Data Metric Service; it is a tool used for monitoring nodes. Ltool is the name of the tool we derived from LDMS. It is dynamically linked and includes components such as Vmstat, Meminfo, Procinterrupts and more. It works as follows: the Ltool command is run on the frontend node and collects information from the backend nodes; the backend nodes send their information to the filter nodes; and the filter nodes concatenate it and send it to a database on the frontend node. Ltool is a useful tool for monitoring nodes on a cluster because the overhead involved in running it is not particularly high and it automatically scales to any size cluster.
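
    As an illustration only (this is not CBTF or LDMS code, and the function and field names below are invented for the example), the three-level flow described above can be pictured as backend collectors feeding filter-level concatenation, which in turn feeds a frontend store:

        # Conceptual illustration of the frontend/filter/backend flow described above;
        # not the CBTF or LDMS API -- names and metrics are placeholders.
        def backend_collect(node):
            # stand-in for plugins such as Vmstat or Meminfo on a backend node
            return {"node": node, "mem_free_kb": 123456, "load_1min": 0.42}

        def filter_concatenate(records):
            # filter nodes concatenate (or reduce) the records they receive
            return list(records)

        def frontend_run(nodes, fanout=2):
            database = []                            # stand-in for the frontend database
            for i in range(0, len(nodes), fanout):   # each filter node serves a group of backends
                group = nodes[i:i + fanout]
                database.extend(filter_concatenate(backend_collect(n) for n in group))
            return database

        print(frontend_run(["node%02d" % i for i in range(8)]))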

  9. Design of a Cognitive Tool to Enhance Problem-Solving Performance

    ERIC Educational Resources Information Center

    Lee, Youngmin; Nelson, David

    2005-01-01

    The design of a cognitive tool to support problem-solving performance through external representation of knowledge is described. The limitations of conventional knowledge maps are analyzed in proposing the tool. The design principles and specifications are described. This tool is expected to enhance learners' problem-solving performance by allowing…

  10. LLIMAS: Revolutionizing integrated modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  11. AORN Ergonomic Tool 5: Tissue Retraction in the Perioperative Setting.

    PubMed

    Spera, Patrice; Lloyd, John D; Hernandez, Edward; Hughes, Nancy; Petersen, Carol; Nelson, Audrey; Spratt, Deborah G

    2011-07-01

    Manual retraction, a task performed to expose the surgical site, poses a high risk for musculoskeletal disorders that affect the hands, arms, shoulders, neck, and back. In recent years, minimally invasive and laparoscopic procedures have led to the development of multifunctional instruments and retractors capable of performing these functions, which, in many cases, has eliminated the need for manual retraction. During surgical procedures that are not performed endoscopically, the use of self-retaining retractors enables the assistant to handle tissue and use exposure techniques that do not require prolonged manual retraction. Ergonomic Tool #5: Tissue Retraction in the Perioperative Setting provides an algorithm for perioperative care providers to determine when and under what circumstances manual retraction of tissue is safe and when the use of a self-retaining retractor should be considered. Published by Elsevier Inc.

  12. A multisource feedback tool to assess ward round leadership skills of senior paediatric trainees: (2) Testing reliability and practicability.

    PubMed

    Goodyear, Helen M; Lakshminarayana, Indumathy; Wall, David; Bindal, Taruna

    2015-05-01

    A five-domain multisource feedback (MSF) tool was previously developed in 2009-2010 by the authors to assess senior paediatric trainees' ward round leadership skills. The aims of this study were to determine whether this MSF tool is practicable and reliable, whether individuals' feedback varies over time, and what trainees think of the tool. The MSF tool was piloted (April-July 2011) and field tested (September 2011-February 2013) with senior paediatric trainees. A focus group held at the end of field testing obtained trainees' views of the tool. In field testing, 96/115 (84%) trainees returned 633 individual assessments from three different ward rounds over 18 months. The MSF tool had high reliability (Cronbach's α 0.84, G coefficient 0.8 for three raters). In all five domains, data were shifted to the right, with scores of 3 (good) and 4 (excellent). Consultants gave significantly lower scores (p<0.001), as did trainees for self-assessment (p<0.001). There was no significant change in MSF scores over 18 months, but comments showed that trainees' performance improved. Trainees valued these comments and the MSF tool but had concerns about the time taken for feedback and confusion about tool use and the paediatric assessment strategy. A five-domain MSF tool was found to be reliable on pilot and field testing, practicable to use and liked by trainees. Comments on performance were more helpful than scores in giving trainees feedback. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  13. Clinically relevant lessons from Family HealthLink: a cancer and coronary heart disease familial risk assessment tool.

    PubMed

    Sweet, Kevin; Sturm, Amy C; Rettig, Amy; McElroy, Joseph; Agnese, Doreen

    2015-06-01

    A descriptive retrospective study was performed using two separate user cohorts to determine the effectiveness of Family HealthLink as a clinical triage tool. Cohort 1 consisted of 2,502 users who accessed the public website. Cohort 2 consisted of 194 new patients in a Comprehensive Breast Center setting. For patient users, we assessed documentation of family history and genetics referral. For all users seen in a genetics clinic, the Family HealthLink assessment was compared with that performed by genetic counselors and with genetic testing outcomes. For general public users, the percentages meeting high-risk criteria were: for cancer only, 22.2%; for coronary heart disease only, 24.3%; and for both diseases, 10.4%. These risk stratification percentages were similar for the patient users. For the patient users, there often was documentation of family history of certain cancer types by oncology professionals, but age of onset and coronary heart disease family history were less complete. Of 142 users with high-risk assignments seen in a genetics clinic, 130 (91.5%) of these assignments were corroborated. Forty-two underwent genetic testing and 17 (40.5%) had new molecular diagnoses established. A significant percentage of individuals are at high familial risk and may require more intensive screening and referral. Interactive family history triage tools can aid this process. Genet Med 17(6), 493-500.

  14. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing

    PubMed Central

    2014-01-01

    Background RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. Results We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. Conclusions eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory. PMID:24593312

  15. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    PubMed

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for performing parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  16. A Computable Definition of Sepsis Facilitates Screening and Performance Improvement Tracking.

    PubMed

    Alessi, Lauren J; Warmus, Holly R; Schaffner, Erin K; Kantawala, Sajel; Carcillo, Joseph; Rosen, Johanna; Horvat, Christopher M

    2018-03-01

    Sepsis kills almost 5,000 children annually, accounting for 16% of pediatric health care spending in the United States. We sought to identify sepsis within the Electronic Health Record (EHR) of a quaternary children's hospital to characterize disease incidence, improve recognition and response, and track performance metrics. Methods are organized in a plan-do-study-act cycle. During the "plan" phase, electronic definitions of sepsis (blood culture and antibiotic within 24 hours) and septic shock (sepsis plus a vasoactive medication) were created to establish benchmark data and track progress with statistical process control. The performance of a screening tool was evaluated in the emergency department. During the "do" phase, a novel inpatient workflow is being piloted, which involves regular sepsis screening by nurses using the tool and a regimented response to high-risk patients. Screening tool use in the emergency department reduced time to antibiotics (Fig. 1). Of the 6,159 admissions between July and December 2016, the EHR definitions identified 1,433 (23.3%) with sepsis, of which 159 (11.1%) had septic shock. Hospital mortality was 2.2% for all sepsis patients and 15.7% for septic shock (Table 1). These findings approximate epidemiologic studies of sepsis and severe sepsis, which report a prevalence range of 0.45-8.2% and a mortality range of 8.2-25% (Table 2) [1-5]. Implementation of a sepsis screening tool is associated with improved performance. The prevalence of sepsis conditions identified with electronic definitions approximates the epidemiologic landscape characterized by other point-prevalence and administrative studies, providing face validity to this approach and proving useful for tracking performance improvement.
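
    The electronic definitions above are simple enough to express directly against encounter data. The sketch below is a hypothetical illustration of such a computable definition; the column names (culture_time, antibiotic_time, vasoactive) are invented for the example and do not come from the study's actual EHR schema.

        # Hypothetical encounter-level flags for the electronic definitions quoted above:
        # sepsis = blood culture and antibiotic within 24 hours; septic shock = sepsis
        # plus a vasoactive medication. Column names are illustrative only.
        import pandas as pd

        def flag_sepsis(encounters: pd.DataFrame) -> pd.DataFrame:
            within_24h = (encounters["antibiotic_time"]
                          - encounters["culture_time"]).abs() <= pd.Timedelta("24h")
            encounters["sepsis"] = (encounters["culture_time"].notna()
                                    & encounters["antibiotic_time"].notna()
                                    & within_24h)
            encounters["septic_shock"] = encounters["sepsis"] & encounters["vasoactive"]
            return encounters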

  17. Performance profiling for brachytherapy applications

    NASA Astrophysics Data System (ADS)

    Choi, Wonqook; Cho, Kihyeon; Yeo, Insung

    2018-05-01

    In many physics applications, a significant amount of software (e.g. R, ROOT and Geant4) is developed on novel computing architectures, and much effort is expended to ensure the software is efficient in terms of central processing unit (CPU) time and memory usage. Profiling tools are used to evaluate this efficiency; however, few such tools are able to accommodate the low-energy physics regime. To address this limitation, we developed a low-energy physics profiling system in Geant4 to profile the CPU time and memory usage of brachytherapy software applications. This paper describes and evaluates specific models that are applied to brachytherapy applications in Geant4, such as QGSP_BIC_LIV, QGSP_BIC_EMZ, and QGSP_BIC_EMY. The physics range covered by this tool allows it to generate low-energy profiles for brachytherapy applications. This was a limitation of previous studies, which led us to develop a new profiling tool that supports profiling in the MeV range, in contrast to the TeV range supported by existing high-energy profiling tools. To make it easy to compare profiling results between low-energy and high-energy modes, we employed the same software architecture as the SimpliCarlo tool developed at the Fermi National Accelerator Laboratory (FNAL) for the Large Hadron Collider (LHC). The results show that the newly developed profiling system for low-energy physics (less than MeV) complements the current profiling system used for high-energy physics (greater than TeV) applications.

  18. Tools for assessing fall risk in the elderly: a systematic review and meta-analysis.

    PubMed

    Park, Seong-Hi

    2018-01-01

    The prevention of falls among the elderly is arguably one of the most important public health issues in today's aging society. The aim of this study was to assess which tools best predict the risk of falls in the elderly. Electronic searches were performed using Medline, EMBASE, the Cochrane Library, CINAHL, etc., using the following keywords: "fall risk assessment", "elderly fall screening", and "elderly mobility scale". The QUADAS-2 was applied to assess the internal validity of the diagnostic studies. Selected studies were meta-analyzed with MetaDisc 1.4. A total of 33 studies were eligible out of the 2,321 studies retrieved from selected databases. Twenty-six assessment tools for fall risk were used in the selected articles, and they tended to vary based on the setting. The fall risk assessment tools currently used for the elderly did not show sufficiently high predictive validity for differentiating high and low fall risks. The Berg Balance scale and Mobility Interaction Fall chart showed stable and high specificity, while the Downton Fall Risk Index, Hendrich II Fall Risk Model, St. Thomas's Risk Assessment Tool in Falling elderly inpatients, Timed Up and Go test, and Tinetti Balance scale showed the opposite results. We concluded that rather than a single measure, two assessment tools used together would better evaluate the characteristics of falls by the elderly that can occur due to a multitude of factors and maximize the advantages of each for predicting the occurrence of falls.

  19. Are normative sonographic values of kidney size in children valid and reliable? A systematic review of the methodological quality of ultrasound studies using the Anatomical Quality Assessment (AQUA) tool.

    PubMed

    Chhapola, Viswas; Tiwari, Soumya; Deepthi, Bobbity; Henry, Brandon Michael; Brar, Rekha; Kanwal, Sandeep Kumar

    2018-06-01

    A plethora of research is available on ultrasonographic kidney size standards. We performed a systematic review of the methodological quality of ultrasound studies aimed at developing normative renal parameters in healthy children, by evaluating the risk of bias (ROB) using the 'Anatomical Quality Assessment (AQUA)' tool. We searched Medline, Scopus, CINAHL, and Google Scholar on June 4, 2018, and observational studies measuring kidney size by ultrasonography in healthy children (0-18 years) were included. The ROB of each study was evaluated in five domains using a 20-item coding scheme based on the AQUA tool framework. Fifty-four studies were included. Domain 1 (subject characteristics) had a high ROB in 63% of studies due to the unclear description of age, sex, and ethnicity. Performance in Domain 2 (study design) was the best, with 85% of studies having a prospective design. Methodological characterization (Domain 3) was poor across the studies (<10% compliance), with suboptimal performance in the description of patient positioning, operator experience, and assessment of intra/inter-observer reliability. About three-fourths of the studies had a low ROB in Domain 4 (descriptive anatomy). Domain 5 (reporting of results) had a high ROB in approximately half of the studies, with the majority reporting results only as measures of central tendency. Significant deficiencies and heterogeneity were observed in the methodological quality of the ultrasound studies performed to date for measurement of kidney size in children. We hereby provide a framework for conducting such studies in the future. PROSPERO (CRD42017071601).

  20. MiRduplexSVM: A High-Performing MiRNA-Duplex Prediction and Evaluation Methodology

    PubMed Central

    Karathanasis, Nestoras; Tsamardinos, Ioannis; Poirazi, Panayiota

    2015-01-01

    We address the problem of predicting the position of a miRNA duplex on a microRNA hairpin via the development and application of a novel SVM-based methodology. Our method combines a unique problem representation and an unbiased optimization protocol to learn from miRBase 19.0 an accurate predictive model, termed MiRduplexSVM. This is the first model that provides precise information about all four ends of the miRNA duplex. We show that (a) our method outperforms four state-of-the-art tools, namely MaturePred, MiRPara, MatureBayes, and MiRdup, as well as a Simple Geometric Locator, when applied on the same training datasets employed for each tool and evaluated on a common blind test set; (b) in all comparisons, MiRduplexSVM shows superior performance, achieving up to a 60% increase in prediction accuracy for mammalian hairpins, and can generalize very well on plant hairpins without any special optimization; (c) the tool has a number of important applications, such as the ability to accurately predict the miRNA or the miRNA*, given the opposite strand of a duplex. Its performance on this task is superior to the 2-nt overhang rule commonly used in computational studies and similar to that of a comparative genomic approach, without the need for prior knowledge or the complexity of performing multiple alignments. Finally, it is able to evaluate novel, potential miRNAs found either computationally or experimentally. In relation to recent confidence evaluation methods used in miRBase, MiRduplexSVM was successful in identifying high-confidence potential miRNAs. PMID:25961860

  1. Relationship between medication event rates and the Leapfrog computerized physician order entry evaluation tool.

    PubMed

    Leung, Alexander A; Keohane, Carol; Lipsitz, Stuart; Zimlichman, Eyal; Amato, Mary; Simon, Steven R; Coffey, Michael; Kaufman, Nathan; Cadet, Bismarck; Schiff, Gordon; Seger, Diane L; Bates, David W

    2013-06-01

    The Leapfrog CPOE evaluation tool has been promoted as a means of monitoring computerized physician order entry (CPOE). We sought to determine the relationship between Leapfrog scores and the rates of preventable adverse drug events (ADE) and potential ADE. A cross-sectional study of 1000 adult admissions in five community hospitals from October 1, 2008 to September 30, 2010 was performed. Observed rates of preventable ADE and potential ADE were compared with scores reported by the Leapfrog CPOE evaluation tool. The primary outcome was the rate of preventable ADE and the secondary outcome was the composite rate of preventable ADE and potential ADE. Leapfrog performance scores were highly related to the primary outcome. A 43% relative reduction in the rate of preventable ADE was predicted for every 5% increase in Leapfrog scores (rate ratio 0.57; 95% CI 0.37 to 0.88). In absolute terms, four fewer preventable ADE per 100 admissions were predicted for every 5% increase in overall Leapfrog scores (rate difference -4.2; 95% CI -7.4 to -1.1). A statistically significant relationship between Leapfrog scores and the secondary outcome, however, was not detected. Our findings support the use of the Leapfrog tool as a means of evaluating and monitoring CPOE performance after implementation, as addressed by current certification standards. Scores from the Leapfrog CPOE evaluation tool closely relate to actual rates of preventable ADE. Leapfrog testing may alert providers to potential vulnerabilities and highlight areas for further improvement.

  2. Comparing the effectiveness of TWEAK and T-ACE in determining problem drinkers in pregnancy.

    PubMed

    Sarkar, M; Einarson, T; Koren, G

    2010-01-01

    The TWEAK and T-ACE screening tools are validated methods of identifying problem drinking in a pregnant population. The objective of this study was to compare the effectiveness of the TWEAK and T-ACE screening tools in identifying problem drinking using traditional cut-points (CP). Study participants consisted of women calling the Motherisk Alcohol Helpline for information regarding their alcohol use in pregnancy. In this cohort, concerns surrounding underreporting are unlikely, as the women self-report their alcohol consumption. Each participant's self-identification, confirmed by her amount of alcohol use, determined whether she was classified as a problem drinker. The TWEAK and T-ACE tools were administered to both groups and subsequent analysis was done to determine whether one tool was more effective in predicting problem drinking. The study consisted of 75 problem and 100 non-problem drinkers. Using traditional CP, the TWEAK and T-ACE tools performed similarly at identifying potential at-risk women (positive predictive value = 0.54), with very high sensitivity rates (100-99% and 100-93%, respectively) but poor specificity rates (36-43% and 19-34%, respectively). Upon comparison, there was no statistically significant difference in effectiveness between the two tests using either a CP of 2 (P = 0.66) or a CP of 3 (P = 0.38). Despite the lack of difference in performance, the improved specificity associated with TWEAK suggests that it may be better suited to screening at-risk populations seeking advice from a helpline.
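
    As a worked illustration of how the reported screening statistics fit together, the counts below are inferred from the cohort sizes and the TWEAK figures at the traditional cut-point (sensitivity of about 100%, specificity of about 36%); they are reconstructed for the example rather than taken from a table in the paper.

        # Counts inferred from the reported cohort (75 problem, 100 non-problem drinkers)
        # and the TWEAK rates at the traditional cut-point, for illustration only.
        tp, fn = 75, 0           # problem drinkers flagged / missed  (sensitivity ~100%)
        tn, fp = 36, 64          # non-problem drinkers negative / flagged (specificity ~36%)

        sensitivity = tp / (tp + fn)     # 1.00
        specificity = tn / (tn + fp)     # 0.36
        ppv = tp / (tp + fp)             # ~0.54, matching the reported positive predictive value
        print(round(sensitivity, 2), round(specificity, 2), round(ppv, 2))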

  3. Integrative Genomics Viewer (IGV) | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    The Integrative Genomics Viewer (IGV) is a high-performance visualization tool for interactive exploration of large, integrated genomic datasets. It supports a wide variety of data types, including array-based and next-generation sequence data, and genomic annotations.

  4. Taverna: a tool for building and running workflows of services

    PubMed Central

    Hull, Duncan; Wolstencroft, Katy; Stevens, Robert; Goble, Carole; Pocock, Mathew R.; Li, Peter; Oinn, Tom

    2006-01-01

    Taverna is an application that eases the use and integration of the growing number of molecular biology tools and databases available on the web, especially web services. It allows bioinformaticians to construct workflows or pipelines of services to perform a range of different analyses, such as sequence analysis and genome annotation. These high-level workflows can integrate many different resources into a single analysis. Taverna is freely available under the terms of the GNU Lesser General Public License (LGPL). PMID:16845108

  5. EPSAT - A workbench for designing high-power systems for the space environment

    NASA Technical Reports Server (NTRS)

    Kuharski, R. A.; Jongeward, G. A.; Wilcox, K. G.; Kennedy, E. M.; Stevens, N. J.; Putnam, R. M.; Roche, J. C.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining the performance of power systems in both naturally occurring and self-induced environments. This paper presents the results of the project after two years of a three-year development program. The relevance of the project results for SDI is pointed out, and models of the interaction between the environment and power systems are discussed.

  6. High power laser downhole cutting tools and systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zediker, Mark S; Rinzler, Charles C; Faircloth, Brian O

    Downhole cutting systems, devices and methods for utilizing 10 kW or more of laser energy transmitted deep into the earth with the suppression of associated nonlinear phenomena. Systems and devices for laser cutting operations within a borehole in the earth. These systems and devices can deliver high power laser energy down a deep borehole while maintaining the high power needed to perform cutting operations in such boreholes deep within the earth.

  7. RAMICS: trainable, high-speed and biologically relevant alignment of high-throughput sequencing reads to coding DNA

    PubMed Central

    Wright, Imogen A.; Travers, Simon A.

    2014-01-01

    The challenge presented by high-throughput sequencing necessitates the development of novel tools for accurate alignment of reads to reference sequences. Current approaches focus on using heuristics to map reads quickly to large genomes, rather than generating highly accurate alignments in coding regions. Such approaches are, thus, unsuited for applications such as amplicon-based analysis and the realignment phase of exome sequencing and RNA-seq, where accurate and biologically relevant alignment of coding regions is critical. To facilitate such analyses, we have developed a novel tool, RAMICS, that is tailored to mapping large numbers of sequence reads to short lengths (<10 000 bp) of coding DNA. RAMICS utilizes profile hidden Markov models to discover the open reading frame of each sequence and aligns to the reference sequence in a biologically relevant manner, distinguishing between genuine codon-sized indels and frameshift mutations. This approach facilitates the generation of highly accurate alignments, accounting for the error biases of the sequencing machine used to generate reads, particularly at homopolymer regions. Performance improvements are gained through the use of graphics processing units, which increase the speed of mapping through parallelization. RAMICS substantially outperforms all other mapping approaches tested in terms of alignment quality while maintaining highly competitive speed performance. PMID:24861618

  8. Miniature Biometric Sensor Project

    NASA Technical Reports Server (NTRS)

    Falker, John; Terrier, Douglas; Clayton, Ronald; Hanson, Andrea; Cooper, Tommy; Downs, Meghan; Flint, Stephanie; Reyna, Baraquiel; Simon, Cory; Wilt, Grier

    2015-01-01

    Heart rate monitoring (HRM) is a critical need during exploration missions. Unlike the four separate systems used on ISS today, the single HRM system should perform as a diagnostic tool, perform well during exercise or high level activity, and be suitable for use during EVA. Currently available HRM technologies are dependent on uninterrupted contact with the skin and are prone to data drop-out and motion artifact when worn in the spacesuit or during exercise. Here, we seek an alternative to the chest strap and electrode based sensors currently in use on ISS today. This project aims to develop a single, high performance, robust biosensor with focused efforts on improved heart rate data quality collection during high intensity activity such as exercise or EVA.

  9. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  10. Methodologic Guide for Evaluating Clinical Performance and Effect of Artificial Intelligence Technology for Medical Diagnosis and Prediction.

    PubMed

    Park, Seong Ho; Han, Kyunghwa

    2018-03-01

    The use of artificial intelligence in medicine is currently an issue of great interest, especially with regard to the diagnostic or predictive analysis of medical images. Adoption of an artificial intelligence tool in clinical practice requires careful confirmation of its clinical utility. Herein, the authors explain key methodological points involved in a clinical evaluation of artificial intelligence technology for use in medicine, especially high-dimensional or overparameterized diagnostic or predictive models in which artificial deep neural networks are used, mainly from the standpoints of clinical epidemiology and biostatistics. First, statistical methods for assessing the discrimination and calibration performance of a diagnostic or predictive model are summarized. Next, the effects of disease manifestation spectrum and disease prevalence on the performance results are explained, followed by a discussion of the difference between evaluating performance with internal versus external datasets; the importance of using an adequate external dataset obtained from a well-defined clinical cohort to avoid overestimating clinical performance as a result of overfitting in high-dimensional or overparameterized classification models and spectrum bias; and the essentials for achieving a more robust clinical evaluation. Finally, the authors review the role of clinical trials and observational outcome studies for ultimate clinical verification of diagnostic or predictive artificial intelligence tools through patient outcomes, beyond performance metrics, and how to design such studies. © RSNA, 2018.
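
    As a minimal sketch of the two standard checks mentioned above, discrimination and calibration can be quantified with an area under the ROC curve and a calibration curve; the outcomes and model scores below are synthetic, generated only to make the snippet runnable.

        # Discrimination (ROC AUC) and calibration on synthetic outcomes and model scores.
        import numpy as np
        from sklearn.metrics import roc_auc_score
        from sklearn.calibration import calibration_curve

        rng = np.random.default_rng(0)
        y_true = rng.integers(0, 2, size=500)                        # synthetic outcomes
        y_prob = np.clip(0.3 * y_true + 0.7 * rng.random(500), 0, 1) # synthetic predicted risks

        auc = roc_auc_score(y_true, y_prob)                          # discrimination
        frac_pos, mean_pred = calibration_curve(y_true, y_prob, n_bins=10)  # calibration
        print(f"AUC = {auc:.2f}")
        print(np.c_[mean_pred, frac_pos])                            # predicted vs. observed risk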

  11. Plasma Doping—Enabling Technology for High Dose Logic and Memory Applications

    NASA Astrophysics Data System (ADS)

    Miller, T.; Godet, L.; Papasouliotis, G. D.; Singh, V.

    2008-11-01

    As logic and memory device dimensions shrink with each generation, there are more high dose implants at lower energies. Examples include dual poly gate (also referred to as counter-doped poly), elevated source drain and contact plug implants. Plasma Doping technology throughput and dopant profile benefits at these ultra high dose and lower energy conditions have been well established [1,2,3]. For the first time a production-worthy plasma doping implanter, the VIISta PLAD tool, has been developed with unique architecture suited for precise and repeatable dopant placement. Critical elements of the architecture include pulsed DC wafer bias, closed-loop dosimetry and a uniform low energy, high density plasma source. In this paper key performance metrics such as dose uniformity, dose repeatability and dopant profile control will be presented that demonstrate the production-worthiness of the VIISta PLAD tool for several high dose applications.

  12. Using the Leitz LMS 2000 for monitoring and improvement of an e-beam

    NASA Astrophysics Data System (ADS)

    Blaesing-Bangert, Carola; Roeth, Klaus-Dieter; Ogawa, Yoichi

    1994-11-01

    Kaizen--continuous improvement--is a philosophy practiced in Japan that is also becoming more and more important in Western companies. To implement this philosophy in the semiconductor industry, a high performance metrology tool is essential for determining the status of production quality periodically. An important prerequisite for statistical process control is the high stability of the metrology tool over several months or years; the tool-induced shift should be as small as possible. The pattern placement metrology tool Leitz LMS 2000 has been used in a major European mask house for several years now to qualify masks within the tightest specifications and to monitor the MEBES III and its cassettes. The mask shop's internal specification for the long-term repeatability of the pattern placement metrology tool is 19 nm instead of the 42 nm specified by the supplier of the tool. The process capability of the LMS 2000 over 18 months is represented by an average cpk value of 2.8 for orthogonality, 5.2 for x-scaling, and 3.0 for y-scaling. The process capability of the MEBES III and its cassettes has been improved in recent years. For instance, 100% of the masks produced with a process tolerance of +/- 200 nm are now within this limit.

  13. msBiodat analysis tool, big data analysis for high-throughput experiments.

    PubMed

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, which is presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering can be enriched by merging previous data with data obtained from public databases, resulting in an accurate list of proteins which meet the predetermined conditions. In this article we present the msBiodat Analysis Tool, a web-based application intended to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  14. Prediction of miRNA targets.

    PubMed

    Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis

    2015-01-01

    Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.

  15. Web-based tool for visualization of electric field distribution in deep-seated body structures and planning of electroporation-based treatments.

    PubMed

    Marčan, Marija; Pavliha, Denis; Kos, Bor; Forjanič, Tadeja; Miklavčič, Damijan

    2015-01-01

    Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and can best be ensured by patient-specific treatment planning. The basis of the treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The developed web-based tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. The tool enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and time consumption, from 1-2 days to a few hours. The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating expansion of electroporation-based treatments in the clinic and ensuring reliable treatment for the patients. The additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed.

  16. Web-based tool for visualization of electric field distribution in deep-seated body structures and planning of electroporation-based treatments

    PubMed Central

    2015-01-01

    Background Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and can best be ensured by patient-specific treatment planning. The basis of the treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. Methods In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The developed web-based tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. The tool enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Results Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and time consumption, from 1-2 days to a few hours. Conclusions The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating expansion of electroporation-based treatments in the clinic and ensuring reliable treatment for the patients. The additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed. PMID:26356007

  17. Can Accelerators Accelerate Learning?

    NASA Astrophysics Data System (ADS)

    Santos, A. C. F.; Fonseca, P.; Coelho, L. F. S.

    2009-03-01

    The 'Young Talented' education program developed by the Brazilian State Funding Agency (FAPERJ) [1] makes it possible for students from public high schools to perform activities in scientific laboratories. In the Atomic and Molecular Physics Laboratory at the Federal University of Rio de Janeiro (UFRJ), the students are confronted with modern research tools like the 1.7 MV ion accelerator. Being a user-friendly machine, the accelerator is easily managed by the students, who can perform simple hands-on activities, stimulating interest in physics and bringing them close to modern laboratory techniques.

  18. PetIGA-MF: A multi-field high-performance toolbox for structure-preserving B-splines spaces

    DOE PAGES

    Sarmiento, Adel; Cortes, Adriano; Garcia, Daniel; ...

    2016-10-07

    We describe the development of a high-performance solution framework for isogeometric discrete differential forms based on B-splines: PetIGA-MF. Built on top of PetIGA, PetIGA-MF is a general multi-field discretization tool. To test the capabilities of our implementation, we solve different viscous flow problems such as Darcy, Stokes, Brinkman, and Navier-Stokes equations. Several convergence benchmarks based on manufactured solutions are presented assuring optimal convergence rates of the approximations, showing the accuracy and robustness of our solver.
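
    The convergence benchmarks mentioned above rest on a standard calculation: on a sequence of refined meshes, the observed order of convergence is estimated from successive errors against the manufactured solution. The snippet below shows that calculation with illustrative numbers; the mesh sizes and errors are not taken from the PetIGA-MF benchmarks.

        # Observed order of convergence from errors on successively refined meshes,
        # as used in manufactured-solution benchmarks; all numbers are illustrative.
        import math

        h = [0.1, 0.05, 0.025]            # mesh sizes
        err = [2.1e-3, 5.3e-4, 1.3e-4]    # corresponding discretization errors

        for (h1, e1), (h2, e2) in zip(zip(h, err), zip(h[1:], err[1:])):
            rate = math.log(e1 / e2) / math.log(h1 / h2)
            print(f"h: {h1} -> {h2}, observed rate ~ {rate:.2f}")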

  19. Novel tool wear monitoring method in milling difficult-to-machine materials using cutting chip formation

    NASA Astrophysics Data System (ADS)

    Zhang, P. P.; Guo, Y.; Wang, B.

    2017-05-01

    The main problems in milling difficult-to-machine materials are the high cutting temperature and rapid tool wear. However, it is not possible to observe tool wear directly during machining. Tool wear and cutting chip formation are two of the most important indicators of machining efficiency and quality. The purpose of this paper is to develop a model relating tool wear to cutting chip formation (width of chip and radian of chip) for difficult-to-machine materials, so that tool wear can be monitored through chip formation. A milling experiment with three sets of cutting parameters was performed on a machining centre to obtain chip formation and tool wear data. The experimental results show that tool wear increases gradually as cutting proceeds, while the width and radian of the chip decrease. The model is developed by fitting the experimental data and applying formula transformations. Most of the tool wear values monitored via chip formation have errors of less than 10%, the smallest being 0.2%. Overall, the errors based on chip radian are smaller than those based on chip width. This provides a new way to monitor and detect tool wear through cutting chip formation when milling difficult-to-machine materials.
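
    The abstract does not reproduce the fitted formula, but the fitting step it describes can be illustrated with a generic least-squares fit of tool wear against chip width and chip radian. The measurement values and the linear model form below are placeholders chosen for the sketch, not the paper's data.

        # Generic least-squares fit of tool wear against chip width and chip radian;
        # the data and the linear model form are placeholders, not the paper's values.
        import numpy as np

        width  = np.array([1.90, 1.75, 1.62, 1.50, 1.41])   # chip width, mm (illustrative)
        radian = np.array([1.20, 1.05, 0.92, 0.81, 0.73])   # chip radian (illustrative)
        wear   = np.array([0.05, 0.09, 0.14, 0.18, 0.23])   # flank wear, mm (illustrative)

        A = np.column_stack([np.ones_like(width), width, radian])
        coef, *_ = np.linalg.lstsq(A, wear, rcond=None)      # wear ~ a + b*width + c*radian
        predicted = A @ coef
        print("coefficients:", coef)
        print("max relative error: %.1f%%" % (100 * np.max(np.abs(predicted - wear) / wear)))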

  20. Foundational Tools for Petascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging petascale systems. The overall goal of this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) the analysis, instrumentation and control of binary programs, which falls under the general framework of the Dyninst API tool kits; and (2) infrastructure for building tools and applications at extreme scale, which falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, "High-Performance Energy Applications and Systems", SC0004061/FG02-10ER25972, UW PRJ36WV.

  1. High speed civil transport aerodynamic optimization

    NASA Technical Reports Server (NTRS)

    Ryan, James S.

    1994-01-01

    This is a report of work in support of the Computational Aerosciences (CAS) element of the Federal HPCC program. Specifically, CFD and aerodynamic optimization are being performed on parallel computers. The long-range goal of this work is to facilitate teraflops-rate multidisciplinary optimization of aerospace vehicles. This year's work is targeted for application to the High Speed Civil Transport (HSCT), one of four CAS grand challenges identified in the HPCC FY 1995 Blue Book. This vehicle is to be a passenger aircraft, with the promise of cutting overseas flight time by more than half. To meet fuel economy, operational costs, environmental impact, noise production, and range requirements, improved design tools are required, and these tools must eventually integrate optimization, external aerodynamics, propulsion, structures, heat transfer, controls, and perhaps other disciplines. The fundamental goal of this project is to contribute to improved design tools for U.S. industry, and thus to the nation's economic competitiveness.

  2. Computational homogenisation for thermoviscoplasticity: application to thermally sprayed coatings

    NASA Astrophysics Data System (ADS)

    Berthelsen, Rolf; Denzer, Ralf; Oppermann, Philip; Menzel, Andreas

    2017-11-01

    Metal forming processes require wear-resistant tool surfaces in order to ensure a long life cycle of the expensive tools together with a constant high quality of the produced components. Thermal spraying is a relatively widely applied coating technique for the deposit of wear protection coatings. During these coating processes, heterogeneous coatings are deployed at high temperatures followed by quenching where residual stresses occur which strongly influence the performance of the coated tools. The objective of this article is to discuss and apply a thermo-mechanically coupled simulation framework which captures the heterogeneity of the deposited coating material. Therefore, a two-scale finite element framework for the solution of nonlinear thermo-mechanically coupled problems is elaborated and applied to the simulation of thermoviscoplastic material behaviour including nonlinear thermal softening in a geometrically linearised setting. The finite element framework and material model is demonstrated by means of numerical examples.

  3. Computer-Aided Design and 3-Dimensional Printing for Costal Cartilage Simulation of Airway Graft Carving.

    PubMed

    Ha, Jennifer F; Morrison, Robert J; Green, Glenn E; Zopf, David A

    2017-06-01

    Autologous cartilage grafting during open airway reconstruction is a complex skill instrumental to the success of the operation. Most trainees lack adequate opportunities to develop proficiency in this skill. We hypothesized that 3-dimensional (3D) printing and computer-aided design can be used to create a high-fidelity simulator for developing skills carving costal cartilage grafts for airway reconstruction. The rapid manufacturing and low cost of the simulator allow deployment in locations lacking expert instructors or cadaveric dissection, such as medical missions and Third World countries. In this blinded, prospective observational study, resident trainees completed a physical simulator exercise using a 3D-printed costal cartilage grafting tool. Participant assessment was performed using a Likert scale questionnaire, and airway grafts were assessed by a blinded expert surgeon. Most participants found this to be a very relevant training tool and highly rated the level of realism of the simulation tool.

  4. HC StratoMineR: A Web-Based Tool for the Rapid Analysis of High-Content Datasets.

    PubMed

    Omta, Wienand A; van Heesbeen, Roy G; Pagliero, Romina J; van der Velden, Lieke M; Lelieveld, Daphne; Nellen, Mehdi; Kramer, Maik; Yeong, Marley; Saeidi, Amir M; Medema, Rene H; Spruit, Marco; Brinkkemper, Sjaak; Klumperman, Judith; Egan, David A

    2016-10-01

    High-content screening (HCS) can generate large multidimensional datasets and, when aligned with the appropriate data mining tools, it can yield valuable insights into the mechanism of action of bioactive molecules. However, easy-to-use data mining tools are not widely available, with the result that these datasets are frequently underutilized. Here, we present HC StratoMineR, a web-based tool for high-content data analysis. It is a decision-supportive platform that guides even non-expert users through a high-content data analysis workflow. HC StratoMineR is built using MySQL for storage and querying, PHP as the main programming language, and jQuery for additional user interface functionality. R is used for statistical calculations, logic and data visualizations. Furthermore, C++ and graphics processing unit (GPU) power are embedded in R by using the Rcpp and rpud libraries for operations that are computationally highly intensive. We show that we can use HC StratoMineR for the analysis of multivariate data from a high-content siRNA knock-down screen and a small-molecule screen. It can be used to rapidly filter out undesirable data; to select relevant data; and to perform quality control, data reduction, data exploration, morphological hit picking, and data clustering. Our results demonstrate that HC StratoMineR can be used to functionally categorize HCS hits and, thus, provide valuable information for hit prioritization.

  5. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data.

    PubMed

    Zeng, Ying-Xu; Mjøs, Svein Are; David, Fabrice P A; Schmid, Adrien W

    2016-03-31

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that handle the high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including the data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high and low resolution MS as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
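
    The least squares resolution step described above can be pictured as solving for species abundances given theoretical isotope patterns as a design matrix. The sketch below uses non-negative least squares with made-up patterns and intensities purely to illustrate that linear-algebra step; it is not the paper's actual algorithm or data.

        # Resolve an observed isotope-intensity vector against theoretical isotope
        # patterns of candidate species; patterns and intensities are made up.
        import numpy as np
        from scipy.optimize import nnls

        patterns = np.array([[0.60, 0.00],    # columns: theoretical isotope distributions
                             [0.27, 0.55],    # of two candidate lipid species
                             [0.10, 0.30],
                             [0.03, 0.15]])
        observed = np.array([1.20, 1.09, 0.50, 0.21])   # observed isotope intensities

        abundance, residual = nnls(patterns, observed)  # non-negative least squares
        print("estimated abundances:", abundance)       # ~[2.0, 1.0] for this toy example
        print("residual norm:", residual)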

  6. Effects of Concept-Mapping-Based Interactive E-Books on Active and Reflective-Style Students' Learning Performances in Junior High School Law Courses

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Sung, Han-Yu; Chang, Hsuan

    2017-01-01

    Researchers have pointed out that interactive e-books have rich content and interactive features which can promote students' learning interest. However, researchers have also indicated the need to integrate effective learning supports or tools to help students organize what they have learned so as to increase their learning performance, in…

  7. Using competences and competence tools in workforce development.

    PubMed

    Green, Tess; Dickerson, Claire; Blass, Eddie

    The NHS Knowledge and Skills Framework (KSF) has been a driving force in the move to competence-based workforce development in the NHS. Skills for Health has developed national workforce competences that aim to improve behavioural performance, and in turn increase productivity. This article describes five projects established to test Skills for Health national workforce competences, electronic tools and products in different settings in the NHS. Competences and competence tools were used to redesign services, develop job roles, identify skills gaps and develop learning programmes. Reported benefits of the projects included increased clarity and a structured, consistent and standardized approach to workforce development. Findings from the evaluation of the tools were positive in terms of their overall usefulness and provision of related training/support. Reported constraints of using the competences and tools included issues relating to their availability, content and organization. It is recognized that a highly skilled and flexible workforce is important to the delivery of high-quality health care. These projects suggest that Skills for Health competences can be used as a 'common currency' in workforce development in the UK health sector. This would support the need to adapt rapidly to changing service needs.

  8. PipeCraft: Flexible open-source toolkit for bioinformatics analysis of custom high-throughput amplicon sequencing data.

    PubMed

    Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho

    2017-11-01

    High-throughput sequencing methods have become a routine analysis tool in environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics analysis may be performed using several public tools, many analytical pipelines allow too few options for optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We described the design and options of PipeCraft and evaluated its performance by analysing the data sets from three different sequencing platforms. We demonstrated that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.
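
    The workflow-chaining idea can be sketched as follows; this is only an illustration of linking user-selected steps with logging, not PipeCraft's implementation, and the step names and commands are placeholders rather than the actual tools it wraps.

      import subprocess

      # Sketch of the pipeline idea only: run a user-selected sequence of
      # command-line steps, log every command and its output, and stop on the
      # first failure. The commands are placeholders, not real tools or flags.
      STEPS = {
          "demultiplex": ["echo", "demultiplexing reads"],
          "quality_filter": ["echo", "filtering reads by quality"],
          "chimera_removal": ["echo", "removing chimeric sequences"],
      }

      def run_pipeline(selected_steps, log_path="pipeline.log"):
          """Run the chosen steps in order, recording each command and result."""
          with open(log_path, "a") as log:
              for name in selected_steps:
                  cmd = STEPS[name]
                  log.write(f"STEP {name}: {' '.join(cmd)}\n")
                  result = subprocess.run(cmd, capture_output=True, text=True)
                  log.write(result.stdout)
                  if result.returncode != 0:
                      raise RuntimeError(f"step {name} failed: {result.stderr}")

      run_pipeline(["demultiplex", "quality_filter", "chimera_removal"])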

  9. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. This paper discusses the knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  10. Development of a knowledge acquisition tool for an expert system flight status monitor

    NASA Technical Reports Server (NTRS)

    Disbrow, J. D.; Duke, E. L.; Regenie, V. A.

    1986-01-01

    Two of the main issues in artificial intelligence today are knowledge acquisition and knowledge representation. The Dryden Flight Research Facility of NASA's Ames Research Center is presently involved in the design and implementation of an expert system flight status monitor that will provide expertise and knowledge to aid the flight systems engineer in monitoring today's advanced high-performance aircraft. The flight status monitor can be divided into two sections: the expert system itself and the knowledge acquisition tool. This paper discusses the knowledge acquisition tool, the means it uses to extract knowledge from the domain expert, and how that knowledge is represented for computer use. An actual aircraft system has been codified by this tool with great success. Future real-time use of the expert system has been facilitated by using the knowledge acquisition tool to easily generate a logically consistent and complete knowledge base.

  11. Application of HFCT and UHF Sensors in On-Line Partial Discharge Measurements for Insulation Diagnosis of High Voltage Equipment

    PubMed Central

    Álvarez, Fernando; Garnacho, Fernando; Ortego, Javier; Sánchez-Urán, Miguel Ángel

    2015-01-01

    Partial discharge (PD) measurements provide valuable information for assessing the condition of high voltage (HV) insulation systems, contributing to their quality assurance. Different PD measuring techniques, specially designed for on-line measurements, have been developed in recent years. Non-conventional PD methods operating in high frequency bands are usually used for this type of test. In PD measurements, the signal acquisition, the subsequent signal processing and the capability to obtain an accurate diagnosis are conditioned by the selection of a suitable detection technique and by the implementation of effective signal processing tools. This paper proposes an optimized electromagnetic detection method based on the combined use of wideband PD sensors for measurements performed in the HF and UHF frequency ranges, together with the implementation of powerful processing tools. The effectiveness of the measuring techniques proposed is demonstrated through an example, where several PD sources are measured simultaneously in an HV installation consisting of a cable system connected by a plug-in terminal to a gas insulated substation (GIS) compartment. PMID:25815452

  12. Performance Contracting as a Performance Management Tool in the Public Sector in Kenya: Lessons of learning

    ERIC Educational Resources Information Center

    Hope, Kempe Ronald, Sr.

    2013-01-01

    The purpose of this article is to provide an assessment and analysis of public sector performance contracting as a performance management tool in Kenya. It aims to demonstrate that performance contracting remains a viable and important tool for improving public sector performance as a key element of the on-going public sector transformation…

  13. Progress on EUV mask fabrication for 32-nm technology node and beyond

    NASA Astrophysics Data System (ADS)

    Zhang, Guojing; Yan, Pei-Yang; Liang, Ted; Park, Seh-jin; Sanchez, Peter; Shu, Emily Y.; Ultanir, Erdem A.; Henrichs, Sven; Stivers, Alan; Vandentop, Gilroy; Lieberman, Barry; Qu, Ping

    2007-05-01

    Extreme ultraviolet lithography (EUVL) tool development achieved a big milestone last year as two full-field Alpha Demo Tools (ADT) were shipped to customers by ASML. Looking ahead, a full field "EUV1" exposure tool from Nikon will be available by the end of 2007 and the pre-production EUV exposure tools from ASML are targeted for 2009. It is essential that high quality EUVL masks can be made and delivered to the EUVL tool users to support the technology development. In the past year, we have demonstrated mask fabrication with low stress absorber deposition and good etch process control yielding a vertical etch profile and a mask CD control of 5.7 nm for 32 nm (1x) spaces and 7.4 nm for 32 nm (1x) lines. Mask pattern resolution of 15 nm (1x) dense lines was achieved. Full field reflective mask die-to-die inspection at a 125 nm pixel size was demonstrated after low defect multilayer blanks became available. In this paper, we will present details of the Intel EUVL Mask Pilot Line progress in EUVL mask defect reduction, pattern CD performance, program defect mask design and inspection, in-house absorber film development and its performance, and EUVL metrology tool development. We will demonstrate an overall improvement in EUV mask manufacturing readiness due to our Pilot Line activities.

  14. Geothopica and the interactive analysis and visualization of the updated Italian National Geothermal Database

    NASA Astrophysics Data System (ADS)

    Trumpy, Eugenio; Manzella, Adele

    2017-02-01

    The Italian National Geothermal Database (BDNG) is the largest collection of Italian geothermal data and was set up in the 1980s. It has since been updated both in terms of content and management tools: information on deep wells and thermal springs (with temperature > 30 °C) is currently organized and stored in a PostgreSQL relational database management system, which guarantees high performance, data security and easy access through different client applications. The BDNG is the core of the Geothopica web site, whose webGIS tool allows different types of user to access geothermal data, to visualize multiple types of datasets, and to perform integrated analyses. The webGIS tool has recently been improved with two specially designed visualization tools to display data on well lithology and underground temperatures. This paper describes the contents of the database and its software and data updates, as well as the webGIS tool, including the new tools for lithology and temperature data visualization. The geoinformation organized in the database and accessible through Geothopica is of use not only for geothermal purposes, but also for any kind of georesource and CO2 storage project requiring the organization of, and access to, deep underground data. Geothopica also supports project developers, researchers, and decision makers in the assessment, management and sustainable deployment of georesources.
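
    A client query of the kind the webGIS tool issues might look like the sketch below. The connection string, table and column names are hypothetical and do not reflect the actual BDNG schema; the example only illustrates selecting records above the 30 °C threshold from a PostgreSQL store.

      import psycopg2

      # Hypothetical client-side query against a PostgreSQL store of geothermal
      # data. The DSN, table name (wells) and columns are invented, not the
      # BDNG schema.
      conn = psycopg2.connect("dbname=bdng user=reader host=localhost")
      try:
          with conn.cursor() as cur:
              # Wells and springs above the 30 degC threshold, hottest first.
              cur.execute(
                  """
                  SELECT name, depth_m, bottom_hole_temp_c
                  FROM wells
                  WHERE bottom_hole_temp_c > %s
                  ORDER BY bottom_hole_temp_c DESC
                  """,
                  (30.0,),
              )
              for name, depth, temp in cur.fetchall():
                  print(f"{name}: {depth} m, {temp} degC")
      finally:
          conn.close()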

  15. Blue emitting undecaplatinum clusters

    NASA Astrophysics Data System (ADS)

    Chakraborty, Indranath; Bhuin, Radha Gobinda; Bhat, Shridevi; Pradeep, T.

    2014-07-01

    A blue luminescent 11-atom platinum cluster showing step-like optical features and the absence of plasmon absorption was synthesized. The cluster was purified using high performance liquid chromatography (HPLC). Electrospray ionization (ESI) and matrix assisted laser desorption ionization (MALDI) mass spectrometry (MS) suggest a composition, Pt11(BBS)8, which was confirmed by a range of other experimental tools. The cluster is highly stable and compatible with many organic solvents. Electronic supplementary information (ESI) available: Details of experimental procedures, instrumentation, chromatogram of the crude cluster; SEM/EDAX, DLS, PXRD, TEM, FT-IR, and XPS of the isolated Pt11 cluster; UV/Vis, MALDI MS and SEM/EDAX of isolated 2 and 3; and 195Pt NMR of the K2PtCl6 standard. See DOI: 10.1039/c4nr02778g

  16. Spectral and Concentration Sensitivity of Multijunction Solar Cells at High Temperature: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, Daniel J.; Steiner, Myles A.; Perl, Emmett E.

    2017-06-14

    We model the performance of two-junction solar cells at very high temperatures of ~400 degrees C and beyond for applications such as hybrid PV/solar-thermal power production, and identify areas in which the design and performance characteristics behave significantly differently than at more conventional near-room-temperature operating conditions. We show that high-temperature operation reduces the sensitivity of the cell efficiency to spectral content, but increases the sensitivity to concentration, both of which have implications for energy yield in terrestrial PV applications. For other high-temperature applications such as near-sun space missions, our findings indicate that concentration may be a useful tool to enhance cell efficiency.
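
    A simplified, textbook-level sketch of the concentration sensitivity (not the authors' model) uses the ideal-diode estimate that concentration X raises each junction's open-circuit voltage by roughly (nkT/q)·ln(X); because the thermal voltage kT/q scales with absolute temperature, the same concentration buys more voltage at ~400 °C than near room temperature.

      import math

      # Back-of-envelope sketch, not the authors' model: ideal-diode estimate
      # of the open-circuit-voltage gain from concentration, which grows with
      # absolute temperature through the thermal voltage kT/q.
      K_B = 1.380649e-23      # Boltzmann constant, J/K
      Q_E = 1.602176634e-19   # elementary charge, C
      IDEALITY = 1.0          # assumed diode ideality factor

      def voc_gain_volts(concentration, temp_c):
          """Ideal-diode estimate of the Voc gain from concentrated sunlight."""
          temp_k = temp_c + 273.15
          return IDEALITY * K_B * temp_k / Q_E * math.log(concentration)

      for temp_c in (25.0, 400.0):
          gain_mv = voc_gain_volts(100.0, temp_c) * 1e3
          print(f"T = {temp_c:5.1f} degC: Voc gain at 100x = {gain_mv:.0f} mV")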

  17. Statistical comparison of a hybrid approach with approximate and exact inference models for Fusion 2+

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew

    2007-04-01

    One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous foundation for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, a concrete experimental comparison of the hybrid framework with traditional fusion methods, demonstrating and quantifying this benefit, has been lacking. The goal of this research, therefore, is to provide a statistical comparison of the accuracy and performance of the hybrid approach with pure Bayesian and Fuzzy systems and with an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference relative to other fusion tools.
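
    The Monte Carlo comparison methodology can be sketched with a toy problem (invented, and unrelated to the actual Fusion 2+ models): estimate a hidden binary state from noisy reports with an exact Bayesian posterior and with a crude sampling approximation, then score both against ground truth over many trials.

      import random

      # Toy Monte Carlo comparison of an exact posterior vs. a sampled
      # approximation on a hidden binary state observed through noisy reports.
      random.seed(0)
      P_CORRECT = 0.7     # chance each report matches the true state
      N_REPORTS = 5
      N_TRIALS = 10_000
      N_SAMPLES = 50      # samples used by the approximate estimator

      def exact_posterior(reports):
          """P(state=1 | reports) with a uniform prior and independent reports."""
          like1 = like0 = 1.0
          for r in reports:
              like1 *= P_CORRECT if r == 1 else 1 - P_CORRECT
              like0 *= P_CORRECT if r == 0 else 1 - P_CORRECT
          return like1 / (like1 + like0)

      def sampled_posterior(reports):
          """Approximate the same posterior by weighting prior samples."""
          weights = {0: 0.0, 1: 0.0}
          for _ in range(N_SAMPLES):
              s = random.randint(0, 1)          # draw from the uniform prior
              w = 1.0
              for r in reports:
                  w *= P_CORRECT if r == s else 1 - P_CORRECT
              weights[s] += w
          total = weights[0] + weights[1]
          return weights[1] / total if total > 0 else 0.5

      hits = {"exact": 0, "sampled": 0}
      for _ in range(N_TRIALS):
          truth = random.randint(0, 1)
          reports = [truth if random.random() < P_CORRECT else 1 - truth
                     for _ in range(N_REPORTS)]
          hits["exact"] += (exact_posterior(reports) > 0.5) == (truth == 1)
          hits["sampled"] += (sampled_posterior(reports) > 0.5) == (truth == 1)

      for name, h in hits.items():
          print(f"{name}: accuracy = {h / N_TRIALS:.3f}")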

  18. High performance wire grid polarizers using jet and flash™ imprint lithography

    NASA Astrophysics Data System (ADS)

    Ahn, Sean; Yang, Jack; Miller, Mike; Ganapathisubramanian, Maha; Menezes, Marlon; Choi, Jin; Xu, Frank; Resnick, Douglas J.; Sreenivasan, S. V.

    2013-03-01

    The ability to pattern materials at the nanoscale can enable a variety of applications ranging from high density data storage, displays, photonic devices and CMOS integrated circuits to emerging applications in the biomedical and energy sectors. These applications require varying levels of pattern control, short and long range order, and have varying cost tolerances. Extremely large area roll to roll (R2R) manufacturing on flexible substrates is ubiquitous for applications such as paper and plastic processing. It combines the benefits of high speed and inexpensive substrates to deliver a commodity product at low cost. The challenge is to extend this approach to the realm of nanopatterning and realize similar benefits. The cost of manufacturing is typically driven by speed (or throughput), tool complexity, cost of consumables (materials used, mold or master cost, etc.), substrate cost, and the downstream processing required (annealing, deposition, etching, etc.). In order to achieve low cost nanopatterning, it is imperative to move towards high speed imprinting, less complex tools, near zero waste of consumables and low cost substrates. The Jet and Flash Imprint Lithography (J-FIL™) process uses drop dispensing of UV curable resists to assist high resolution patterning for subsequent dry etch pattern transfer. The technology is actively being used to develop solutions for memory markets including Flash memory and patterned media for hard disk drives. In this paper we have developed a roll-based J-FIL process and applied it to a technology demonstrator tool, the LithoFlex 100, to fabricate large area flexible bilayer wire grid polarizers (WGP) and high performance WGPs on rigid glass substrates. Extinction ratios of better than 10,000 were obtained for the glass-based WGPs. Two simulation packages were also employed to understand the effects of pitch, aluminum thickness and pattern defectivity on the optical performance of the WGP devices. It was determined that the WGPs can be influenced by both clear and opaque defects in the gratings; however, the tolerable defect densities are relaxed relative to the requirements of a high density semiconductor device.

  19. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user community come new requests for algorithms and processing capabilities. To address this demand, OT is developing an extensible service based architecture for integrating community-developed software. This "pluggable" approach to Web service deployment will enable new processing and analysis tools to run collocated with OT hosted data.
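
    The hot/cold tiering idea can be sketched as follows; the tile identifiers, access log, and threshold are invented, and this is not OpenTopography's implementation.

      from collections import Counter

      # Illustrative sketch of hot/cold storage tiering: count accesses per
      # spatial tile from an access log and assign frequently hit tiles to fast
      # (SSD) storage, the rest to cheaper slow disk. All values are invented.
      access_log = ["tile_12", "tile_12", "tile_07", "tile_12", "tile_33", "tile_07"]
      HOT_THRESHOLD = 2  # accesses within the analysis window

      def assign_tiers(log, threshold):
          counts = Counter(log)
          return {tile: ("ssd" if n >= threshold else "slow_disk")
                  for tile, n in counts.items()}

      for tile, tier in sorted(assign_tiers(access_log, HOT_THRESHOLD).items()):
          print(tile, "->", tier)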

  20. Genome sequencing of bacteria: sequencing, de novo assembly and rapid analysis using open source tools.

    PubMed

    Kisand, Veljo; Lettieri, Teresa

    2013-04-01

    De novo genome sequencing of previously uncharacterized microorganisms has the potential to open up new frontiers in microbial genomics by providing insight into both functional capabilities and biodiversity. Until recently, Roche 454 pyrosequencing was the NGS method of choice for de novo assembly because it generates hundreds of thousands of long reads (<450 bps), which are presumed to aid in the analysis of uncharacterized genomes. The tools for processing NGS data are increasingly free and open source and are often adopted both for their high quality and for their role in promoting academic freedom. The error rate of pyrosequencing the Alcanivorax borkumensis genome was such that thousands of insertions and deletions were artificially introduced into the finished genome. Despite high coverage (~30-fold), the reads did not allow the reference genome to be fully mapped. Reads from regions with errors had low quality, low coverage, or were missing. The main defects of the reference mapping were the introduction of artificial indels into contigs through lower than 100% consensus and disrupted gene calling due to artificial stop codons. No assembler was able to perform de novo assembly comparable to reference mapping. Automated annotation tools performed similarly on reference-mapped and de novo draft genomes, and annotated most CDSs in the de novo assembled draft genomes. Free and open source software (FOSS) tools for assembly and annotation of NGS data are being developed rapidly to provide accurate results with less computational effort. Usability is not a high priority, and these tools currently do not allow the data to be processed without manual intervention. Despite this, genome assemblers now readily assemble medium-short reads into long contigs (>97-98% genome coverage). A notable gap in pyrosequencing technology is the quality of base pair calling and conflicting base pairs between single reads at the same nucleotide position. Regardless, using draft whole genomes that are not finished and remain fragmented into tens of contigs allows one to characterize unknown bacteria with modest effort.
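
    The consensus/conflict issue described above can be illustrated with a small sketch (hypothetical alignment columns, not actual assembly output): call the majority base per aligned position and flag positions where reads disagree below a chosen consensus threshold.

      from collections import Counter

      # Minimal sketch of consensus calling over aligned read columns; the data
      # are invented. Positions where reads conflict below the threshold are
      # flagged, mirroring the "lower than 100% consensus" problem noted above.
      aligned_columns = [
          ["A", "A", "A", "A"],          # clean column
          ["G", "G", "G", "T"],          # one conflicting read
          ["C", "C", "-", "C", "C"],     # deletion in one read
      ]
      CONSENSUS_THRESHOLD = 0.9  # fraction of reads that must agree

      for pos, column in enumerate(aligned_columns):
          base, count = Counter(column).most_common(1)[0]
          fraction = count / len(column)
          status = "ok" if fraction >= CONSENSUS_THRESHOLD else "conflict"
          print(f"pos {pos}: consensus={base} ({fraction:.0%}) {status}")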
