Sample records for analysis capability quick

  1. Aviation System Analysis Capability Quick Response System Report for Fiscal Year 1998

    NASA Technical Reports Server (NTRS)

    Ege, Russell; Villani, James; Ritter, Paul

    1999-01-01

     This document presents the additions and modifications made to the Quick Response System (QRS) in FY 1998 in support of the ASAC QRS development effort. This document builds upon the Aviation System Analysis Capability Quick Response System Report for Fiscal Year 1997.

  2. Aviation System Analysis Capability Quick Response System Report

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Ritter, Paul

    1998-01-01

     The purpose of this document is to present the additions and modifications made to the Aviation System Analysis Capability (ASAC) Quick Response System (QRS) in FY 1997 in support of the ASAC QRS development effort. This document contains an overview of the project background and scope and defines the QRS. The document also presents an overview of the Logistics Management Institute (LMI) facility that supports the QRS, and it includes a summary of the planned additions to the QRS in FY 1998. The document has five appendices.

  3. Control Scheme for Quickly Starting X-ray Tube

    NASA Astrophysics Data System (ADS)

    Nakahama, Masayuki; Nakanishi, Toshiki; Ishitobi, Manabu; Ito, Tuyoshi; Hosoda, Kenichi

     A control scheme for quickly starting a portable X-ray generator used in the livestock industry is proposed in this paper. A portable X-ray generator used to take X-ray images of animals such as horses, sheep, and dogs should be capable of starting quickly, because it is difficult for veterinarians to capture images of animals at exactly the desired moment. In order to develop a scheme for starting the X-ray tube quickly, it is necessary to analyze the X-ray tube. However, such an analysis has not been discussed until now. First, the states of an X-ray tube are classified into the temperature-limited state and the space-charge-limited state. Furthermore, the existence of a “mixed state” comprising both is newly proposed in this paper. From these analyses, a novel scheme for quickly starting an X-ray generator is proposed; this scheme takes the characteristics of the X-ray tube into account. The proposed X-ray system, which is capable of starting quickly, is evaluated on the basis of experimental results.

  4. Analysis of the Capability Portfolio Review (CPR)

    DTIC Science & Technology

    2014-06-01

     facilitated by the MRM feature. PAT allows the analyst to quickly change how summary depictions are generated. Choices include: simple linear...database with supporting software that documents relationships between warfighting activities, the UJTL, systems, ACTDs, roadmaps, and capability areas. It

  5. Aviation System Analysis Capability (ASAC) Quick Response System (QRS) Test Report

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Villani, James A.; Ritter, Paul

    1997-01-01

    This document is the Aviation System Analysis Capability (ASAC) Quick Response System (QRS) Test Report. The purpose of this document is to present the results of the QRS unit and system tests in support of the ASAC QRS development effort. This document contains an overview of the project background and scope, defines the QRS system and presents the additions made to the QRS this year, explains the assumptions, constraints, and approach used to conduct QRS Unit and System Testing, and presents the schedule used to perform QRS Testing. The document also presents an overview of the Logistics Management Institute (LMI) Test Facility and testing environment and summarizes the QRS Unit and System Test effort and results.

  6. Evaluation of the annual Canadian biodosimetry network intercomparisons

    PubMed Central

    Wilkins, Ruth C.; Beaton-Green, Lindsay A.; Lachapelle, Sylvie; Kutzner, Barbara C.; Ferrarotto, Catherine; Chauhan, Vinita; Marro, Leonora; Livingston, Gordon K.; Boulay Greene, Hillary; Flegal, Farrah N.

    2015-01-01

     Abstract Purpose: To evaluate the importance of annual intercomparisons for maintaining the capacity and capabilities of a well-established biodosimetry network, in conjunction with assessing efficient and effective analysis methods for emergency response. Materials and methods: Annual intercomparisons were conducted between laboratories in the Canadian National Biological Dosimetry Response Plan. Intercomparisons were performed over a six-year period and comprised the shipment of 10–12 irradiated, blinded blood samples for analysis by each of the participating laboratories. Dose estimates were determined by each laboratory using the dicentric chromosome assay (conventional and QuickScan scoring) and, where possible, the cytokinesis-block micronucleus (CBMN) assay. Dose estimates were returned to the lead laboratory for evaluation and comparison. Results: Individual laboratories performed comparably from year to year, with only slight fluctuations in performance. Dose estimates using the dicentric chromosome assay were accurate about 80% of the time, and the QuickScan method for scoring the dicentric chromosome assay proved to reduce the time of analysis without having a significant effect on the dose estimates. Although analysis with the CBMN assay was comparable to QuickScan scoring with respect to speed, the accuracy of the dose estimates was greatly reduced. Conclusions: Annual intercomparisons are necessary to maintain a network of laboratories for emergency response biodosimetry, as they build confidence in the laboratories' capabilities. PMID:25670072
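
In practice, dose estimates from the dicentric assay are commonly obtained by inverting a linear-quadratic calibration curve fitted to the observed dicentric yield. A minimal sketch of that inversion follows; the coefficient values are illustrative placeholders, not the Canadian network's actual calibration:

```python
import math

def dose_from_dicentric_yield(y, c=0.001, alpha=0.02, beta=0.06):
    """Invert the linear-quadratic calibration Y = c + alpha*D + beta*D^2
    for absorbed dose D (Gy). Coefficients are hypothetical examples."""
    # Solve beta*D^2 + alpha*D + (c - y) = 0 for the positive root.
    disc = alpha**2 - 4 * beta * (c - y)
    if disc < 0:
        raise ValueError("yield below calibration background")
    return (-alpha + math.sqrt(disc)) / (2 * beta)

# Example: 45 dicentrics observed in 500 cells gives a yield of 0.09/cell
print(round(dose_from_dicentric_yield(45 / 500), 2))
```

A real calibration would also propagate the Poisson uncertainty on the observed yield into a confidence interval on the dose.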

  7. Uncoordinated MAC for Adaptive Multi Beam Directional Networks: Analysis and Evaluation

    DTIC Science & Technology

    2016-08-01

    control (MAC) policies for emerging systems that are equipped with fully digital antenna arrays which are capable of adaptive multi-beam directional...Adaptive Beam- forming, Multibeam, Directional Networking, Random Access, Smart Antennas I. INTRODUCTION Fully digital beamforming antenna arrays that...are capable of adaptive multi-beam communications are quickly becoming a reality. These antenna arrays allow users to form multiple simultaneous

  8. 9th Annual Systems Engineering Conference: Volume 4 Thursday

    DTIC Science & Technology

    2006-10-26

    Connectivity, Speed, Volume • Enterprise application integration • Workflow integration or multi-media • Federated search capability • Link analysis and...categorization, federated search & automated discovery of information — Collaborative tools to quickly share relevant information Built on commercial

  9. Multistage Analysis of Cyber Threats for Quick Mission Impact Assessment (CyberIA)

    DTIC Science & Technology

    2015-09-01

    Corporation. NVIDIA ® is a registered trademark of the NVIDIA Corporation. CUDA™ is a trademark of the NVIDIA Corporation. Released by J. Lee...for developing and integrating different high-performance C/C++ algorithms. This capability is significant because NVIDIA ® CUDA™ architecture

  10. TEC data ingestion into IRI and NeQuick over the antarctic region

    NASA Astrophysics Data System (ADS)

    Nava, Bruno; Pezzopane, Michael; Radicella, Sandro M.; Scotto, Carlo; Pietrella, Marco; Migoya Orue, Yenca; Alazo Cuartas, Katy; Kashcheyev, Anton

    2016-07-01

     In the present work a comparative analysis has been performed to evaluate the capabilities of the IRI and NeQuick 2 models in reproducing the ionospheric behaviour over the Antarctic region. A technique to adapt the two models to GNSS-derived vertical Total Electron Content (TEC) has therefore been implemented to retrieve the 3-D ionosphere electron density at specific locations where ionosonde data were available. In particular, the electron density profiles used in this study have been provided in the framework of the AUSPICIO (AUtomatic Scaling of Polar Ionograms and Cooperative Ionospheric Observations) project by applying the Adaptive Ionospheric Profiler (AIP) to ionograms recorded at eight selected mid-latitude, high-latitude, and polar ionosondes. The relevant GNSS-derived vertical TEC values have been obtained from the Global Ionosphere Maps (GIM) produced by the Center for Orbit Determination in Europe (CODE). The effectiveness of IRI and NeQuick 2 in reconstructing the ionosphere electron density at the given locations and epochs has been primarily assessed in terms of statistical comparison between experimental and model-retrieved peak parameter values (foF2 and hmF2). The analysis results indicate that, in general, the models are equivalent in their ability to reproduce the critical frequency of the F2 layer, and they also tend to overestimate the height of the peak electron density, especially during high solar activity periods. Nevertheless, this tendency is more noticeable in NeQuick 2 than in IRI. For completeness, the statistics indicating the models' bottomside reconstruction capabilities, computed as the mismodeling of the height-integrated electron density profile, will also be discussed.
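
The statistical comparison of experimental and model-retrieved peak parameters typically reduces to the mean residual (bias) and RMSE over paired samples. A minimal sketch, with invented hmF2 values purely for illustration:

```python
def bias_and_rmse(model, obs):
    """Mean residual (model - obs) and root-mean-square error
    over paired model/observation samples."""
    residuals = [m - o for m, o in zip(model, obs)]
    n = len(residuals)
    bias = sum(residuals) / n
    rmse = (sum(r * r for r in residuals) / n) ** 0.5
    return bias, rmse

# Hypothetical hmF2 samples (km); a positive bias means the model
# overestimates the peak height, as the abstract reports for
# high solar activity.
model_hmf2 = [310.0, 325.0, 298.0, 340.0]
obs_hmf2 = [300.0, 312.0, 295.0, 322.0]
bias, rmse = bias_and_rmse(model_hmf2, obs_hmf2)
print(round(bias, 1), round(rmse, 1))
```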

  11. Attomole-level Genomics with Single-molecule Direct DNA, cDNA and RNA Sequencing Technologies.

    PubMed

    Ozsolak, Fatih

    2016-01-01

    With the introduction of next-generation sequencing (NGS) technologies in 2005, the domination of microarrays in genomics quickly came to an end due to NGS's superior technical performance and cost advantages. By enabling genetic analysis capabilities that were not possible previously, NGS technologies have started to play an integral role in all areas of biomedical research. This chapter outlines the low-quantity DNA and cDNA sequencing capabilities and applications developed with the Helicos single molecule DNA sequencing technology.

  12. A PC-based multispectral scanner data evaluation workstation: Application to Daedalus scanners

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; James, Mark W.; Smith, Matthew R.; Atkinson, Robert J.

    1991-01-01

     In late 1989, a personal computer (PC)-based data evaluation workstation was developed to support post-flight processing of Multispectral Atmospheric Mapping Sensor (MAMS) data. The MAMS Quick View System (QVS) is an image analysis and display system designed to provide the capability to evaluate Daedalus scanner data immediately after an aircraft flight. Even in its original form, the QVS offered the portability of a personal computer with the advanced analysis and display features of a mainframe image analysis system. It was recognized, however, that the original QVS had its limitations, both in speed and in processing of MAMS data. Recent efforts are presented that focus on overcoming earlier limitations and adapting the system to a new data tape structure. In doing so, the enhanced Quick View System (QVS2) will accommodate data from any of the four spectrometers used with the Daedalus scanner on the NASA ER-2 platform. The QVS2 is designed around an AST 486/33 MHz personal computer with 10 EISA expansion slots, a keyboard, and 4.0 Mbytes of memory. Specialized PC-McIDAS software provides the main image analysis and display capability for the system. Image analysis and display of the digital scanner data is accomplished with PC-McIDAS software.

  13. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

     Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
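
The core idea of pointing an analyst at the runs that cause specific failure types can be sketched as grouping runs by the limit they violate. The metric names and thresholds below are hypothetical, not taken from the NASA tool:

```python
from collections import defaultdict

def group_failures(runs):
    """Group Monte Carlo run records by the failure criterion violated.
    Each run is a dict of output metrics; the limits below are
    illustrative pass/fail checks, not real mission criteria."""
    limits = {
        "touchdown_speed": lambda r: r["touchdown_speed"] > 3.0,   # m/s
        "landing_ellipse": lambda r: r["miss_distance"] > 1000.0,  # m
    }
    failures = defaultdict(list)
    for run_id, run in enumerate(runs):
        for name, violated in limits.items():
            if violated(run):
                failures[name].append(run_id)
    return dict(failures)

runs = [
    {"touchdown_speed": 2.1, "miss_distance": 150.0},
    {"touchdown_speed": 3.4, "miss_distance": 90.0},
    {"touchdown_speed": 2.8, "miss_distance": 2400.0},
]
print(group_failures(runs))
```

On a GPU, the same per-run limit checks would be evaluated in parallel across the whole data set, which is why the parallel version scales to large Monte Carlo campaigns.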

  14. Data Needs for Labor Market Analysis. Background Paper No. 44.

    ERIC Educational Resources Information Center

    Horrigan, Michael W.

    Opinions on how the Bureau of Labor Statistics (BLS) can better meet the data needs of users of government-provided labor market data were sought from users inside and outside government. The following recommendations, among others, are based on those opinions: (1) create a quick-response household survey capability at the BLS, using random digit…

  15. Reducing Mission Costs by Leveraging Previous Investments in Space

    NASA Technical Reports Server (NTRS)

    Miller, Ron; Adams, W. James

    1999-01-01

    The Rapid Spacecraft Development Office (RSDO) at NASA's Goddard Space Flight Center has been charged with the responsibility to reduce mission cost by allowing access to previous developments on government and commercial space missions. RSDO accomplishes this responsibility by implementing two revolutionary contract vehicles, the Rapid Spacecraft Acquisition (RSA) and Quick Ride. This paper will describe the concept behind these contracts, the current capabilities available to missions, analysis of pricing trends to date using the RSDO processes, and future plans to increase flexibility and capabilities available to mission planners.

  16. A Bumpy Road and a Bridge too Far? An Analysis of the Realistic Bridging and Horizontal Construction Capabilities of the Canadian Military Engineers in the Force 2013 Structure

    DTIC Science & Technology

    2012-05-17

    the national security strategy? Liddell Hart describes this difficult challenge faced by all socially conservative states. They must find the type of... challenges and circumstances while working within an economic framework that necessitates choice.”23 TTCP describes the CBP process as a method that involves...a capability gap. Then the Army purchased quick solutions and fielded them for immediate deployment to theatre. Their generation satisfied an urgent

  17. Aircraft Survivability: Survivability Against Man Portable Air Defense Systems, Summer 2005

    DTIC Science & Technology

    2005-01-01

    Unsworth is the US Army’s Aviation Applied Technology Division (AATD) program manager for the quick-reaction capability AH–64A/D AN/AAR–57(V)3 Common...sionally under his mentorship. He will be greatly missed by all who knew him. Dr. Paul Tanenbaum named Director of the Survivability/Lethality Analysis...Directorate (SLAD) Dr. Paul Tanenbaum has been appointed director of the Survivability/Lethality Analysis Directorate (SLAD) of the US Army

  18. Relationship of the Basic Attributes Test to Tactical Reconnaissance Pilot Performance

    DTIC Science & Technology

    1987-01-01

     ...4. Psychomotor Test: Performance Regression Analysis 66 5. Decision Making Speed: Performance Regression Analysis 68 6. Item Recognition: Performance...agreement between 12 TRS and 91 TRS supervisors. This indicated that those most likely to be faced with the task of determining the performance capabilities...those UPT check flights requiring quick, consistent, and accurate responses. Item Recognition Test: The item recognition test reduced to seven scores. The...

  19. QUICK - AN INTERACTIVE SOFTWARE ENVIRONMENT FOR ENGINEERING DESIGN

    NASA Technical Reports Server (NTRS)

    Schlaifer, R. S.

    1994-01-01

     QUICK provides the computer user with the facilities of a sophisticated desk calculator which can perform scalar, vector, and matrix arithmetic, propagate conic orbits, determine planetary and satellite coordinates, and perform other related astrodynamic calculations within a Fortran-like environment. QUICK is an interpreter, eliminating the need to use a compiler or a linker to run QUICK code. QUICK capabilities include options for automated printing of results, the ability to submit operating system commands on some systems, and access to a plotting package (MASL) and a text editor without leaving QUICK. Mathematical and programming features of QUICK include the ability to handle arbitrary algebraic expressions, the capability to define user functions in terms of other functions, built-in constants such as pi, direct access to useful COMMON areas, matrix capabilities, extensive use of double precision calculations, and the ability to automatically load user functions from a standard library. The MASL (Multi-mission Analysis Software Library) plotting package, included in the QUICK package, is a set of FORTRAN 77 compatible subroutines designed to facilitate the plotting of engineering data by allowing programmers to write plotting-device-independent applications. Its universality lies in the number of plotting devices it puts at the user's disposal. The MASL package of routines has proved very useful and easy to work with, yielding good plots for most new users on the first or second try. The functions provided include routines for creating histograms, "wire mesh" surface plots, and contour plots as well as normal graphs with a large variety of axis types. The library has routines for plotting on cartesian, polar, log, mercator, cyclic, calendar, and stereographic axes, and for performing automatic or explicit scaling. The lengths of the axes of a plot are completely under the control of the program using the library.
Programs written to use the MASL subroutines can be made to output to the Calcomp 1055 plotter, the Hewlett-Packard 2648 graphics terminal, the HP 7221, 7475 and 7550 pen plotters, the Tektronix 40xx and 41xx series graphics terminals, the DEC VT125/VT240 graphics terminals, the QMS 800 laser printer, the Sun Microsystems monochrome display, the Ridge Computers monochrome display, the IBM/PC color display, or a "dumb" terminal or printer. Programs using this library can be written so that they always use the same type of plotter or they can allow the choice of plotter type to be deferred until after program execution. QUICK is written in RATFOR for use on Sun4 series computers running SunOS. No source code is provided. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in ASCII format is included on the distribution medium. QUICK was developed in 1991 and is a copyrighted work with all copyright vested in NASA.

  20. The StarView intelligent query mechanism

    NASA Technical Reports Server (NTRS)

    Semmel, R. D.; Silberberg, D. P.

    1993-01-01

    The StarView interface is being developed to facilitate the retrieval of scientific and engineering data produced by the Hubble Space Telescope. While predefined screens in the interface can be used to specify many common requests, ad hoc requests require a dynamic query formulation capability. Unfortunately, logical level knowledge is too sparse to support this capability. In particular, essential formulation knowledge is lost when the domain of interest is mapped to a set of database relation schemas. Thus, a system known as QUICK has been developed that uses conceptual design knowledge to facilitate query formulation. By heuristically determining strongly associated objects at the conceptual level, QUICK is able to formulate semantically reasonable queries in response to high-level requests that specify only attributes of interest. Moreover, by exploiting constraint knowledge in the conceptual design, QUICK assures that queries are formulated quickly and will execute efficiently.
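
The heuristic of finding "strongly associated objects" can be approximated as a shortest-path search over the conceptual schema: given only the attributes of interest, locate the relations that hold them and find the chain of joins connecting those relations. A toy sketch follows; the schema graph and attribute catalog are invented for illustration, not StarView's actual design:

```python
from collections import deque

# Hypothetical schema graph: relations as nodes, shared-key joins as edges.
SCHEMA = {
    "observation":  {"proposal", "exposure"},
    "proposal":     {"observation", "investigator"},
    "exposure":     {"observation", "calibration"},
    "investigator": {"proposal"},
    "calibration":  {"exposure"},
}
# Hypothetical catalog mapping attributes to the relation holding them.
ATTRS = {
    "target_name": "observation",
    "pi_last_name": "investigator",
}

def join_path(attr_a, attr_b):
    """BFS for the shortest chain of relations joining two attributes."""
    start, goal = ATTRS[attr_a], ATTRS[attr_b]
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nbr in SCHEMA[path[-1]]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(path + [nbr])
    return None

print(join_path("target_name", "pi_last_name"))
```

A formulated query would then join along the returned chain; the constraint knowledge mentioned in the abstract would further prune paths that cannot produce efficient or semantically sensible joins.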

  1. A PC-based telemetry system for acquiring and reducing data from multiple PCM streams

    NASA Astrophysics Data System (ADS)

    Simms, D. A.; Butterfield, C. P.

    1991-07-01

    The Solar Energy Research Institute's (SERI) Wind Research Program is using Pulse Code Modulation (PCM) Telemetry Data-Acquisition Systems to study horizontal-axis wind turbines. Many PCM systems are combined for use in test installations that require accurate measurements from a variety of different locations. SERI has found them ideal for data-acquisition from multiple wind turbines and meteorological towers in wind parks. A major problem has been in providing the capability to quickly combine and examine incoming data from multiple PCM sources in the field. To solve this problem, SERI has developed a low-cost PC-based PCM Telemetry Data-Reduction System (PC-PCM System) to facilitate quick, in-the-field multiple-channel data analysis. The PC-PCM System consists of two basic components. First, PC-compatible hardware boards are used to decode and combine multiple PCM data streams. Up to four hardware boards can be installed in a single PC, which provides the capability to combine data from four PCM streams directly to PC disk or memory. Each stream can have up to 62 data channels. Second, a software package written for use under DOS was developed to simplify data-acquisition control and management. The software, called the Quick-Look Data Management Program, provides a quick, easy-to-use interface between the PC and multiple PCM data streams. The Quick-Look Data Management Program is a comprehensive menu-driven package used to organize, acquire, process, and display information from incoming PCM data streams. The paper describes both hardware and software aspects of the SERI PC-PCM system, concentrating on features that make it useful in an experiment test environment to quickly examine and verify incoming data from multiple PCM streams. Also discussed are problems and techniques associated with PC-based telemetry data-acquisition, processing, and real-time display.
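
The quick-look combination of multiple PCM streams amounts to merging time-stamped samples from each decoded stream into one chronological sequence. A minimal sketch, with a hypothetical sample format (the real hardware delivers raw PCM frames, not tuples):

```python
import heapq

def merge_streams(streams):
    """Merge time-stamped samples from several PCM streams into one
    chronological sequence. Each stream is a list of
    (timestamp, stream_id, channel, value) tuples, already sorted
    by timestamp within the stream."""
    return list(heapq.merge(*streams, key=lambda s: s[0]))

a = [(0.00, "A", 1, 5.1), (0.02, "A", 1, 5.3)]
b = [(0.01, "B", 7, 0.9), (0.03, "B", 7, 1.1)]
merged = merge_streams([a, b])
print([s[0] for s in merged])
```

Because each stream is already time-ordered, the heap-based merge runs in a single pass, which matches the in-the-field requirement to examine incoming data as it arrives.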

  2. Analyzing Feedback Control Systems

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.

    1987-01-01

    Interactive controls analysis (INCA) program developed to provide user-friendly environment for design and analysis of linear control systems, primarily feedback control. Designed for use with both small- and large-order systems. Using interactive-graphics capability, INCA user quickly plots root locus, frequency response, or time response of either continuous-time system or sampled-data system. Configuration and parameters easily changed, allowing user to design compensation networks and perform sensitivity analyses in very convenient manner. Written in Pascal and FORTRAN.
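
The kind of time-response computation INCA performs can be illustrated with a proportional controller around a first-order plant in unity feedback. This is a generic sketch of a closed-loop step response, not INCA's algorithm, and the gains are arbitrary:

```python
def closed_loop_step(kp=2.0, a=1.0, dt=0.01, steps=1000):
    """Unity-feedback loop around the first-order plant dy/dt = -a*y + u
    with proportional control u = kp*(r - y), integrated by Euler steps.
    Returns the output y after the simulated interval."""
    r, y = 1.0, 0.0  # unit step reference, zero initial condition
    for _ in range(steps):
        u = kp * (r - y)           # proportional controller
        y += dt * (-a * y + u)     # first-order plant dynamics
    return y

# The steady-state output approaches kp / (a + kp) = 2/3 for these gains,
# showing the steady-state error a pure proportional loop leaves behind.
print(round(closed_loop_step(), 3))
```

Changing `kp` and re-running is the scripted analogue of the interactive parameter changes and sensitivity analyses described above.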

  3. Component-Level Electronic-Assembly Repair (CLEAR) Spacecraft Circuit Diagnostics by Analog and Complex Signature Analysis

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Wade, Raymond P.; Izadnegahdar, Alain

    2011-01-01

    The Component-Level Electronic-Assembly Repair (CLEAR) project at the NASA Glenn Research Center is aimed at developing technologies that will enable space-flight crews to perform in situ component-level repair of electronics on Moon and Mars outposts, where there is no existing infrastructure for logistics spares. These technologies must provide effective repair capabilities yet meet the payload and operational constraints of space facilities. Effective repair depends on a diagnostic capability that is versatile but easy to use by crew members that have limited training in electronics. CLEAR studied two techniques that involve extensive precharacterization of "known good" circuits to produce graphical signatures that provide an easy-to-use comparison method to quickly identify faulty components. Analog Signature Analysis (ASA) allows relatively rapid diagnostics of complex electronics by technicians with limited experience. Because of frequency limits and the growing dependence on broadband technologies, ASA must be augmented with other capabilities. To meet this challenge while preserving ease of use, CLEAR proposed an alternative called Complex Signature Analysis (CSA). Tests of ASA and CSA were used to compare capabilities and to determine if the techniques provided an overlapping or complementary capability. The results showed that the methods are complementary.

  4. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    DOE PAGES

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; ...

    2015-01-01

     Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  5. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    PubMed Central

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2014-01-01

     Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205

  6. Micromechanics Analysis Code Post-Processing (MACPOST) User Guide. 1.0

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Comiskey, Michele D.; Bednarcyk, Brett A.

    1999-01-01

     As advanced composite materials have gained wider usage, the need for analytical models and computer codes to predict the thermomechanical deformation response of these materials has increased significantly. Recently, a micromechanics technique called the generalized method of cells (GMC) has been developed, which has the capability to fulfill this goal. To provide a framework for GMC, the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) has been developed. As MAC/GMC has been updated, significant improvements have been made to the post-processing capabilities of the code. Through the MACPOST program, which operates directly within the MSC/PATRAN graphical pre- and post-processing package, a direct link between the analysis capabilities of MAC/GMC and the post-processing capabilities of MSC/PATRAN has been established. MACPOST has simplified the production, printing, and exportation of results for unit cells analyzed by MAC/GMC. MACPOST allows different micro-level quantities to be plotted quickly and easily in contour plots. In addition, meaningful data for X-Y plots can be examined. MACPOST thus serves as an important analysis and visualization tool for the macro- and micro-level data generated by MAC/GMC. This report serves as the user's manual for the MACPOST program.

  7. Reusable Launch Vehicle (RLV) Market Analysis Model

    NASA Technical Reports Server (NTRS)

    Prince, Frank A.

    1999-01-01

     The RLV Market Analysis model is at best a rough-order approximation of actual market behavior. However, it does give a quick indication of whether the flights exist to enable an economically viable RLV, and of the assumptions necessary for the vehicle to capture those flights. Additional analysis, market research, and updating with the latest information on payloads and launches would improve the model. Plans are to update the model as new information becomes available and new requirements are levied. This tool will continue to be a vital part of NASA's RLV business analysis capability for the foreseeable future.

  8. Design of a Lunar Quick-Attach Mechanism to Hummer Vehicle Mounting Interface

    NASA Technical Reports Server (NTRS)

    Grismore, David A.

    2010-01-01

     This report presents my work experiences as an intern with NASA (National Aeronautics and Space Administration) in the spring of 2010 at the Kennedy Space Center (KSC) launch facility in Cape Canaveral, Florida, as a member of the NASA USRP (Undergraduate Student Research Program). I worked in the Surface Systems (NE-S) group during the internship. Within NE-S, two ASRC (Arctic Slope Regional Corporation) contract engineers, A.J. Nick and Jason Schuler, had developed a "Quick-Attach" mechanism for the Chariot Rover, the next-generation lunar rover. My project was to design, analyze, and possibly fabricate a mounting interface between their "Quick-Attach" and a Hummer vehicle. This interface was needed because it would increase their capability to test the Quick-Attach and its various attachments, as they do not have access to a Chariot Rover at KSC. I utilized both Pro Engineer, a 3D CAD software package, and a Coordinate Measuring Machine (CMM) known as a FAROarm to collect data and create my design. I relied on hand calculations and the Mechanica analysis tool within Pro Engineer to perform stress analysis on the design. After finishing the design, I began working on creating professional-level CAD drawings and issuing them into the KSC design database known as DDMS before the end of the internship.

  9. Rapid Acquisition of Army Command and Control Systems

    DTIC Science & Technology

    2014-01-01

    Research and Engineering (Plans and Programs). 63 Glenn Fogg , “How to Better Support the Need for Quick Reaction...Pocket,” Army Communicator, Summer 2005. Fogg , Glenn, “How to Better Support the Need for Quick Reaction Capabilities in an Irregular Warfare

  10. CAA Annual Report, Fiscal Year 1992.

    DTIC Science & Technology

    1992-11-01

     But, in today's era of rapid change, there has been a burgeoning demand for quick reaction analyses. Today, CAA increasingly applies the results of...years. The graph on the right illustrates the reorientation of CAA analytical focus to meet increasing sponsor demands for QRA and the apparent...meeting the most important analysis needs of the Army while maintaining quality and preparing CAA capabilities to meet demands that will be presented

  11. Prosthetic Hand For Holding Rods, Tools, And Handles

    NASA Technical Reports Server (NTRS)

    Belcher, Jewell G., Jr.; Vest, Thomas W.

    1995-01-01

    Prosthetic hand with quick-grip/quick-release lever broadens range of specialized functions available to lower-arm amputee by providing improved capabilities for gripping rods, tools, handles, and like. Includes two stationary lower fingers opposed by one pivoting upper finger. Lever operates in conjunction with attached bracket.

  12. Design of weak link channel-cut crystals for fast QEXAFS monochromators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polheim, O. von, E-mail: vonpolheim@uni-wuppertal.de; Müller, O.; Lützenkirchen-Hecht, D.

    2016-07-27

    A weak link channel-cut crystal, optimized for dedicated Quick EXAFS monochromators and measurements, was designed using finite element analysis. This channel-cut crystal offers precise detuning capabilities to enable suppression of higher harmonics in the virtually monochromatic beam. It was optimized to keep the detuning stable, withstanding the mechanical load that occurs during oscillations at up to 50 Hz. First tests at DELTA (Dortmund, Germany) proved the design.

  13. Military Operations Research. Winter 1996. Volume 1, Number 4

    DTIC Science & Technology

    1996-01-01

    ...analysts for many years. It is designed to provide a quick reference for models that represent the effects of a conventional attack against ground...satellites offer this capability. This poses the additional challenge as to how many highways one can "see" per unit time. He did, however, design a

  14. Enabling Rapid and Robust Structural Analysis During Conceptual Design

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.; Padula, Sharon L.; Li, Wu

    2015-01-01

    This paper describes a multi-year effort to add a structural analysis subprocess to a supersonic aircraft conceptual design process. The desired capabilities include parametric geometry, automatic finite element mesh generation, static and aeroelastic analysis, and structural sizing. The paper discusses implementation details of the new subprocess, captures lessons learned, and suggests future improvements. The subprocess quickly compares concepts and robustly handles large changes in wing or fuselage geometry. The subprocess can rank concepts with regard to their structural feasibility and can identify promising regions of the design space. The automated structural analysis subprocess is deemed robust and rapid enough to be included in multidisciplinary conceptual design and optimization studies.

  15. Automated document analysis system

    NASA Astrophysics Data System (ADS)

    Black, Jeffrey D.; Dietzel, Robert; Hartnett, David

    2002-08-01

    A software application has been developed to aid law enforcement and government intelligence gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and processed with optical character recognition (OCR). Documents obtained in an electronic format bypass the OCR and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If the document is not in English, it is machine translated to English. The documents are searched for keywords and key features in either the native language or translated English. The user can quickly review the document to determine if it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically on a single document or a batch of documents.

  16. ARDS User Manual

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    2001-01-01

    Personal computers (PCs) are now used extensively for engineering analysis. Their capability exceeds that of mainframe computers of only a few years ago. Programs originally written for mainframes have been ported to PCs to make their use easier. One of these programs is ARDS (Analysis of Rotor Dynamic Systems), which was developed at Arizona State University (ASU) by Nelson et al. to quickly and accurately analyze rotor steady-state and transient response using the method of component mode synthesis. The original ARDS program was ported to the PC in 1995. Several extensions were made at ASU to increase the capability of mainframe ARDS. These extensions have also been incorporated into the PC version of ARDS. Each mainframe extension had its own user manual, generally covering only that extension; thus, exploiting the full capability of ARDS required a large set of user manuals. Moreover, necessary changes and enhancements for PC ARDS were undocumented. The present document is intended to remedy those problems by combining all pertinent information needed for the use of PC ARDS into one volume.

  17. New Tools for New Missions - Unmanned Aircraft Systems Offer Exciting Capabilities

    NASA Astrophysics Data System (ADS)

    Bland, G.; Miles, T.; Pieri, D. C.; Coronado, P. L.; Fladeland, M. M.; Diaz, J. A.; Cione, J.; Maslanik, J. A.; Roman, M. O.; de Boer, G.; Argrow, B. M.; Novara, J.; Stachura, M.; Neal, D.; Moisan, J. R.

    2015-12-01

    There are numerous emerging possibilities for utilizing unmanned aircraft systems (UAS) to investigate a variety of natural hazards, both for prediction and analysis of specific events. Additionally, quick response capabilities will provide affordable, low risk support for emergency management teams. NASA's partnerships with commercial, university and other government agency teams are bringing new capabilities to research and emergency management communities. New technology platforms and instrument systems are gaining momentum for stand-off remote sensing observations, as well as penetration and detailed in-situ examination of natural and anthropogenic phenomena. Several pioneering investigations have provided the foundation for this development, including NASA projects with Aerosonde, Dragon Eye, and SIERRA platforms. With miniaturized instrument and platform technologies, these experiments demonstrated that previously unobtainable observations may significantly aid in the understanding, prediction, and assessment of natural hazards such as storms, volcanic eruptions, floods, and the potential impact of environmental changes. Remote sensing observations of storms and fires have also been successfully demonstrated through NASA's efforts with larger UAS such as the Global Hawk and Ikhana platforms. The future may unfold with new high altitude and/or long endurance capabilities, in some cases at reduced size and cost as payload capacity requirements shrink through further miniaturization, and alternatively with expanded instrumentation and mission profiles. Several new platforms and instrument development projects are underway that will enable affordable, quick response observations. Additionally, distributed measurements that will provide near-simultaneous coverage at multiple locations will be possible - an exciting new mission concept that will greatly aid many observation scenarios.
Partnerships with industry, academia, and other government agencies are all making significant contributions to these new capabilities.

  18. Open Source Next Generation Visualization Software for Interplanetary Missions

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  19. Use of the NetBeans Platform for NASA Robotic Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Sabey, Nickolas J.

    2014-01-01

    The latest Java and JavaFX technologies are very attractive software platforms for customers involved in space mission operations such as those of NASA and the US Air Force. For NASA Robotic Conjunction Assessment Risk Analysis (CARA), the NetBeans platform provided an environment in which scalable software solutions could be developed quickly and efficiently. Both Java 8 and the NetBeans platform are in the process of simplifying CARA development in secure environments by providing a significant amount of capability in a single accredited package, where accreditation alone can account for 6-8 months for each library or software application. Capabilities either in use or being investigated by CARA include: 2D and 3D displays with JavaFX, parallelization with the new Streams API, and scalability through the NetBeans plugin architecture.

  20. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.

  1. Aviation System Analysis Capability Quick Response System Report Server User’s Guide.

    DTIC Science & Technology

    1996-10-01

    primary data sources for the QRS Report Server are the following: ♦ United States Department of Transportation airline service quality performance...and to cross-reference sections of this document. is used to indicate quoted text messages from WWW pages. is used for WWW page and section titles...would link the user to another document or another section of the same document. ALL CAPS is used to indicate Report Server variables for which the

  2. Comparative evaluation of NeQuick and IRI models over Polar Regions

    NASA Astrophysics Data System (ADS)

    Pietrella, Marco; Nava, Bruno; Pezzopane, Michael; Migoya-Orue, Yenca; Scotto, Carlo

    2016-04-01

    In the framework of the AUSPICIO (AUtomatic Scaling of Polar Ionograms and Cooperative Ionospheric Observations) project, the ionograms recorded at Hobart (middle latitude), Macquarie Island, Livingstone Island and Comandante Ferraz (middle-high latitude) and those recorded at the ionospheric observatories of Casey, Mawson, Davis, and Scott Base (Antarctic Polar Circle) have been taken into account to study the capability of the NeQuick-2 and IRI-2012 models in predicting the behavior of the ionosphere, mainly in the polar region. In particular, the applicability of the NeQuick-2 and IRI-2012 models was evaluated under two different modes: a) as assimilative models ingesting the foF2 and hmF2 measurements obtained from the electron density profiles provided by the Adaptive Ionospheric Profiler (AIP); b) as climatological models taking as input the F10.7 solar activity index. The results obtained from the large number of comparisons made for each ionospheric observatory, with the NeQuick-2 and IRI-2012 models working in the two modes mentioned above, reveal that the best description of the ionospheric electron density at the polar regions is provided when peak parameter data are ingested in near-real-time into the NeQuick-2 and IRI-2012 models, which are not always able to efficiently represent the behavior of the ionosphere over the polar regions when operating in long-term prediction mode. The statistical analysis results, expressed in terms of root mean square errors (r.m.s.e.) for each ionospheric observatory, show that outside the Antarctic Polar Circle (APC) the NeQuick-2 performance is better than the IRI-2012 performance; on the contrary, inside the APC the IRI-2012 model performs better than NeQuick-2.
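
    The r.m.s.e. comparison described above can be sketched in a few lines. The foF2 values below are hypothetical, purely illustrative numbers, not data from the study; the sketch only shows how a per-station r.m.s.e. ranks two models against ionosonde measurements.

```python
import math

def rmse(observed, modeled):
    """Root mean square error between measured and model-predicted values."""
    assert len(observed) == len(modeled)
    return math.sqrt(sum((o - m) ** 2 for o, m in zip(observed, modeled)) / len(observed))

# Hypothetical foF2 values (MHz) at one station: ionosonde vs. two model outputs.
fof2_measured = [5.2, 6.1, 7.4, 6.8, 5.9]
fof2_model_a  = [5.0, 6.3, 7.1, 6.6, 6.2]
fof2_model_b  = [4.7, 6.6, 6.8, 6.2, 6.5]

# The model with the lower r.m.s.e. describes this station better.
print(rmse(fof2_measured, fof2_model_a))
print(rmse(fof2_measured, fof2_model_b))
```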

  3. Evaluation of direct analysis in real time for the determination of highly polar pesticides in lettuce and celery using modified Quick Polar Pesticides Extraction method.

    PubMed

    Lara, Francisco J; Chan, Danny; Dickinson, Michael; Lloyd, Antony S; Adams, Stuart J

    2017-05-05

    Direct analysis in real time (DART) was evaluated for the determination of a number of highly polar pesticides using the Quick Polar Pesticides Extraction (QuPPe) method. DART was hyphenated to high resolution mass spectrometry (HRMS) in order to get the required selectivity that allows the determination of these compounds in complex samples such as lettuce and celery. Experimental parameters such as desorption temperature, scanning speed, and distances between the DART ion source and MS inlet were optimized. Two different mass analyzers (Orbitrap and QTOF) and two accessories for sample introduction (Dip-it® tips and QuickStrip™ sample cards) were evaluated. An extra clean-up step using primary-secondary amine (PSA) was included in the QuPPe method to improve sensitivity. The main limitation found was in-source fragmentation; nevertheless, QuPPe-DART-HRMS proved to be a fast and reliable tool with quantitative capabilities for at least seven compounds: amitrole, cyromazine, propamocarb, melamine, diethanolamine, triethanolamine and 1,2,4-triazole. The limits of detection ranged from 20 to 60 μg/kg. Recoveries for fortified samples ranged from 71 to 115%, with relative standard deviations <18%. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Dynamic analysis of gas-core reactor system

    NASA Technical Reports Server (NTRS)

    Turner, K. H., Jr.

    1973-01-01

    A heat transfer analysis was incorporated into a previously developed model CODYN to obtain a model of open-cycle gaseous core reactor dynamics which can predict the heat flux at the cavity wall. The resulting model was used to study the sensitivity of the model to the value of the reactivity coefficients and to determine the system response for twenty specified perturbations. In addition, the model was used to study the effectiveness of several control systems in controlling the reactor. It was concluded that control drums located in the moderator region capable of inserting reactivity quickly provided the best control.

  5. Remote Sensing Operational Capabilities

    DTIC Science & Technology

    1999-10-01

    systems. In each of the cases orbital and sensor characteristics were modeled, as was the possible impact of weather over target areas. In each of the...collect the desired information quickly, it is imperative that the satellite be capable of accessing the target area frequently. • Flexibility and...be capable of accessing the target area frequently. • Flexibility and speed in tasking: the system should be capable of collecting data with a

  7. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial and error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  8. Quick connect fastener

    NASA Technical Reports Server (NTRS)

    Weddendorf, Bruce (Inventor)

    1994-01-01

    A quick connect fastener and method of use are presented. The quick connect fastener is suitable for replacing available bolts and screws and can be installed by simply pushing a threaded portion of the connector into a member receptacle hole. The apparatus comprises an externally threaded fastener having a threaded portion slidably mounted upon a stud or bolt shaft, wherein the externally threaded fastener portion is expandable by a preloaded spring member. Upon contact with the member receptacle hole, the fastener presents cylindrical threads of a reduced diameter for insertion; once inserted into the receiving threads of the receptacle member hole, the threads expand to engage the receptacle hole threads, forming a quick connection between the fastener and the member to be fastened. The quick connect fastener can be further secured by rotation after insertion, even to the point of locking engagement, and can be disengaged only by reverse rotation of the mated thread engagement.

  9. Aqua-Aura QuickDAM (QDAM) 2.0 Ops Concept

    NASA Technical Reports Server (NTRS)

    Nidhiry, John

    2015-01-01

    The presentation describes the Quick Debris Avoidance Maneuver (QDAM) 2.0 process used by the Aqua and Aura flight teams to (a) reduce the work load and dependency on staff and systems; (b) reduce turn-around time and provide emergency last minute capabilities; and (c) increase burn parameter flexibility. The presentation also compares the QDAM 2.0 process to previous approaches.

  10. 46 CFR 45.195 - Additional equipment requirements for the Muskegon route.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the vessel must be capable of using the communication systems. (b) Cutting gear. Equipment that can quickly cut the towline at the towing vessel. The cutting gear must be in operable condition and... aboard the vessel must be capable of using the cutting gear. ...

  11. Smart Wire Grid: Resisting Expectations

    ScienceCinema

    Ramsay, Stewart; Lowe, DeJim

    2018-05-30

    Smart Wire Grid's DSR technology (Discrete Series Reactor) can be quickly deployed on electrical transmission lines to create intelligent mesh networks capable of quickly rerouting electricity to get power where and when it's needed the most. With their recent ARPA-E funding, Smart Wire Grid has been able to move from prototype and field testing to building out a US manufacturing operation in just under a year.

  12. Virtual endoscopy using spherical QuickTime-VR panorama views.

    PubMed

    Tiede, Ulf; von Sternberg-Gospos, Norman; Steiner, Paul; Höhne, Karl Heinz

    2002-01-01

    Virtual endoscopy needs some precomputation of the data (segmentation, path finding) before the diagnostic process can take place. We propose a method that precomputes multinode spherical panorama movies using QuickTime VR. This technique allows almost the same navigation and visualization capabilities as a real endoscopic procedure, achieves a significant reduction of interaction input, and yields a movie that represents a document of the procedure.

  13. Portable laser-induced breakdown spectroscopy/diffuse reflectance hybrid spectrometer for analysis of inorganic pigments

    NASA Astrophysics Data System (ADS)

    Siozos, Panagiotis; Philippidis, Aggelos; Anglos, Demetrios

    2017-11-01

    A novel, portable spectrometer, combining two analytical techniques, laser-induced breakdown spectroscopy (LIBS) and diffuse reflectance spectroscopy, was developed with the aim to provide an enhanced instrumental and methodological approach with regard to the analysis of pigments in objects of cultural heritage. Technical details about the hybrid spectrometer and its operation are presented and examples are given relevant to the analysis of paint materials. Both LIBS and diffuse reflectance spectra in the visible and part of the near infrared, corresponding to several neat mineral pigment samples, were recorded and the complementary information was used to effectively distinguish different types of pigments even if they had similar colour or elemental composition. The spectrometer was also employed in the analysis of different paints on the surface of an ancient pottery sherd demonstrating the capabilities of the proposed hybrid diagnostic approach. Despite its instrumental simplicity and compact size, the spectrometer is capable of supporting analytical campaigns relevant to archaeological, historical or art historical investigations, particularly when quick data acquisition is required in the context of surveys of large numbers of objects and samples.

  14. Experiences with Text Mining Large Collections of Unstructured Systems Development Artifacts at JPL

    NASA Technical Reports Server (NTRS)

    Port, Dan; Nikora, Allen; Hihn, Jairus; Huang, LiGuo

    2011-01-01

    Often repositories of systems engineering artifacts at NASA's Jet Propulsion Laboratory (JPL) are so large and poorly structured that they have outgrown our capability to effectively manually process their contents to extract useful information. Sophisticated text mining methods and tools seem a quick, low-effort approach to automating our limited manual efforts. Our experiences of exploring such methods, mainly in three areas at JPL including historical risk analysis, defect identification based on requirements analysis, and over-time analysis of system anomalies, have shown that obtaining useful results requires substantial unanticipated efforts - from preprocessing the data to transforming the output for practical applications. We have not observed any quick 'wins' or realized benefit from short-term effort avoidance through automation in this area. Surprisingly, we have realized a number of unexpected long-term benefits from the process of applying text mining to our repositories. This paper elaborates some of these benefits and our important lessons learned from the process of preparing and applying text mining to large unstructured system artifacts at JPL, aiming to benefit future text mining applications in similar problem domains and in the hope of extending them to broader areas of application.

  15. Quick clay and landslides of clayey soils.

    PubMed

    Khaldoun, Asmae; Moller, Peder; Fall, Abdoulaye; Wegdam, Gerard; De Leeuw, Bert; Méheust, Yves; Otto Fossum, Jon; Bonn, Daniel

    2009-10-30

    We study the rheology of quick clay, an unstable soil responsible for many landslides. We show that above a critical stress the material starts flowing abruptly with a very large viscosity decrease caused by the flow. This leads to avalanche behavior that accounts for the instability of quick clay soils. Reproducing landslides on a small scale in the laboratory shows that an additional factor that determines the violence of the slides is the inhomogeneity of the flow. We propose a simple yield stress model capable of reproducing the laboratory landslide data, allowing us to relate landslides to the measured rheology.

  16. Quick Clay and Landslides of Clayey Soils

    NASA Astrophysics Data System (ADS)

    Khaldoun, Asmae; Moller, Peder; Fall, Abdoulaye; Wegdam, Gerard; de Leeuw, Bert; Méheust, Yves; Otto Fossum, Jon; Bonn, Daniel

    2009-10-01

    We study the rheology of quick clay, an unstable soil responsible for many landslides. We show that above a critical stress the material starts flowing abruptly with a very large viscosity decrease caused by the flow. This leads to avalanche behavior that accounts for the instability of quick clay soils. Reproducing landslides on a small scale in the laboratory shows that an additional factor that determines the violence of the slides is the inhomogeneity of the flow. We propose a simple yield stress model capable of reproducing the laboratory landslide data, allowing us to relate landslides to the measured rheology.
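
    The avalanche behavior described above can be illustrated with a minimal Bingham-type yield stress model. The paper's actual model and parameters are not reproduced here; the function names and numbers below are illustrative assumptions only, chosen to show the abrupt viscosity drop once the critical stress is exceeded.

```python
def shear_rate(stress, yield_stress=1.0, plastic_viscosity=0.1):
    """Bingham-type model: no flow at or below the yield stress;
    above it, the shear rate grows with the excess stress."""
    if stress <= yield_stress:
        return 0.0
    return (stress - yield_stress) / plastic_viscosity

def apparent_viscosity(stress, **kw):
    """Stress divided by shear rate: infinite below the yield stress,
    then dropping sharply once flow starts (avalanche-like behavior)."""
    rate = shear_rate(stress, **kw)
    return float("inf") if rate == 0.0 else stress / rate

# The apparent viscosity collapses as stress rises past the yield stress.
for s in (0.5, 1.05, 1.5, 3.0):
    print(s, apparent_viscosity(s))
```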

  17. [A quick methodology for drug intelligence using profiling of illicit heroin samples].

    PubMed

    Zhang, Jianxin; Chen, Cunyi

    2012-07-01

    The aim of the paper was to evaluate the link between two heroin seizures using a descriptive method. The system involved the derivatization and gas chromatographic separation of samples, followed by fully automatic data analysis and transfer to a database. Comparisons used the square cosine function between two chromatograms treated as vectors. The method showed good discriminatory capabilities, and the probability of false positives was extremely low. In conclusion, this method proved to be efficient and reliable, and appears suitable for estimating the links between illicit heroin samples.
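
    The square cosine comparison named above treats each chromatogram as a vector of peak areas and scores the squared cosine of the angle between two such vectors. A minimal sketch follows; the peak-area numbers are hypothetical and purely illustrative, not data from the study.

```python
import math

def square_cosine(x, y):
    """Squared cosine of the angle between two chromatogram vectors.
    Values near 1.0 suggest matching profiles (a probable link between
    seizures); values near 0 suggest unrelated profiles."""
    dot = sum(a * b for a, b in zip(x, y))
    norm = math.sqrt(sum(a * a for a in x)) * math.sqrt(sum(b * b for b in y))
    return (dot / norm) ** 2

# Hypothetical normalized peak areas for target compounds in three samples.
sample_a = [0.42, 0.18, 0.25, 0.09, 0.06]
sample_b = [0.40, 0.20, 0.24, 0.10, 0.06]
sample_c = [0.05, 0.55, 0.10, 0.25, 0.05]

print(square_cosine(sample_a, sample_b))  # near 1: probable link
print(square_cosine(sample_a, sample_c))  # much lower: likely unlinked
```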

  18. A DNA sequence analysis package for the IBM personal computer.

    PubMed Central

    Lagrimini, L M; Brentano, S T; Donelson, J E

    1984-01-01

    We present here a collection of DNA sequence analysis programs, called "PC Sequence" (PCS), which are designed to run on the IBM Personal Computer (PC). These programs are written in IBM PC compiled BASIC and take full advantage of the IBM PC's speed, error handling, and graphics capabilities. For a modest initial expense in hardware any laboratory can use these programs to quickly perform computer analysis on DNA sequences. They are written with the novice user in mind and require very little training or previous experience with computers. Also provided are a text editing program for creating and modifying DNA sequence files and a communications program which enables the PC to communicate with and collect information from mainframe computers and DNA sequence databases. PMID:6546433
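
    As an illustration of the kind of quick analysis such a package performs on a DNA sequence file, the sketch below computes two basic statistics. The original PCS programs were written in compiled BASIC for the IBM PC; these Python functions are illustrative stand-ins, not part of PCS.

```python
def gc_content(seq):
    """Fraction of G and C bases, a basic composition statistic."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Base-pairing table for computing the complementary strand.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    """Sequence of the complementary strand, read 5' to 3'."""
    return seq.upper().translate(COMPLEMENT)[::-1]

print(gc_content("ATGCGC"))
print(reverse_complement("ATGCGC"))
```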

  19. Preserving The UK-US Special Relationship: A Tactically Capable And Interoperable Royal Air Force In 2036

    DTIC Science & Technology

    2016-06-01

    discussed. Finally, the paper provides a brief survey of doctrinal deficiencies, highlights the importance of enhancing distributed synthetic training...adversaries, such as China, have increasingly modern and capable military capabilities that, in the event of a conflict, will...is that Blue Forces, despite superb situational awareness, will quickly run short of fuel and missiles against a large-force aggressor. Having

  20. ISPAN (Interactive Stiffened Panel Analysis): A tool for quick concept evaluation and design trade studies

    NASA Technical Reports Server (NTRS)

    Hairr, John W.; Dorris, William J.; Ingram, J. Edward; Shah, Bharat M.

    1993-01-01

    Interactive Stiffened Panel Analysis (ISPAN) modules, written in FORTRAN, were developed to provide an easy-to-use tool for creating finite element models of composite material stiffened panels. The modules allow the user to interactively construct, solve and post-process finite element models of four general types of structural panel configurations using only the panel dimensions and properties as input data. Linear, buckling and post-buckling solution capability is provided. This interactive input allows rapid model generation and solution by non-finite-element users. The results of a parametric study of a blade stiffened panel are presented to demonstrate the usefulness of the ISPAN modules. Also, a non-linear analysis of a test panel was conducted and the results compared to measured data and previous correlation analysis.

  1. Intelligent hand-portable proliferation sensing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dieckman, S.L.; Bostrom, G.A.; Waterfield, L.G.

    1997-08-01

    Argonne National Laboratory, with support from DOE's Office of Nonproliferation and National Security, is currently developing an intelligent hand-portable sensor system. This system is designed specifically to support the intelligence community with the task of in-field sensing of nuclear proliferation and related activities. Based upon pulsed laser photo-ionization time-of-flight mass spectrometry technology, this novel sensing system is capable of quickly providing a molecular or atomic analysis of specimens. The system is capable of analyzing virtually any gas phase molecule, or molecule that can be induced into the gas phase by (for example) sample heating. This system has the unique advantages of providing unprecedented portability, excellent sensitivity, tremendous fieldability, and a high performance/cost ratio. The system will be capable of operating in a highly automated manner for on-site inspections, and easily modified for other applications such as perimeter monitoring aboard a plane or drone. The paper describes the sensing system.

  2. Development of an ultrasonic linear motor with ultra-positioning capability and four driving feet.

    PubMed

    Zhu, Cong; Chu, Xiangcheng; Yuan, Songmei; Zhong, Zuojin; Zhao, Yanqiang; Gao, Shuning

    2016-12-01

    This paper presents a novel linear piezoelectric motor which is suitable for rapid ultra-precision positioning. The finite element analysis (FEA) was applied for optimal design and further analysis, then experiments were conducted to investigate its performance. By changing the input signal, the proposed motor was found capable of working in the fast driving mode as well as in the precision positioning mode. When working in the fast driving mode, the motor acts as an ultrasonic motor with a maximum no-load speed of up to 181.2 mm/s and a maximum thrust of 1.7 N at 200 Vp-p. When working in precision positioning mode, the motor can be regarded as a flexible hinge piezoelectric actuator with arbitrary motion in the range of 8 μm. The measurable minimum output displacement was found to be 0.08 μm, though theoretically it can be even smaller. More importantly, the motor can be quickly and accurately positioned over a large stroke. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. A low-cost PC-based telemetry data-reduction system

    NASA Astrophysics Data System (ADS)

    Simms, D. A.; Butterfield, C. P.

    1990-04-01

    The Solar Energy Research Institute's (SERI) Wind Research Branch is using Pulse Code Modulation (PCM) telemetry data-acquisition systems to study horizontal-axis wind turbines. PCM telemetry systems are used in test installations that require accurate multiple-channel measurements taken from a variety of different locations, and SERI has found them ideal for tests requiring concurrent acquisition of many channels. To facilitate quick, in-the-field multiple-channel data analysis, SERI developed a low-cost PC-based data-reduction system. Called the PC-PCM System, it consists of two basic components. First, AT-compatible hardware boards are used for decoding and combining PCM data streams. Up to four hardware boards can be installed in a single PC, which provides the capability to combine data from four PCM streams directly to PC disk or memory. Each stream can have up to 62 data channels. Second, a software package written for the DOS operating system was developed to simplify data-acquisition control and management. The software provides a quick, easy-to-use interface between the PC and the PCM data streams. Called the Quick-Look Data Management Program, it is a comprehensive menu-driven package used to organize, acquire, process, and display information from incoming PCM data streams. This paper describes both the hardware and software aspects of the SERI PC-PCM System, concentrating on features that make it useful in an experimental test environment for quickly examining and verifying incoming data. Also discussed are problems and techniques associated with PC-based telemetry data acquisition, processing, and real-time display.

  4. Carcinogen File: The Ames Test.

    ERIC Educational Resources Information Center

    Kendall, Jim; Kriebel, David

    1979-01-01

    This test measures the capability of a chemical substance to cause mutations in special strains of the bacterium Salmonella. It is quick, taking only forty-eight hours, inexpensive, and reliable. (BB)

  5. Simulating Daily and Sub-daily Water Flow in Large, Semi-arid Watershed Using SWAT: A Case Study of Nueces River Basin, Texas

    NASA Astrophysics Data System (ADS)

    Bassam, S.; Ren, J.

    2015-12-01

    Runoff generated during heavy rainfall imposes quick, and often intense, changes in streamflow, which increase the chance of flash floods in the vicinity of the streams. Understanding the temporal response of streams to heavy rainfall requires a hydrological model that considers the meteorological, hydrological, and geological components of the streams and their watersheds. SWAT is a physically based, semi-distributed model that is capable of simulating water flow within watersheds at both long-term (annual and monthly) and short-term (daily and sub-daily) time scales. However, the capability of SWAT for sub-daily water flow modeling within large watersheds has not been studied much compared to the long-term and daily time scales. In this study, we investigate water flow at daily and sub-daily time scales in a large, semi-arid watershed, the Nueces River Basin (NRB), with a drainage area of 16,950 mi² located in South Texas. The objectives of this study are: (1) simulating the response of streams to heavy, and often quick, rainfall; (2) evaluating SWAT performance in sub-daily modeling of water flow within a large watershed; and (3) examining means for model performance improvement during model calibration and verification based on results of sensitivity and uncertainty analysis. The results of this study can provide important information for water resources planning during flood seasons.

  6. GADRAS-DRF 18.6 User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, Steve M.; Thoreson, Greg G.; Theisen, Lisa A.

    2016-05-01

    The Gamma Detector Response and Analysis Software–Detector Response Function (GADRAS-DRF) application computes the response of gamma-ray and neutron detectors to incoming radiation. This manual provides step-by-step procedures to acquaint new users with the use of the application. The capabilities include characterization of detector response parameters, plotting and viewing measured and computed spectra, analyzing spectra to identify isotopes, and estimating source energy distributions from measured spectra. GADRAS-DRF can compute and provide detector responses quickly and accurately, giving users the ability to obtain usable results in a timely manner (a matter of seconds or minutes).

  7. Utility of a scanning densitometer in analyzing remotely sensed imagery

    NASA Technical Reports Server (NTRS)

    Dooley, J. T.

    1976-01-01

    The utility of a scanning densitometer for analyzing imagery in the NASA Lewis Research Center's regional remote sensing program was evaluated. Uses studied include: (1) quick-look screening of imagery by means of density slicing, magnification, color coding, and edge enhancement; (2) preliminary category classification of both low- and high-resolution data bases; and (3) quantitative measurement of the extent of features within selected areas. The densitometer was capable of providing fast, convenient, and relatively inexpensive preliminary analysis of aerial and satellite photography and scanner imagery involving land cover, water quality, strip mining, and energy conservation.

  8. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2011-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM, implemented on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
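
    The abstract does not spell out TRAM's algorithm. As a rough, hypothetical illustration of the kind of sorting such a tool performs, the sketch below ranks dispersed input variables by how strongly their distributions differ between failed and passing Monte Carlo runs, using a simple standardized mean difference; the variable names and data are invented for the example.

    ```python
    from statistics import mean, pstdev

    def rank_driving_variables(runs, failed):
        """Rank dispersed input variables by how strongly their values
        separate failed runs from passing runs (largest first).

        runs   -- list of dicts: variable name -> dispersed value
        failed -- parallel list of bools (True = run violated a requirement)
        """
        scores = {}
        for name in runs[0]:
            fails = [r[name] for r, f in zip(runs, failed) if f]
            passes = [r[name] for r, f in zip(runs, failed) if not f]
            if not fails or not passes:
                continue  # no basis for comparison
            spread = pstdev([r[name] for r in runs]) or 1.0
            scores[name] = abs(mean(fails) - mean(passes)) / spread
        return sorted(scores, key=scores.get, reverse=True)

    # Hypothetical dispersed runs: failures track the "wind_gust" variable.
    runs = [
        {"wind_gust": 5.0, "mass_offset": 1.0},
        {"wind_gust": 6.0, "mass_offset": -1.0},
        {"wind_gust": -5.0, "mass_offset": 1.0},
        {"wind_gust": -6.0, "mass_offset": -1.0},
    ]
    failed = [True, True, False, False]
    print(rank_driving_variables(runs, failed)[0])  # wind_gust
    ```

    A real tool would replace the mean-difference score with more robust statistics, but the sorting-and-ranking flow is the same.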

  9. Overview of TPS Tasks

    NASA Technical Reports Server (NTRS)

    Johnson, Sylvia M.

    2000-01-01

    The objectives of the project summarized in this viewgraph presentation are the following: (1) develop a lightweight, low-cost, durable Thermal Protection System (TPS) for easy application to reusable launch vehicle payload launchers; (2) develop quick composite TPS processing and repair techniques; and (3) develop higher-temperature-capability tile TPS. The benefits of this technology include reduced installation and operations cost, enhanced payload capability resulting from TPS weight reduction, and an enhanced flight envelope and performance resulting from the higher temperature capability of the TPS, which can result in improved safety.

  10. QuickStrike ASOC Battlefield Simulation: Preparing the War Fighter to Win

    NASA Technical Reports Server (NTRS)

    Jones, Richard L.

    2010-01-01

    The QuickStrike ASOC (Air Support Operations Center) Battlefield Simulation fills a crucial gap in USAF and United Kingdom Close Air Support (CAS) and airspace manager training. The system now provides six squadrons with the capability to conduct total-mission training events whenever the personnel and time are available. When the 111th ASOC returned from their first deployment to Afghanistan, they realized the training available prior to deployment was inadequate. They sought an organic training capability focused on the ASOC mission that was low cost, simple to use, adaptable, and available immediately. Using a commercial off-the-shelf simulation, they developed a complete training system by adapting the simulation to their training needs. Through more than two years of spiral development incorporating lessons learned, the system has matured and can now realistically replicate the Tactical Operations Center (TOC) in Kabul, Afghanistan, or the TOC supporting the mission in Iraq, or can expand to support a major conflict scenario. The training system provides a collaborative workspace for the training audience and exercise control group via integrated software and workstations that can easily adapt to new mission requirements and TOC configurations. The system continues to mature: based on inputs from the war fighter, new capabilities have been incorporated to add realism and simplify the scenario development process. The QuickStrike simulation can now import TBMCS Air Tasking Order air mission data and can provide air and ground tracks to a common operating picture, presented through either C2PC or JADOCS. This organic capability to practice team processes and tasks and to conduct mission rehearsals proved its value in the 111th ASOS's next deployment. The ease of scenario development and the simple-to-learn, intuitive, gamelike interface enable the squadrons to develop and share scenarios incorporating lessons learned from every deployment. These war fighters have now filled the training gap and have the capability they need to train to win.

  11. Hybrid methods for cybersecurity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts, and at understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems of email phishing and spear-phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. Most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.

  12. ERLN Lab Compendium Fact Sheet

    EPA Pesticide Factsheets

    The Compendium is an online database of environmental testing laboratories nationwide. It enables labs to create profiles of their capabilities, so emergency responders can quickly identify a lab that will meet their support needs.

  13. The Revolution's on Your Desk.

    ERIC Educational Resources Information Center

    Erickson, Bruce

    1987-01-01

    The use of desktop publishing at schools, colleges, and universities is described. The advantages include: shorter turnaround, reduced costs, more control and capability, and more fun. However, rapid change and quick obsolescence are inevitable. (MLW)

  14. Developmental Course of Impulsivity and Capability from Age 10 to Age 25 as Related to Trajectory of Suicide Attempt in a Community Cohort

    ERIC Educational Resources Information Center

    Kasen, Stephanie; Cohen, Patricia; Chen, Henian

    2011-01-01

    Hierarchical linear models were used to examine trajectories of impulsivity and capability between ages 10 and 25 in relation to suicide attempt in 770 youths followed longitudinally: intercepts were set at age 17. The impulsivity measure assessed features of urgency (e.g., poor control, quick provocation, and disregard for external constraints);…

  15. NASA's Dual-Fuel Airbreathing Hypersonic Vehicle Study

    NASA Technical Reports Server (NTRS)

    Hunt, James L.; Eiswirth, Edward A.

    1996-01-01

    A Mach 10 cruise vehicle provides a quick-response, global-reach capability with high survivability. For operations from CONUS, mission radii on the order of 8,000 nmi are sufficient. For missions which return to CONUS, a dual-fueled vehicle is superior due to its capability for in-flight refueling. However, for one-way missions, an all-hydrogen vehicle is preferable because of its higher specific impulse.

  16. Introducing a New Software for Geodetic Analysis

    NASA Astrophysics Data System (ADS)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  17. Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.

    2000-01-01

    In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle which reduces recurring, non-recurring and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with kinematic analysis of the trailing-edge flap mechanism with minimum mechanism definition required. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early on in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.

  18. Development of dog-like retrieving capability in a ground robot

    NASA Astrophysics Data System (ADS)

    MacKenzie, Douglas C.; Ashok, Rahul; Rehg, James M.; Witus, Gary

    2013-01-01

    This paper presents the Mobile Intelligence Team's approach to addressing the CANINE outdoor ground robot competition. The competition required developing a robot that provided retrieving capabilities similar to a dog, while operating fully autonomously in unstructured environments. The vision team consisted of Mobile Intelligence, the Georgia Institute of Technology, and Wayne State University. Important computer vision aspects of the project were the ability to quickly learn the distinguishing characteristics of novel objects, to search images for the object as the robot drove a search pattern, to identify people near the robot for safe operation, to correctly identify the object among distractors, and to localize the object for retrieval. The classifier used to identify the objects is discussed, including an analysis of its performance, and an overview of the entire system architecture is presented. A discussion of the robot's performance in the competition demonstrates the system's successes in real-world testing.

  19. BioenergyKDF: Enabling Spatiotemporal Data Synthesis and Research Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, Aaron T; Movva, Sunil; Karthik, Rajasekar

    2014-01-01

    The Bioenergy Knowledge Discovery Framework (BioenergyKDF) is a scalable, web-based collaborative environment for scientists working on bioenergy-related research in which the connections between data, literature, and models can be explored and more clearly understood. The fully operational and deployed system, built on multiple open source libraries and architectures, stores contributions from the community of practice and makes them easy to find, but that is just its base functionality. The BioenergyKDF provides a national spatiotemporal decision support capability that enables data sharing, analysis, modeling, and visualization, and fosters the development and management of the U.S. bioenergy infrastructure, which is an essential component of the national energy infrastructure. The BioenergyKDF is built on a flexible, customizable platform that can be extended to support the requirements of any user community, especially those that work with spatiotemporal data. While there are several community data-sharing software platforms available, some developed and distributed by national governments, none of them have the full suite of capabilities available in the BioenergyKDF. For example, its component-based platform and database-independent architecture allow it to be quickly deployed to existing infrastructure and to connect to existing data repositories (spatial or otherwise). As new data, analyses, and features are added, the BioenergyKDF will help lead research and support decisions concerning bioenergy into the future, and will also enable the development and growth of additional communities of practice both inside and outside of the Department of Energy. These communities will be able to leverage the substantial investment the agency has made in the KDF platform to quickly stand up systems that are customized to their data and research needs.

  20. Test data, demonstration videos, and transceivers.

    DOT National Transportation Integrated Search

    2016-11-01

    Smart-driving technologies are evolving quickly and cover a wide range of capabilities. This report describes various test data, demonstrations, and transceivers created and used during the demonstration phase of Project 0-6838. Researchers at Southw...

  1. 29 CFR 1910.422 - Procedures during dive.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... applicable to each diving operation unless otherwise specified. (b) Water entry and exit. (1) A means capable... from a dive team member; (3) Communications are lost and can not be quickly re-established between the...

  2. 29 CFR 1910.422 - Procedures during dive.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... applicable to each diving operation unless otherwise specified. (b) Water entry and exit. (1) A means capable... from a dive team member; (3) Communications are lost and can not be quickly re-established between the...

  3. 29 CFR 1910.422 - Procedures during dive.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... applicable to each diving operation unless otherwise specified. (b) Water entry and exit. (1) A means capable... from a dive team member; (3) Communications are lost and can not be quickly re-established between the...

  4. 29 CFR 1910.422 - Procedures during dive.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... applicable to each diving operation unless otherwise specified. (b) Water entry and exit. (1) A means capable... from a dive team member; (3) Communications are lost and can not be quickly re-established between the...

  5. 29 CFR 1910.422 - Procedures during dive.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... applicable to each diving operation unless otherwise specified. (b) Water entry and exit. (1) A means capable... from a dive team member; (3) Communications are lost and can not be quickly re-established between the...

  6. An assessment of the process capabilities of nanoimprint lithography

    NASA Astrophysics Data System (ADS)

    Balla, Tobias; Spearing, S. Mark; Monk, Andrew

    2008-09-01

    Nanoimprint lithography (NIL) is an emerging nanofabrication tool, able to replicate imprint patterns quickly and at high volumes. The present study was performed to define the capabilities of NIL, based on a study of published research, and to identify the application areas where NIL has the greatest potential. The process attributes of different NIL process chains were analysed, and their process capabilities were compared to identify trends and process limitations. The attributes chosen include the line width, relief height, initial resist thickness, residual layer thickness, imprint area, and line width tolerances. In each case well-defined limits can be identified, which are a direct result of the mechanisms involved in the NIL process. These quantitative results were compared with the assessments of individuals in academia and within the microfabrication industry. The results suggest NIL is most suited to producing photonic, microfluidic, and patterned media applications, with photonic applications the closest to market. For wider use, NIL needs to address overlay alignment issues, and each market requires an analysis of whether NIL adds value.

  7. The Viking viewer for connectomics: scalable multi-user annotation and summarization of large volume data sets

    PubMed Central

    ANDERSON, JR; MOHAMMED, S; GRIMM, B; JONES, BW; KOSHEVOY, P; TASDIZEN, T; WHITAKER, R; MARC, RE

    2011-01-01

    Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution created to view and annotate a 16.5 TB ultrastructural retinal connectome volume and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. PMID:21118201

  8. Future remote-sensing programs

    NASA Technical Reports Server (NTRS)

    Schweickart, R. L.

    1975-01-01

    User requirements and methods developed to fulfill them are discussed. Quick-look data, data storage on computer-compatible tape, and an integrated capability for production of images from the whole class of earth-viewing satellites are among the new developments briefly described. The increased capability of LANDSAT-C and Nimbus G and the needs of specialized applications, such as urban land use planning, cartography, accurate measurement of small agricultural fields, thermal mapping, and coastal zone management, are examined. The effect of the space shuttle on remote sensing technology through increased capability is considered.

  9. Introducing a New Software for Geodetic Analysis

    NASA Astrophysics Data System (ADS)

    Hjelle, G. A.; Dähnn, M.; Fausk, I.; Kirkvik, A. S.; Mysen, E.

    2016-12-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python, which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable. The Python community provides a rich eco-system of tools for doing data analysis, including effective data storage and powerful visualization. Python interfaces well with other languages so that we can easily reuse existing, well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages. In addition we will report on some simple investigations we have done using the software, and outline our plans for further progress.

  10. D3GB: An Interactive Genome Browser for R, Python, and WordPress.

    PubMed

    Barrios, David; Prieto, Carlos

    2017-05-01

    Genome browsers are useful not only for showing final results but also for improving analysis protocols, testing data quality, and generating result drafts. Their integration in analysis pipelines allows the optimization of parameters, which leads to better results. New developments that facilitate the creation and utilization of genome browsers could contribute to improving analysis results and supporting the quick visualization of genomic data. D3 Genome Browser is an interactive genome browser that can be easily integrated in analysis protocols and shared on the Web. It is distributed as an R package, a Python module, and a WordPress plugin to facilitate its integration in pipelines and the utilization of platform capabilities. It is compatible with popular data formats such as GenBank, GFF, BED, FASTA, and VCF, and enables the exploration of genomic data with a Web browser.
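
    The formats listed above are standard. As a small illustration of the simplest of them (not D3GB's own code), the sketch below parses minimal BED records (tab-separated chromosome, 0-based start, end, optional name), the kind of track data a genome browser renders; the file content is invented.

    ```python
    def parse_bed(text):
        """Parse minimal BED records: tab-separated chrom, 0-based start,
        end, and an optional feature name. Header/comment lines are skipped."""
        features = []
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith(("#", "track", "browser")):
                continue
            fields = line.split("\t")
            features.append({
                "chrom": fields[0],
                "start": int(fields[1]),
                "end": int(fields[2]),
                "name": fields[3] if len(fields) > 3 else None,
            })
        return features

    # Hypothetical two-feature track.
    bed = "track name=demo\nchr1\t100\t900\tgeneA\nchr2\t50\t400\n"
    print(len(parse_bed(bed)))  # 2
    ```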

  11. EVA worksite analysis--use of computer analysis for EVA operations development and execution.

    PubMed

    Anderson, D

    1999-01-01

    To sustain the rate of extravehicular activity (EVA) required to assemble and maintain the International Space Station, we must enhance our ability to plan, train for, and execute EVAs. An underlying analysis capability has been developed to ensure EVA access to all external worksites as a starting point for ground training, to generate information needed for on-orbit training, and to react quickly to develop contingency EVA plans, techniques, and procedures. This paper describes the use of computer-based EVA worksite analysis techniques for EVA worksite design. EVA worksite analysis has been used to design 80% of EVA worksites on the U.S. portion of the International Space Station. With the launch of the first U.S. element of the station, EVA worksite analysis is being developed further to support real-time analysis of unplanned EVA operations. This paper describes this development and deployment of EVA worksite analysis for International Space Station (ISS) mission support.

  12. Cam-driven monochromator for QEXAFS

    NASA Astrophysics Data System (ADS)

    Caliebe, W. A.; So, I.; Lenhard, A.; Siddons, D. P.

    2006-11-01

    We have developed a cam drive for quickly tuning the energy of an X-ray monochromator through an X-ray absorption edge for quick extended X-ray absorption spectroscopy (QEXAFS). The data are collected using a 4-channel, 12-bit multiplexed VME analog-to-digital converter and a VME angle encoder. The VME crate controller runs a real-time operating system. This system is capable of collecting 2 EXAFS scans in 1 s with an energy stability of better than 1 eV. Additional improvements to increase the speed and the energy stability are under way.

  13. SPICE-Based Python Packages for ESA Solar System Exploration Mission's Geometry Exploitation

    NASA Astrophysics Data System (ADS)

    Costa, M.; Grass, M.

    2018-04-01

    This contribution outlines three Python packages that provide an enhanced and extended usage of the SPICE Toolkit APIs, offering higher-level functions and data quick-look capabilities focused on European Space Agency solar system exploration missions.

  14. Piezoelectric Diffraction-Based Optical Switches

    NASA Technical Reports Server (NTRS)

    Spremo, Stevan; Fuhr, Peter; Schipper, John

    2003-01-01

    Piezoelectric diffraction-based optoelectronic devices have been invented to satisfy requirements for switching signals quickly among alternative optical paths in optical communication networks. These devices are capable of operating with switching times as short as microseconds or even nanoseconds in some cases.

  15. A New Metre for Cheap, Quick, Reliable and Simple Thermal Transmittance (U-Value) Measurements in Buildings.

    PubMed

    Andújar Márquez, José Manuel; Martínez Bohórquez, Miguel Ángel; Gómez Melgar, Sergio

    2017-09-03

    This paper deals with thermal transmittance measurement focused on buildings, and specifically on building energy retrofitting. Today, if many thermal transmittance measurements are needed in a short time, current devices, based on measuring the heat flow through the wall, cannot carry them out, except if a great number of devices are used at once along with intensive and tedious post-processing and analysis work. In this paper, starting from well-known physical laws, the authors develop a methodology based on three temperature measurements, which is implemented by a novel thermal transmittance metre. The paper shows its development step by step. The developed device is modular, scalable, and fully wireless; it is capable of taking as many measurements at once as the user needs. The developed system is compared, working on the same test, with the currently used system based on heat flow. The results show that the developed metre allows carrying out thermal transmittance measurements in buildings in a cheap, quick, reliable and simple way.
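
    The abstract does not give the metre's equations. One common three-temperature formulation (an assumption here, not necessarily the authors' exact method) estimates U from the indoor air, indoor wall surface, and outdoor air temperatures, using a standard internal surface heat transfer coefficient h_in ≈ 7.69 W/(m²·K) (the reciprocal of the ISO 6946 internal surface resistance R_si = 0.13 m²·K/W):

    ```python
    def u_value(t_in, t_wall_in, t_out, h_in=7.69):
        """Three-temperature U-value estimate, W/(m^2*K).

        t_in      -- indoor air temperature (deg C)
        t_wall_in -- indoor wall surface temperature (deg C)
        t_out     -- outdoor air temperature (deg C)
        h_in      -- internal surface heat transfer coefficient, W/(m^2*K)
        """
        # In steady state the heat flux through the internal surface film
        # equals the flux through the whole wall:
        #   q = h_in * (t_in - t_wall_in) = U * (t_in - t_out)
        return h_in * (t_in - t_wall_in) / (t_in - t_out)

    # Hypothetical reading: 20 C indoors, 18 C wall surface, 5 C outdoors.
    print(round(u_value(20.0, 18.0, 5.0), 2))  # 1.03
    ```

    The appeal of the approach is that temperature sensors are cheap and settle quickly, whereas heat flux plates need long averaging periods.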

  16. Ionospheric Correction Based on Ingestion of Global Ionospheric Maps into the NeQuick 2 Model

    PubMed Central

    Yu, Xiao; She, Chengli; Zhen, Weimin; Bruno, Nava; Liu, Dun; Yue, Xinan; Ou, Ming; Xu, Jisheng

    2015-01-01

    The global ionospheric maps (GIMs) generated by the Jet Propulsion Laboratory (JPL) and the Center for Orbit Determination in Europe (CODE) over a period of more than 13 years have been adopted as the primary source of data to provide global ionospheric correction for possible single-frequency positioning applications. The investigation aims to assess the performance of the new NeQuick model, NeQuick 2, in predicting global total electron content (TEC) through ingesting the GIM data from the previous day(s). The results show good performance of the GIM-driven NeQuick model, with on average 86% of vertical TEC errors less than 10 TECU, when the global daily effective ionization indices (Az) versus modified dip latitude (MODIP) are constructed as a second-order polynomial. The performance of the GIM-driven NeQuick model varies with solar activity and is better during low-solar-activity years. The accuracy of TEC prediction can be improved further by using a four-coefficient function expression of Az versus MODIP. As more measurements from earlier days are involved in the Az optimization procedure, the accuracy may decrease. The results also reveal that more efforts are needed to improve the NeQuick 2 model's capability to represent the ionosphere in the equatorial and high-latitude regions. PMID:25815369
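
    A minimal sketch of the ingestion step described above, under the assumption that effective ionization values Az have already been derived from GIM TEC at a set of MODIP latitudes: a second-order polynomial Az(μ) = a0 + a1·μ + a2·μ² is fitted by least squares. The sample values are hypothetical.

    ```python
    def fit_az_polynomial(modips, azs):
        """Least-squares fit of Az = a0 + a1*mu + a2*mu^2 over MODIP mu.
        Solves the 3x3 normal equations by Gaussian elimination."""
        S = [sum(x ** k for x in modips) for k in range(5)]   # power sums
        A = [[S[i + j] for j in range(3)] for i in range(3)]  # normal matrix
        b = [sum(y * x ** k for x, y in zip(modips, azs)) for k in range(3)]
        for col in range(3):                                  # elimination
            piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            b[col], b[piv] = b[piv], b[col]
            for r in range(col + 1, 3):
                f = A[r][col] / A[col][col]
                A[r] = [arc - f * acc for arc, acc in zip(A[r], A[col])]
                b[r] -= f * b[col]
        a = [0.0, 0.0, 0.0]                                   # back-substitute
        for r in (2, 1, 0):
            a[r] = (b[r] - sum(A[r][c] * a[c] for c in range(r + 1, 3))) / A[r][r]
        return a

    # Hypothetical Az samples at several MODIP latitudes (degrees).
    mus = [-60.0, -45.0, -30.0, 0.0, 30.0, 45.0, 60.0]
    azs = [60.0 + 0.5 * m + 0.02 * m * m for m in mus]
    a0, a1, a2 = fit_az_polynomial(mus, azs)
    print(round(a0, 3), round(a1, 3), round(a2, 3))  # 60.0 0.5 0.02
    ```

    The fitted coefficients would then drive NeQuick 2 in place of a single global ionization level, which is the essence of the GIM-ingestion scheme.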

  17. The evaluation of a new technology for gunshot residue (GSR) analysis in the field

    NASA Astrophysics Data System (ADS)

    Hondrogiannis, Ellen; Andersen, Danielle; Miziolek, Andrzej W.

    2013-05-01

    There continues to be a need for improved technology to be used in theater to quickly and accurately identify the person who shot any weapon during a terrorist attack, as well as to link a suspect to the actual weapon fired during a crime. Beyond this, in areas of conflict it would be desirable to have the capability to establish the source country for weaponry and ammunition. Gunshot residue (GSR) analysis is a reasonably well-studied technology area. Recent scientific publications have reported that the residues have a rich composition of both organic and inorganic compounds. For the purposes of identifying the manufacturer or country of origin of the ammunition, the inorganic components of GSR appear especially promising, since their presence in the propellant and primer formulations is either specific to a given chemical formula or represents impurities in the manufacturing process that can be unique to a manufacturer or to the source country of the chemicals used for propellants and primers. Laser-Induced Breakdown Spectroscopy (LIBS) has already demonstrated considerable capability for elemental fingerprinting, especially for inorganic/metallic components. A number of reports have demonstrated LIBS capability in forensics for matching materials such as inks, fabrics, paper, glass, and paint. This work describes the encouraging results of an initial study to assess a new commercial, field-portable (battery-operated) LIBS system for GSR analysis, with gunshot residues collected from inside cartridge casings from three different ammunition manufacturers.

  18. Benefits of advanced space suits for supporting routine extravehicular activity

    NASA Technical Reports Server (NTRS)

    Alton, L. R.; Bauer, E. H.; Patrick, J. W.

    1975-01-01

    Technology is available to produce space suits providing a quick-reaction, safe, and much more mobile extravehicular activity (EVA) capability than previously available. Such a capability may be needed during the shuttle era because the great variety of missions and payloads complicates the development of totally automated methods of conducting operations and maintenance and resolving contingencies. Routine EVA now promises to become a cost-effective tool as less complex, serviceable, lower-cost payload designs utilizing this capability become feasible. Adoption of certain advanced space suit technologies is encouraged for reasons of economics as well as performance.

  19. Compliant Task Execution and Learning for Safe Mixed-Initiative Human-Robot Operations

    NASA Technical Reports Server (NTRS)

    Dong, Shuonan; Conrad, Patrick R.; Shah, Julie A.; Williams, Brian C.; Mittman, David S.; Ingham, Michel D.; Verma, Vandana

    2011-01-01

    We introduce a novel task execution capability that enhances the ability of in-situ crew members to function independently from Earth by enabling safe and efficient interaction with automated systems. This task execution capability provides the ability to (1) map goal-directed commands from humans into safe, compliant, automated actions, (2) quickly and safely respond to human commands and actions during task execution, and (3) specify complex motions through teaching by demonstration. Our results are applicable to future surface robotic systems, and we have demonstrated these capabilities on JPL's All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) robot.

  20. Accurate evaluation and analysis of functional genomics data and methods

    PubMed Central

    Greene, Casey S.; Troyanskaya, Olga G.

    2016-01-01

    The development of technology capable of inexpensively performing large-scale measurements of biological systems has generated a wealth of data. Integrative analysis of these data holds the promise of uncovering gene function, regulation, and, in the longer run, understanding complex disease. However, their analysis has proved very challenging, as it is difficult to quickly and effectively assess the relevance and accuracy of these data for individual biological questions. Here, we identify biases that present challenges for the assessment of functional genomics data and methods. We then discuss evaluation methods that, taken together, begin to address these issues. We also argue that the funding of systematic data-driven experiments and of high-quality curation efforts will further improve evaluation metrics so that they more accurately assess functional genomics data and methods. Such metrics will allow researchers in the field of functional genomics to continue to answer important biological questions in a data-driven manner. PMID:22268703

  1. Analysis of peptides using an integrated microchip HPLC-MS/MS system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirby, Brian J.; Chirica, Gabriela S.; Reichmuth, David S.

    Hyphenated LC-MS techniques are quickly becoming the standard tool for proteomic analyses. For large homogeneous samples, bulk processing methods and capillary injection and separation techniques are suitable. However, for analysis of small or heterogeneous samples, techniques that can manipulate picoliter samples without dilution are required, or samples will be lost or corrupted; further, static nanospray-type flow rates are required to maximize SNR. Microchip-level integration of sample injection with separation and mass spectrometry allows small-volume analytes to be processed on chip and immediately injected without dilution for analysis. An on-chip HPLC was fabricated using in situ polymerization of both fixed and mobile polymer monoliths. Integration of the chip with a nanospray MS emitter enables identification of peptides by tandem MS. The chip is capable of analyzing very small sample volumes (< 200 pl) in short times (< 3 min).

  2. The active video games' narrative impact on children's physical activities

    USDA-ARS?s Scientific Manuscript database

    Active video games (AVGs) capable of inducing physical activity offer an innovative approach to combating childhood obesity. Unfortunately, children's AVG game play decreases quickly, underscoring the need to identify novel methods for player engagement. Narratives have been demonstrated to influenc...

  3. Physiological Capacities: Estimating an Athlete's Potential.

    ERIC Educational Resources Information Center

    Lemon, Peter W. R.

    1982-01-01

    Several simple performance tests are described for assessing an athlete's major energy-producing capabilities. The tests are suitable for mass screening because they are easy to administer, require no sophisticated equipment, and can be done quickly. Information for evaluating test results is included. (PP)

  4. Tri-Laboratory Linux Capacity Cluster 2007 SOW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seager, M

    2007-03-22

    The Advanced Simulation and Computing (ASC) Program (formerly known as the Accelerated Strategic Computing Initiative, ASCI) has led the world in capability computing for the last ten years. Capability computing is defined as a world-class platform (in the Top10 of the Top500.org list) with scientific simulations running at scale on the platform. Example systems are ASCI Red, Blue-Pacific, Blue-Mountain, White, Q, RedStorm, and Purple. ASC applications have scaled to multiple thousands of CPUs and accomplished a long list of mission milestones on these ASC capability platforms. However, the computing demands of the ASC and Stockpile Stewardship programs also include a vast number of smaller scale runs for day-to-day simulations. Indeed, every 'hero' capability run requires many hundreds to thousands of much smaller runs in preparation and post processing activities. In addition, there are many aspects of the Stockpile Stewardship Program (SSP) that can be directly accomplished with these so-called 'capacity' calculations. The need for capacity is now so great within the program that it is increasingly difficult to allocate the computer resources required by the larger capability runs. To rectify the current 'capacity' computing resource shortfall, the ASC program has allocated a large portion of the overall ASC platforms budget to 'capacity' systems. In addition, within the next five to ten years the Life Extension Programs (LEPs) for major nuclear weapons systems must be accomplished. These LEPs and other SSP programmatic elements will further drive the need for capacity calculations and hence 'capacity' systems as well as future ASC capability calculations on 'capability' systems. To respond to this new workload analysis, the ASC program will be making a large sustained strategic investment in these capacity systems over the next ten years, starting with the United States Government Fiscal Year 2007 (GFY07). 
However, given the growing need for 'capability' systems as well, the budget demands are extreme, and new, more cost-effective ways of fielding these systems must be developed. This Tri-Laboratory Linux Capacity Cluster (TLCC) procurement represents the ASC program's first investment vehicle in these capacity systems. It also represents a new strategy for quickly building, fielding and integrating many Linux clusters of various sizes into classified and unclassified production service through a concept of Scalable Units (SU). The programmatic objective is to dramatically reduce the overall Total Cost of Ownership (TCO) of these 'capacity' systems relative to the best practices in Linux Cluster deployments today. This objective only makes sense in the context of these systems quickly becoming very robust and useful production clusters under the crushing load that will be inflicted on them by the ASC and SSP scientific simulation capacity workload.

  5. Studying quick coupler efficiency in working attachment system of single-bucket power shovel

    NASA Astrophysics Data System (ADS)

    Duganova, E. V.; Zagorodniy, N. A.; Solodovnikov, D. N.; Korneyev, A. S.

    2018-03-01

    A prototype of a quick-disconnect connector (quick coupler) with an unloaded retention mechanism was developed from an analysis of the typical quick couplers used as intermediate elements in power shovels from different manufacturers. A method is presented that allows building a simulation model of the quick-coupler prototype as an alternative to physical modeling for further studies.

  6. The Viking viewer for connectomics: scalable multi-user annotation and summarization of large volume data sets.

    PubMed

    Anderson, J R; Mohammed, S; Grimm, B; Jones, B W; Koshevoy, P; Tasdizen, T; Whitaker, R; Marc, R E

    2011-01-01

    Modern microscope automation permits the collection of vast amounts of continuous anatomical imagery in both two and three dimensions. These large data sets present significant challenges for data storage, access, viewing, annotation and analysis. The cost and overhead of collecting and storing the data can be extremely high. Large data sets quickly exceed an individual's capability for timely analysis and present challenges in efficiently applying transforms, if needed. Finally, annotated anatomical data sets can represent a significant investment of resources and should be easily accessible to the scientific community. The Viking application was our solution, created to view and annotate a 16.5 TB ultrastructural retinal connectome volume, and we demonstrate its utility in reconstructing neural networks for a distinctive retinal amacrine cell class. Viking has several key features. (1) It works over the internet using HTTP and supports many concurrent users limited only by hardware. (2) It supports a multi-user, collaborative annotation strategy. (3) It cleanly demarcates viewing and analysis from data collection and hosting. (4) It is capable of applying transformations in real-time. (5) It has an easily extensible user interface, allowing addition of specialized modules without rewriting the viewer. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.

  7. Experimental visualization of rapid maneuvering fish

    NASA Astrophysics Data System (ADS)

    Daigh, S.; Techet, A. H.

    2003-11-01

    A freshwater tropical fish, Danio aequipinnatus, is studied undergoing rapid turning and fast-starting maneuvers. This agile species is ideal for this study, as it is capable of quick turning and darting motions of up to 5 g. The fish studied are 4-5 cm in length. The speed and kinematics of the maneuvers are determined by video analysis. Planar and stereo Particle Image Velocimetry (PIV) is used to map the vortical patterns in the wake of the maneuvering fish. PIV visualizations reveal that during C-shaped maneuvers a ring-shaped jet vortex is formed. Fast-starting behavior is also presented. PIV data are used to approximate the thrust-vectoring force produced during each maneuver.

  8. Design automation techniques for custom LSI arrays

    NASA Technical Reports Server (NTRS)

    Feller, A.

    1975-01-01

    The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.

  9. Heating of cardiovascular stents in intense radiofrequency magnetic fields.

    PubMed

    Foster, K R; Goldberg, R; Bonsignore, C

    1999-01-01

    We consider the heating of a metal stent in an alternating magnetic field from an induction heating furnace. An approximate theoretical analysis is conducted to estimate the magnetic field strength needed to produce substantial temperature increases. Experiments of stent heating in industrial furnaces are reported, which confirm the model. The results show that magnetic fields inside inductance furnaces are capable of significantly heating stents. However, the fields fall off very quickly with distance and in most locations outside the heating coil, field levels are far too small to produce significant heating. The ANSI/IEEE C95.1-1992 limits for human exposure to alternating magnetic fields provide adequate protection against potential excessive heating of the stents.

  10. Cost Model Comparison: A Study of Internally and Commercially Developed Cost Models in Use by NASA

    NASA Technical Reports Server (NTRS)

    Gupta, Garima

    2011-01-01

    NASA makes use of numerous cost models to accurately estimate the cost of various components of a mission - hardware, software, mission/ground operations - during the different stages of a mission's lifecycle. The purpose of this project was to survey these models and determine in which respects they are similar and in which they are different. The initial survey included a study of the cost drivers for each model, the form of each model (linear/exponential/other CER, range/point output, capable of risk/sensitivity analysis), and for what types of missions and for what phases of a mission lifecycle each model is capable of estimating cost. The models taken into consideration consisted of both those that were developed by NASA and those that were commercially developed: GSECT, NAFCOM, SCAT, QuickCost, PRICE, and SEER. Once the initial survey was completed, the next step in the project was to compare the cost models' capabilities in terms of Work Breakdown Structure (WBS) elements. This final comparison was then portrayed in a visual manner with Venn diagrams. All of the materials produced in the process of this study were then posted on the Ground Segment Team (GST) Wiki.

  11. Differentiating Organic and Conventional Sage by Chromatographic and Mass Spectrometry Flow-Injection Fingerprints Combined with Principal Component Analysis

    PubMed Central

    Gao, Boyan; Lu, Yingjian; Sheng, Yi; Chen, Pei; Yu, Liangli (Lucy)

    2013-01-01

    High performance liquid chromatography (HPLC) and flow injection electrospray ionization with ion trap mass spectrometry (FIMS) fingerprints combined with the principal component analysis (PCA) were examined for their potential in differentiating commercial organic and conventional sage samples. The individual components in the sage samples were also characterized with an ultra-performance liquid chromatography with a quadrupole-time of flight mass spectrometer (UPLC Q-TOF MS). The results suggested that both HPLC and FIMS fingerprints combined with PCA could differentiate organic and conventional sage samples effectively. FIMS may serve as a quick test capable of distinguishing organic and conventional sages in 1 min, and could potentially be developed for high-throughput applications; whereas HPLC fingerprints could provide more chemical composition information with a longer analytical time. PMID:23464755
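
    The PCA-based differentiation described above can be sketched with a mean-centered SVD. The fingerprint matrix below is synthetic stand-in data (two classes with opposite spectral trends), not the paper's HPLC or FIMS measurements:

    ```python
    import numpy as np

    # Hedged sketch: PCA on spectral fingerprints. Rows are samples (five
    # "organic", five "conventional"); columns are m/z or retention-time bins.
    # The opposite linear trends are an artificial class contrast for illustration.
    rng = np.random.default_rng(0)
    organic      = rng.normal(0.0, 0.1, size=(5, 40)) + np.linspace(0, 1, 40)
    conventional = rng.normal(0.0, 0.1, size=(5, 40)) + np.linspace(1, 0, 40)
    X = np.vstack([organic, conventional])

    Xc = X - X.mean(axis=0)                  # mean-center each bin
    _, s, vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ vt[:2].T                   # project onto first two PCs

    pc1_organic = scores[:5, 0].mean()
    pc1_conventional = scores[5:, 0].mean()
    print("PC1 group means have opposite signs:",
          pc1_organic * pc1_conventional < 0)
    ```

    Plotting the two columns of `scores` would reproduce the usual PCA scatter in which the two sample classes form separate clusters along the first principal component.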

  12. Narrative increases step counts during active video game play among children

    USDA-ARS?s Scientific Manuscript database

    Active video games (AVGs) capable of inducing physical activity (PA) level offer a novel alternative to child obesity. Unfortunately, children's motivation to play AVG decreases quickly, underscoring the need to find new methods to maintain their engagement. According to narrative transportation th...

  13. Creating Services for the Digital Library.

    ERIC Educational Resources Information Center

    Crane, Dennis J.

    The terms "virtual library," "digital library," and "electronic library" have received growing attention among professional librarians, researchers, and users of information over the past decade. The confluence of exploding sources of data, expanding technical capability, and constrained time and money will quickly move these concepts from…

  14. The 3.5-Meter Telescope Enclosure

    DTIC Science & Technology

    1994-04-01

    and acoustic vibrations, and the enclosure cannot be stopped quickly in an emergency. Also, the work of Zago indicates that open-air operation of the...enclosure. This capability is useful during operational testing and maintenance of the telescope. Zago, L., "Design and Performance of Large

  15. Environmental Response Laboratory Network (ERLN) Basic Ordering Agreement Fact Sheet

    EPA Pesticide Factsheets

    Having a standing agreement means that a laboratory has been vetted and has proven capable of providing certain services that meet ERLN standards, and it provides a mechanism for EPA to quickly task a lab to provide supplies/services during an incident.

  16. Mobilization and Defense Management Technical Reports Series. Long War Versus Short War: An Appraisal of Policy.

    DTIC Science & Technology

    1983-05-01

    based initially on technology and doctrine both World Wars had a different outcome for reasons of sustainability and national determination. Assuredly the...intensity of a future conflict gives rise to serious concern over our capability to mobilize, train, deploy and sustain the force. Emphasis on DOD...purpose of war as a "continuation of policy by other means," had as its main purpose the quick destruction of enemy fighting capability. "They must be

  17. Analysis of Inorganic Nanoparticles by Single-particle Inductively Coupled Plasma Time-of-Flight Mass Spectrometry.

    PubMed

    Hendriks, Lyndsey; Gundlach-Graham, Alexander; Günther, Detlef

    2018-04-25

    Due to the rapid development of nanotechnologies, engineered nanomaterials (ENMs) and nanoparticles (ENPs) are becoming a part of everyday life: nanotechnologies are quickly migrating from laboratory benches to store shelves and industrial processes. As the use of ENPs continues to expand, their release into the environment is unavoidable; however, understanding the mechanisms and degree of ENP release is only possible through direct detection of these nanospecies in relevant matrices and at realistic concentrations. Key analytical requirements for quantitative detection of ENPs include high sensitivity to detect small particles at low total mass concentrations and the need to separate signals of ENPs from a background of dissolved elemental species and natural nanoparticles (NNPs). To this end, an emerging method called single-particle inductively coupled plasma mass spectrometry (sp-ICPMS) has demonstrated great potential for the characterization of inorganic nanoparticles (NPs) at environmentally relevant concentrations. Here, we comment on the capabilities of modern sp-ICPMS analysis with particular focus on the measurement possibilities offered by ICP-time-of-flight mass spectrometry (ICP-TOFMS). ICP-TOFMS delivers complete elemental mass spectra for individual NPs, which allows for high-throughput, untargeted quantitative analysis of dispersed NPs in natural matrices. Moreover, the multi-element detection capabilities of ICP-TOFMS enable new NP-analysis strategies, including online calibration via microdroplets for accurate NP mass quantification and matrix compensation.
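
    A core step of the sp-ICPMS quantification mentioned above is converting a single-particle event signal into a particle mass and diameter. A hedged sketch of that standard workflow (not code from the paper), where the sensitivity, density, and mass fraction are assumed illustrative values for a gold nanoparticle:

    ```python
    import math

    # Hedged sketch: single-particle event signal -> particle diameter.
    # Assumes a calibrated element sensitivity and a spherical, pure-gold
    # particle; every constant below is an illustrative assumption.
    SENSITIVITY = 1.0e5     # counts per ng of analyte detected (assumed)
    DENSITY = 19.3e-12      # ng per nm^3 (gold, 19.3 g/cm^3)
    MASS_FRACTION = 1.0     # analyte mass fraction in the particle

    def event_to_diameter_nm(counts):
        """Convert an integrated single-particle event signal (counts)
        to an equivalent spherical diameter in nm."""
        mass_ng = counts / SENSITIVITY / MASS_FRACTION   # particle mass
        volume_nm3 = mass_ng / DENSITY                   # spherical volume
        return (6.0 * volume_nm3 / math.pi) ** (1.0 / 3.0)
    ```

    Because diameter scales with the cube root of the event signal, detecting small particles demands the high sensitivity emphasized in the abstract: halving the diameter reduces the signal eightfold.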

  18. Anticipated Changes in Conducting Scientific Data-Analysis Research in the Big-Data Era

    NASA Astrophysics Data System (ADS)

    Kuo, Kwo-Sen; Seablom, Michael; Clune, Thomas; Ramachandran, Rahul

    2014-05-01

    A Big-Data environment is one that is capable of orchestrating quick-turnaround analyses involving large volumes of data for numerous simultaneous users. Based on our experiences with a prototype Big-Data analysis environment, we anticipate some important changes in research behaviors and processes while conducting scientific data-analysis research in the near future as such Big-Data environments become the mainstream. The first anticipated change will be the reduced effort and difficulty in most parts of the data management process. A Big-Data analysis environment is likely to house most of the data required for a particular research discipline along with appropriate analysis capabilities. This will reduce the need for researchers to download local copies of data. In turn, this also reduces the need for compute and storage procurement by individual researchers or groups, as well as associated maintenance and management afterwards. It is almost certain that Big-Data environments will require a different "programming language" to fully exploit the latent potential. In addition, the process of extending the environment to provide new analysis capabilities will likely be more involved than, say, compiling a piece of new or revised code. We thus anticipate that researchers will require support from dedicated organizations associated with the environment that are composed of professional software engineers and data scientists. A major benefit will likely be that such extensions are of higher-quality and broader applicability than ad hoc changes by physical scientists. Another anticipated significant change is improved collaboration among the researchers using the same environment. Since the environment is homogeneous within itself, many barriers to collaboration are minimized or eliminated. For example, data and analysis algorithms can be seamlessly shared, reused and re-purposed. 
In conclusion, we will be able to achieve a new level of scientific productivity in the Big-Data analysis environments.

  19. Anticipated Changes in Conducting Scientific Data-Analysis Research in the Big-Data Era

    NASA Technical Reports Server (NTRS)

    Kuo, Kwo-Sen; Seablom, Michael; Clune, Thomas; Ramachandran, Rahul

    2014-01-01

    A Big-Data environment is one that is capable of orchestrating quick-turnaround analyses involving large volumes of data for numerous simultaneous users. Based on our experiences with a prototype Big-Data analysis environment, we anticipate some important changes in research behaviors and processes while conducting scientific data-analysis research in the near future as such Big-Data environments become the mainstream. The first anticipated change will be the reduced effort and difficulty in most parts of the data management process. A Big-Data analysis environment is likely to house most of the data required for a particular research discipline along with appropriate analysis capabilities. This will reduce the need for researchers to download local copies of data. In turn, this also reduces the need for compute and storage procurement by individual researchers or groups, as well as associated maintenance and management afterwards. It is almost certain that Big-Data environments will require a different "programming language" to fully exploit the latent potential. In addition, the process of extending the environment to provide new analysis capabilities will likely be more involved than, say, compiling a piece of new or revised code. We thus anticipate that researchers will require support from dedicated organizations associated with the environment that are composed of professional software engineers and data scientists. A major benefit will likely be that such extensions are of higher quality and broader applicability than ad hoc changes by physical scientists. Another anticipated significant change is improved collaboration among the researchers using the same environment. Since the environment is homogeneous within itself, many barriers to collaboration are minimized or eliminated. For example, data and analysis algorithms can be seamlessly shared, reused and re-purposed. 
In conclusion, we will be able to achieve a new level of scientific productivity in the Big-Data analysis environments.

  20. Quick concurrent responses to global and local cognitive information underlie intuitive understanding in board-game experts

    PubMed Central

    Nakatani, Hironori; Yamaguchi, Yoko

    2014-01-01

    Experts have the superior cognitive capability of quickly understanding complex information in their domain; however, little is known about the neural processes underlying this ability. Here, using a board game named shogi (Japanese chess), we investigated the brain activity in expert players that was involved in their quick understanding of board-game patterns. The frontal area responded only to meaningful game positions, whereas the temporal area responded to both game and random positions with the same latency (200 ms). Subsequent to these quick responses, the temporal and parietal areas responded only to game positions, with a latency of 700 ms. During the responses, enhanced phase synchronization between these areas was observed. Thus, experts first responded to global cognitive information that was specific to game positions and to local cognitive information that was common to game and random positions concurrently. These types of information were integrated via neural synchronization at the posterior areas. As these properties were specific to experts, much of the experts' advantage in understanding game positions occurred within 1 s of perception. PMID:25081320

  1. Increased ISR operator capability utilizing a centralized 360° full motion video display

    NASA Astrophysics Data System (ADS)

    Andryc, K.; Chamberlain, J.; Eagleson, T.; Gottschalk, G.; Kowal, B.; Kuzdeba, P.; LaValley, D.; Myers, E.; Quinn, S.; Rose, M.; Rusiecki, B.

    2012-06-01

    In many situations, the difference between success and failure comes down to taking the right actions quickly. While the myriad electronic sensors available today can provide data quickly, they may overload the operator; only a contextualized, centralized display of information with an intuitive human interface can support the quick and effective decisions needed. If these decisions are to result in quick actions, the operator must be able to understand all of the data describing his environment. In this paper we present a novel approach to contextualizing multi-sensor data onto a full-motion-video, real-time, 360-degree imaging display. The system described could function as a primary display system for command and control in security, military, and observation posts. It has the ability to process and enable interactive control of multiple other sensor systems. It enhances the value of these other sensors by overlaying their information on a panorama of the surroundings. It can also be used to interface to other systems, including auxiliary electro-optical systems, aerial video, contact management, Hostile Fire Indicators (HFI), and Remote Weapon Stations (RWS).

  2. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological, and remote sensing datasets be assimilated, processed, and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (VisIt, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis, and so on. These capabilities use data collected with sensors and analytical tools from multiple manufacturers, which produce many different measurements. While scientists obviously leverage tools, capabilities, and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable, and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multi-tenant software and provides automation of data ingestion, processing, and visualization of hydrological, geochemical, and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user model in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. 
PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows automated ingestion and processing of electrical geophysical data, as well as co-analysis and visualization of the raw and processed data with other data of interest (e.g., soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  3. Proceedings of the 2004 High Spatial Resolution Commercial Imagery Workshop

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Topics covered include: NASA Applied Sciences Program; USGS Land Remote Sensing: Overview; QuickBird System Status and Product Overview; ORBIMAGE Overview; IKONOS 2004 Calibration and Validation Status; OrbView-3 Spatial Characterization; On-Orbit Modulation Transfer Function (MTF) Measurement of QuickBird; Spatial Resolution Characterization for QuickBird Image Products 2003-2004 Season; Image Quality Evaluation of QuickBird Super Resolution and Revisit of IKONOS: Civil and Commercial Application Project (CCAP); On-Orbit System MTF Measurement; QuickBird Post Launch Geopositional Characterization Update; OrbView-3 Geometric Calibration and Geopositional Accuracy; Geopositional Statistical Methods; QuickBird and OrbView-3 Geopositional Accuracy Assessment; Initial On-Orbit Spatial Resolution Characterization of OrbView-3 Panchromatic Images; Laboratory Measurement of Bidirectional Reflectance of Radiometric Tarps; Stennis Space Center Verification and Validation Capabilities; Joint Agency Commercial Imagery Evaluation (JACIE) Team; Adjacency Effects in High Resolution Imagery; Effect of Pulse Width vs. 
GSD on MTF Estimation; Camera and Sensor Calibration at the USGS; QuickBird Geometric Verification; Comparison of MODTRAN to Heritage-based Results in Vicarious Calibration at University of Arizona; Using Remotely Sensed Imagery to Determine Impervious Surface in Sioux Falls, South Dakota; Estimating Sub-Pixel Proportions of Sagebrush with a Regression Tree; How Do YOU Use the National Land Cover Dataset?; The National Map Hazards Data Distribution System; Recording a Troubled World; What Does This-Have to Do with This?; When Can a Picture Save a Thousand Homes?; InSAR Studies of Alaska Volcanoes; Earth Observing-1 (EO-1) Data Products; Improving Access to the USGS Aerial Film Collections: High Resolution Scanners; Improving Access to the USGS Aerial Film Collections: Phoenix Digitizing System Product Distribution; System and Product Characterization: Issues Approach; Innovative Approaches to Analysis of Lidar Data for the National Map; Changes in Imperviousness near Military Installations; Geopositional Accuracy Evaluations of QuickBird and OrbView-3: Civil and Commercial Applications Project (CCAP); Geometric Accuracy Assessment: OrbView ORTHO Products; QuickBird Radiometric Calibration Update; OrbView-3 Radiometric Calibration; QuickBird Radiometric Characterization; NASA Radiometric Characterization; Establishing and Verifying the Traceability of Remote-Sensing Measurements to International Standards; QuickBird Applications; Airport Mapping and Perpetual Monitoring Using IKONOS; OrbView-3 Relative Accuracy Results and Impacts on Exploitation and Accuracy Improvement; Using Remotely Sensed Imagery to Determine Impervious Surface in Sioux Falls, South Dakota; Applying High-Resolution Satellite Imagery and Remotely Sensed Data to Local Government Applications: Sioux Falls, South Dakota; Automatic Co-Registration of QuickBird Data for Change Detection Applications; Developing Coastal Surface Roughness Maps Using ASTER and QuickBird Data Sources; Automated, 
Near-Real Time Cloud and Cloud Shadow Detection in High Resolution VNIR Imagery; Science Applications of High Resolution Imagery at the USGS EROS Data Center; Draft Plan for Characterizing Commercial Data Products in Support of Earth Science Research; Atmospheric Correction Prototype Algorithm for High Spatial Resolution Multispectral Earth Observing Imaging Systems; Determining Regional Arctic Tundra Carbon Exchange: A Bottom-Up Approach; Using IKONOS Imagery to Assess Impervious Surface Area, Riparian Buffers and Stream Health in the Mid-Atlantic Region; Commercial Remote Sensing Space Policy Civil Implementation Update; USGS Commercial Remote Sensing Data Contracts (CRSDC); and Commercial Remote Sensing Space Policy (CRSSP): Civil Near-Term Requirements Collection Update.

  4. The narrative impact of active video games on physical activity among children: A feasibility study

    USDA-ARS?s Scientific Manuscript database

    Active video games (AVGs) capable of inducing physical activity offer an innovative approach to combating childhood obesity. Unfortunately, children's AVG game play decreases quickly, underscoring the need to identify novel methods for player engagement. Narratives have been demonstrated to influenc...

  5. 46 CFR 197.346 - Diver's equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... using SCUBA must have— (1) Self-contained underwater breathing equipment including— (i) A primary..., gloves, shoes, weight assembly, and knife; (3) Have a hose group consisting of the breathing gas hose and... assembly capable of quick release; (3) A mask group consisting of a lightweight mask and associated valves...

  6. Pyro thruster for performing rocket booster attachment, disconnect, and jettison functions

    NASA Technical Reports Server (NTRS)

    Hornyak, Stephen

    1989-01-01

    The concept of a pyro thruster, combining an automatic structural attachment with quick disconnect and thrusting capability, is described. The purpose of the invention is to simplify booster installation, disengagement, and jettison functions for the U.S. Air Force Advanced Launch Systems (ALS) program.

  7. Handheld Diagnostic Device Delivers Quick Medical Readings

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To monitor astronauts' health remotely, Glenn Research Center awarded SBIR funding to Cambridge, Massachusetts-based DNA Medical Institute, which developed a device capable of analyzing blood cell counts and a variety of medical biomarkers. The technology will prove especially useful in rural areas without easy access to labs.

  8. Locating Changes in Land Use from Long Term Remote Sensing Data in Morocco

    EPA Science Inventory

    We present a method that allows mapping changes in vegetation cover over large areas quickly and inexpensively, thus providing policy makers with the capability to locate and assess areas undergoing environmental change, and improving their ability to positively respond or adapt ...

  9. Developmental Course of Impulsivity and Capability from Age 10 to Age 25 as Related to Trajectory of Suicide Attempt in a Community Cohort

    PubMed Central

    Kasen, Stephanie; Cohen, Patricia; Chen, Henian

    2011-01-01

    Hierarchical linear models were used to examine trajectories of impulsivity and capability between ages 10 and 25 in relation to suicide attempt in 770 youths followed longitudinally: intercepts were set at age 17. The impulsivity measure assessed features of urgency (e.g., poor control, quick provocation, and disregard for external constraints); the capability measure assessed aspects of self-esteem and mastery. Compared to nonattempters, attempters reported significantly higher impulsivity levels with less age-related decline, and significantly lower capability levels with less age-related increase. Independent of other risks, suicide attempt was related significantly to higher impulsivity between ages 10 and 25, especially during the younger years, and lower capability. Implications of those findings for further suicidal behavior and preventive/intervention efforts are discussed. PMID:21342218
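The phrase "intercepts were set at age 17" refers to centering age before fitting, so that the fitted intercept is the predicted level at age 17. A minimal least-squares sketch with invented scores (not the study's data) shows the idea:

```python
def fit_centered(ages, scores, center=17):
    """Ordinary least squares on age centered at `center`; the returned
    intercept is then the predicted score at that age."""
    xs = [a - center for a in ages]
    mx = sum(xs) / len(xs)
    my = sum(scores) / len(scores)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, scores)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return intercept, slope


# Invented impulsivity scores declining between ages 10 and 24.
intercept, slope = fit_centered([10, 17, 24], [8.0, 6.0, 4.0])
```

With this centering, two trajectories can be compared directly at age 17 via their intercepts, while the slopes capture age-related decline or increase.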

  10. Preliminary engineering study: Quick opening valve MSFC high Reynolds number wind tunnel

    NASA Technical Reports Server (NTRS)

    1983-01-01

    FluiDyne Engineering Corporation has conducted a preliminary engineering study of a quick-opening valve for the MSFC High Reynolds Number Wind Tunnel under NASA Contract NAS8-35056. The subject valve is intended to replace the Mylar diaphragm system as the flow initiation device for the tunnel. Only valves capable of opening within 0.05 sec. and providing a minimum of 11.4 square feet of flow area were considered. Also, the study focused on valves which combined the quick-opening and tight shutoff features in a single unit. A ring sleeve valve concept was chosen for refinement and pricing. Sealing for tight shutoff, ring sleeve closure release and sleeve actuation were considered. The resulting cost estimate includes the valve and requisite modifications to the facility to accommodate the valve as well as the associated design and development work.

  11. Development of an Immunochromatography Assay (QuickNavi-Ebola) to Detect Multiple Species of Ebolaviruses

    PubMed Central

    Yoshida, Reiko; Muramatsu, Shino; Akita, Hiroshi; Saito, Yuji; Kuwahara, Miwa; Kato, Daisuke; Changula, Katendi; Miyamoto, Hiroko; Kajihara, Masahiro; Manzoor, Rashid; Furuyama, Wakako; Marzi, Andrea; Feldmann, Heinz; Mweene, Aaron; Masumu, Justin; Kapeteshi, Jimmy; Muyembe-Tamfum, Jean-Jacques; Takada, Ayato

    2016-01-01

    The latest outbreak of Ebola virus disease (EVD) in West Africa has highlighted the urgent need for the development of rapid and reliable diagnostic assays. We used monoclonal antibodies specific to the ebolavirus nucleoprotein to develop an immunochromatography (IC) assay (QuickNavi-Ebola) for rapid diagnosis of EVD. The IC assay was first evaluated with tissue culture supernatants of infected Vero E6 cells and found to be capable of detecting 10³–10⁴ focus-forming units/mL of ebolaviruses. Using serum samples from experimentally infected nonhuman primates, we confirmed that the assay could detect the viral antigen shortly after disease onset. It was also noted that multiple species of ebolaviruses could be detected by the IC assay. Owing to the simplicity of the assay procedure and absence of requirements for special equipment and training, QuickNavi-Ebola is expected to be a useful tool for rapid diagnosis of EVD. PMID:27462094

  12. Fast-pulverization enabled simultaneous enhancement on cycling stability and rate capability of C@NiFe2O4 hierarchical fibrous bundle

    NASA Astrophysics Data System (ADS)

    Chen, Zerui; Zhang, Yu; Wang, Xiaoling; Sun, Wenping; Dou, Shixue; Huang, Xin; Shi, Bi

    2017-09-01

    Electrochemical-grinding-induced pulverization is the origin of capacity fading in NiFe2O4. Increasing the current density normally accelerates the pulverization that deteriorates the lithium storage properties of NiFe2O4. Here we show that high-current-induced fast pulverization can serve as an efficient activation strategy for quick and simultaneous enhancement of the cycling stability and rate capability of NiFe2O4 nanoparticles (NPs) densely packed on a hierarchically structured carbon nanofiber strand. At a high current density, the pulverization of NiFe2O4 NPs is accomplished in a few cycles, exposing more active surface. During the fast pulverization, the hierarchically structured carbon nanofiber strand maintains conductive contact for the densely packed NiFe2O4 NPs throughout charge and discharge. It also effectively suppresses the repetitive breakup and regrowth of the solid-electrolyte interphase (SEI) via multiple-level structural adaptation, favouring the quick formation of a thin and dense SEI and thus providing strong interparticle connectivity with enhanced cycling stability and rate capability (e.g., doubled capacity). Our findings demonstrate the potential importance of high-current-induced fast pulverization as an efficient activation strategy for achieving durable electrode materials that suffer from electrochemical-grinding effects.

  13. How Intuition Contributes to High Performance: An Educational Perspective

    ERIC Educational Resources Information Center

    Harteis, Christian; Koch, Tina; Morgenthaler, Barbara

    2008-01-01

    Intuition usually is defined as the capability to act or decide appropriately without deliberately and consciously balancing alternatives, without following a certain rule or routine, and possibly without awareness (Gigerenzer, 2007; Hogarth, 2001; Klein, 2003; Myers, 2002). It allows action which is quick (e.g. reaction to a challenging…

  14. Wheat productivity estimates using LANDSAT data. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F.; Colwell, J. (Principal Investigator); Rice, D. P.

    1977-01-01

    The author has identified the following significant results. An initial demonstration was made of the capability to make direct production forecasts for winter wheat using early season LANDSAT data. The approach offers the potential to make production forecasts quickly and simply, possibly avoiding some of the complexities of alternate procedures.

  15. Neural network applications in telecommunications

    NASA Technical Reports Server (NTRS)

    Alspector, Joshua

    1994-01-01

    Neural network capabilities include automatic and organized handling of complex information, quick adaptation to continuously changing environments, nonlinear modeling, and parallel implementation. This viewgraph presentation presents Bellcore work on applications, learning chip computational function, learning system block diagram, neural network equalization, broadband access control, calling-card fraud detection, software reliability prediction, and conclusions.

  16. Proofs through Exploration in Dynamic Geometry Environments

    ERIC Educational Resources Information Center

    Christou, Constantinos; Mousoulides, Nikos; Pittalis, Marios; Pitta-Pantazi, Demetra

    2004-01-01

    The recent development of powerful new technologies such as dynamic geometry software (DGS) with drag capability has made possible the continuous variation of geometric configurations and allows one to quickly and easily investigate whether particular conjectures are true or not. Because of the inductive nature of the DGS, the…

  17. Proofs through Exploration in Dynamic Geometry Environments

    ERIC Educational Resources Information Center

    Christou, C.; Mousoulides, N.; Pittalis, M.; Pitta-Pantazi, D.

    2004-01-01

    The recent development of powerful new technologies such as dynamic geometry software (DGS) with drag capability has made possible the continuous variation of geometric configurations and allows one to quickly and easily investigate whether particular conjectures are true or not. Because of the inductive nature of the DGS, the…

  18. Identifying Francisella tularensis genes required for growth in host cells

    USDA-ARS?s Scientific Manuscript database

    Technical Abstract: Francisella tularensis is a highly virulent Gram negative intracellular pathogen capable of infecting a vast diversity of hosts, ranging from amoebae to humans. A hallmark of F. tularensis virulence is its ability to quickly grow to high densities within a diverse set of host cel...

  19. The Road from a Quick Reaction Capability to a Program of Record

    DTIC Science & Technology

    2011-01-26

    Automated Biometric Identification System (DoD-ABIS) is to store, match, and share national biometric resources for adversary (red force) and neutral...Node Relationship Metrics 20090100021 28 MAJOR Mug Book and Lineups 20090100022 29 MAJOR US Persons Marking 20090100023 30 MAJOR Detention Data in

  20. The role of motion analysis in elite soccer: contemporary performance measurement techniques and work rate data.

    PubMed

    Carling, Christopher; Bloomfield, Jonathan; Nelsen, Lee; Reilly, Thomas

    2008-01-01

    The optimal physical preparation of elite soccer (association football) players has become an indispensable part of the professional game, especially due to the increased physical demands of match-play. The monitoring of players' work rate profiles during competition is now feasible through computer-aided motion analysis. Traditional methods of motion analysis were extremely labour intensive and were largely restricted to university-based research projects. Recent technological developments have meant that sophisticated systems, capable of quickly recording and processing the data of all players' physical contributions throughout an entire match, are now being used in elite club environments. In recognition of the important role that motion analysis now plays as a tool for measuring the physical performance of soccer players, this review critically appraises various motion analysis methods currently employed in elite soccer and explores research conducted using these methods. This review therefore aims to increase the awareness of both practitioners and researchers of the various motion analysis systems available, and identify practical implications of the established body of knowledge, while highlighting areas that require further exploration.

  1. Engineering visualization utilizing advanced animation

    NASA Technical Reports Server (NTRS)

    Sabionski, Gunter R.; Robinson, Thomas L., Jr.

    1989-01-01

    Engineering visualization is the use of computer graphics to depict engineering analysis and simulation in visual form from project planning through documentation. Graphics displays let engineers see data represented dynamically, which permits the quick evaluation of results. The current state of graphics hardware and software generally allows the creation of two types of 3D graphics. The use of animated video as an engineering visualization tool is presented. The engineering, animation, and videography aspects of animated video production are each discussed. Specific issues include the integration of staffing expertise, hardware, software, and the various production processes. A detailed explanation of the animation process reveals the capabilities of this unique engineering visualization method. Automation of the animation and video production processes is covered, and future directions are proposed.

  2. gadfly: A pandas-based Framework for Analyzing GADGET Simulation Data

    NASA Astrophysics Data System (ADS)

    Hummel, Jacob A.

    2016-11-01

    We present the first public release (v0.1) of the open-source gadget Dataframe Library: gadfly. The aim of this package is to leverage the capabilities of the broader python scientific computing ecosystem by providing tools for analyzing simulation data from the astrophysical simulation codes gadget and gizmo using pandas, a thoroughly documented, open-source library providing high-performance, easy-to-use data structures that is quickly becoming the standard for data analysis in python. Gadfly is a framework for analyzing particle-based simulation data stored in the HDF5 format using pandas DataFrames. The package enables efficient memory management, includes utilities for unit handling, coordinate transformations, and parallel batch processing, and provides highly optimized routines for visualizing smoothed-particle hydrodynamics data sets.
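The pandas-centric analysis style the abstract describes can be illustrated with a toy particle table; the column names and values below are invented for illustration and are not gadfly's actual schema.

```python
import pandas as pd

# Toy particle data in a DataFrame, standing in for a snapshot loaded from
# an HDF5 file; columns and values are invented for illustration.
particles = pd.DataFrame({
    "x":    [0.0, 1.0, 2.0],
    "y":    [0.0, 1.0, 0.0],
    "mass": [1.0, 2.0, 1.0],
})

# A typical quick-look quantity: the mass-weighted centroid along x.
centroid_x = (particles["x"] * particles["mass"]).sum() / particles["mass"].sum()
```

Once the simulation output is in a DataFrame, the usual pandas selection, grouping, and aggregation idioms apply directly, which is the appeal of this approach.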

  3. Real time computer data system for the 40 x 80 ft wind tunnel facility at Ames Research Center

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Tolari, G. P.

    1974-01-01

    The wind tunnel real-time computer system is a distributed data gathering system that features a master computer subsystem, a high-speed data gathering subsystem, a quick-look dynamic analysis and vibration control subsystem, an analog recording back-up subsystem, a pulse code modulation (PCM) on-board subsystem, a communications subsystem, and a transducer excitation and calibration subsystem. The subsystems are connected to the master computer through an executive software system and standard hardware and FORTRAN software interfaces. The executive software system has four basic software routines: the playback, setup, record, and monitor routines. The standard hardware interfaces, along with the software interfaces, give the system the capability of adapting to new environments.

  4. The Role of Data Archives in Synoptic Solar Physics

    NASA Astrophysics Data System (ADS)

    Reardon, Kevin

    The detailed study of solar cycle variations requires analysis of recorded datasets spanning many years of observations, that is, a data archive. The use of digital data, combined with powerful database server software, gives such archives new capabilities to provide, quickly and flexibly, selected pieces of information to scientists. Use of standardized protocols will allow multiple databases, independently maintained, to be seamlessly joined, allowing complex searches spanning multiple archives. These data archives also benefit from being developed in parallel with the telescope itself, which helps to assure data integrity and to provide close integration between the telescope and archive. Development of archives that can guarantee long-term data availability and strong compatibility with other projects makes solar-cycle studies easier to plan and realize.

  5. Communication interference/jamming and propagation analysis system and its application to radio location

    NASA Astrophysics Data System (ADS)

    Kuzucu, H.

    1992-11-01

    Modern defense systems depend on comprehensive surveillance capability. The ability to detect and locate radio signals is a major element of a surveillance system. With the increasing need for more mobile surveillance systems in conjunction with the rapid deployment of forces, and the advent of technology allowing more enhanced use of small aperture systems, tactical direction finding (DF) and radiolocation systems will have to be operated in diverse operational conditions. A quick assessment of the expected error levels and an evaluation of the reliability of the fixes on the targeted areas are of crucial importance to the effectiveness of missions relying on DF data. This paper presents a sophisticated, graphics-workstation-based computer tool developed for the system-level analysis of radio communication systems and describes its use in radiolocation applications for realizing such accurate and realistic assessments with substantial savings in money and time.

  6. A Gateway for Phylogenetic Analysis Powered by Grid Computing Featuring GARLI 2.0

    PubMed Central

    Bazinet, Adam L.; Zwickl, Derrick J.; Cummings, Michael P.

    2014-01-01

    We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. [garli, gateway, grid computing, maximum likelihood, molecular evolution portal, phylogenetics, web service.] PMID:24789072

  7. A gateway for phylogenetic analysis powered by grid computing featuring GARLI 2.0.

    PubMed

    Bazinet, Adam L; Zwickl, Derrick J; Cummings, Michael P

    2014-09-01

    We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  8. OverPlotter: A Utility for Herschel Data Processing

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Mei, Y.; Schulz, B.

    2008-08-01

    The OverPlotter utility is a GUI tool written in Java to support interactive data processing (DP) and analysis for the Herschel Space Observatory within the framework of the Herschel Common Science System (HCSS) (Wieprecht et al. 2004). The tool expands upon the capabilities of the TableViewer (Zhang & Schulz 2005), now also providing the means to create additional overlays of several X/Y scatter plots within the same display area. These layers can be scaled and panned, either individually or together as one graph. Visual comparison of data with different origins and units becomes much easier. The number of available layers is not limited, except by computer memory and performance. Presentation images can be easily created by adding annotations, labeling layers and setting colors. The tool will be especially helpful in the early phases of Herschel data analysis, when quick access to the contents of data products is important.

  9. Image reconstruction of muon tomographic data using a density-based clustering method

    NASA Astrophysics Data System (ADS)

    Perry, Kimberly B.

    Muons are subatomic particles capable of reaching the Earth's surface before decaying. When these particles collide with an object that has a high atomic number (Z), their path of travel changes substantially. Tracking muon movement through shielded containers can indicate what types of materials lie inside. This thesis proposes using a density-based clustering algorithm called OPTICS to perform image reconstructions using muon tomographic data. The results show that this method is capable of detecting high-Z materials quickly, and can also produce detailed reconstructions with large amounts of data.
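The density-based idea can be sketched with a minimal DBSCAN-style routine (OPTICS itself additionally orders points by reachability distance, which this sketch omits); the 2-D scattering points below are invented.

```python
from math import dist

def density_clusters(points, eps, min_pts):
    """Label points that belong to dense regions; sparse points stay unlabeled."""
    labels = {}  # point index -> cluster id; noise points are left out
    cluster_id = 0
    for i in range(len(points)):
        if i in labels:
            continue
        neighbors = [j for j in range(len(points)) if dist(points[i], points[j]) <= eps]
        if len(neighbors) < min_pts:
            continue  # not a core point
        # Grow a new cluster outward from this core point.
        frontier = list(neighbors)
        for j in frontier:
            if j in labels:
                continue
            labels[j] = cluster_id
            reachable = [k for k in range(len(points)) if dist(points[j], points[k]) <= eps]
            if len(reachable) >= min_pts:
                frontier.extend(reachable)
        cluster_id += 1
    return labels


# A dense knot of points (e.g., scattering near a high-Z object) plus one stray.
points = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1), (5.0, 5.0)]
labels = density_clusters(points, eps=0.5, min_pts=3)
```

In the tomographic setting, the dense cluster marks a candidate high-Z region, while the unlabeled stray point is treated as noise.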

  10. Developmental course of impulsivity and capability from age 10 to age 25 as related to trajectory of suicide attempt in a community cohort.

    PubMed

    Kasen, Stephanie; Cohen, Patricia; Chen, Henian

    2011-04-01

    Hierarchical linear models were used to examine trajectories of impulsivity and capability between ages 10 and 25 in relation to suicide attempt in 770 youths followed longitudinally: intercepts were set at age 17. The impulsivity measure assessed features of urgency (e.g., poor control, quick provocation, and disregard for external constraints); the capability measure assessed aspects of self-esteem and mastery. Compared to nonattempters, attempters reported significantly higher impulsivity levels with less age-related decline, and significantly lower capability levels with less age-related increase. Independent of other risks, suicide attempt was related significantly to higher impulsivity between ages 10 and 25, especially during the younger years, and lower capability. Implications of those findings for further suicidal behavior and preventive/intervention efforts are discussed. © 2011 The American Association of Suicidology.

  11. Laser Direct Routing for High Density Interconnects

    NASA Astrophysics Data System (ADS)

    Moreno, Wilfrido Alejandro

    The laser restructuring of electronic circuits fabricated using standard Very Large Scale Integration (VLSI) process techniques is an excellent alternative that allows low-cost, quick-turnaround production with full circuit similarity between the laser-restructured prototype and the customized product for mass production. Laser Restructurable VLSI (LRVLSI) would give design engineers the capability to interconnect cells that implement generic logic functions and signal processing schemes to achieve a higher level of design complexity. LRVLSI of a particular circuit at the wafer or packaged-chip level is accomplished using an integrated, computer-controlled laser system to create low-electrical-resistance links between conductors and to cut conductor lines. An infrastructure for rapid prototyping and quick turnaround using laser restructuring of VLSI circuits was developed to meet three main parallel objectives: to pursue research on novel interconnect technologies using LRVLSI, to develop the capability of operating in a quick-turnaround mode, and to maintain standardization and compatibility with commercially available equipment for feasible technology transfer. The system was designed to possess a high degree of flexibility, high data quality, total controllability, full documentation, short downtime, a user-friendly operator interface, automation, historical record keeping, and error indication and logging. A specially designed chip, "SLINKY," was used as the test vehicle for the complete characterization of the laser restructuring system. Using design-of-experiment techniques, the Lateral Diffused Link (LDL), developed originally at MIT Lincoln Laboratories, was completely characterized, and for the first time a set of optimum process parameters was obtained.
With the designed infrastructure fully operational, the priority objective was the search for a substitute for the high-resistance, high-current-leakage-to-substrate, and relatively low-density Lateral Diffused Link. A high-density laser vertical link with resistance values below 10 ohms was developed, studied and tested using design-of-experiment methodologies. The vertical link offers excellent advantages for quick prototyping of electronic circuits but, even more important, because its characteristics are similar to those of a foundry-produced via, it allows a quick transition from the prototype system verification stage to the mass production stage.

  12. The PROMIS physical function correlates with the QuickDASH in patients with upper extremity illness.

    PubMed

    Overbeek, Celeste L; Nota, Sjoerd P F T; Jayakumar, Prakash; Hageman, Michiel G; Ring, David

    2015-01-01

    To assess disability more efficiently with less burden on the patient, the National Institutes of Health has developed the Patient Reported Outcomes Measurement Information System (PROMIS) Physical Function, an instrument based on item response theory that uses computer adaptive testing (CAT). Initially, upper and lower extremity disabilities were not separated, and we were curious whether the PROMIS Physical Function CAT could measure upper extremity disability as well as the Quick Disabilities of the Arm, Shoulder and Hand (QuickDASH) questionnaire. We aimed to assess the correlation between the PROMIS Physical Function and QuickDASH questionnaires in patients with upper extremity illness. Secondarily, we addressed whether the PROMIS Physical Function and QuickDASH correlate with the PROMIS Depression CAT and PROMIS Pain Interference CAT instruments. Finally, we assessed factors associated with QuickDASH and PROMIS Physical Function in multivariable analysis. A cohort of 93 outpatients with upper extremity illnesses completed the QuickDASH and three PROMIS CAT questionnaires: Physical Function, Pain Interference, and Depression. Pain intensity was measured with an 11-point ordinal measure (0-10 numeric rating scale). Correlation between PROMIS Physical Function and the QuickDASH was assessed. Factors that correlated with the PROMIS Physical Function and QuickDASH were assessed in multivariable regression analysis after initial bivariate analysis. There was a moderate correlation between the PROMIS Physical Function and the QuickDASH questionnaire (r=-0.55, p<0.001). Greater disability as measured with the PROMIS and QuickDASH correlated most strongly with PROMIS Depression (r=-0.35, p<0.001 and r=0.34, p<0.001 respectively) and Pain Interference (r=-0.51, p<0.001 and r=0.74, p<0.001 respectively). 
The factors accounting for the variability in PROMIS scores are comparable to those for the QuickDASH except that the PROMIS Physical Function is influenced by other pain conditions while the QuickDASH is not. The PROMIS Physical Function instrument may be used as an upper extremity disability measure, as it correlates with the QuickDASH questionnaire, and both instruments are influenced most strongly by the degree to which pain interferes with achieving goals. Level III, diagnostic study. See the Instructions for Authors for a complete description of levels of evidence.
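The reported correlations (e.g., r = -0.55 between the two instruments) are Pearson coefficients; a minimal sketch with invented scores shows the computation:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


# Invented scores: higher physical function paired with lower disability,
# giving a perfect negative correlation for this toy data.
r = pearson_r([1.0, 2.0, 3.0, 4.0], [10.0, 8.0, 6.0, 4.0])
```

A negative r between a function score and a disability score, as in the study, simply reflects that the two scales run in opposite directions.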

  13. The Invasive Species Forecasting System

    NASA Technical Reports Server (NTRS)

    Schnase, John; Most, Neal; Gill, Roger; Ma, Peter

    2011-01-01

    The Invasive Species Forecasting System (ISFS) provides computational support for the generic work processes found in many regional-scale ecosystem modeling applications. Decision support tools built using ISFS allow a user to load point occurrence field sample data for a plant species of interest and quickly generate habitat suitability maps for geographic regions of management concern, such as a national park, monument, forest, or refuge. This type of decision product helps resource managers plan invasive species protection, monitoring, and control strategies for the lands they manage. Until now, scientists and resource managers have lacked the data-assembly and computing capabilities to produce these maps quickly and cost efficiently. ISFS focuses on regional-scale habitat suitability modeling for invasive terrestrial plants. ISFS's component architecture emphasizes simplicity and adaptability. Its core services can be easily adapted to produce model-based decision support tools tailored to particular parks, monuments, forests, refuges, and related management units. ISFS can be used to build standalone run-time tools that require no connection to the Internet, as well as fully Internet-based decision support applications. ISFS provides the core data structures, operating system interfaces, network interfaces, and inter-component constraints comprising the canonical workflow for habitat suitability modeling. The predictors, analysis methods, and geographic extents involved in any particular model run are elements of the user space and arbitrarily configurable by the user. ISFS provides small, lightweight, readily hardened core components of general utility. These components can be adapted to unanticipated uses, are tailorable, and require at most a loosely coupled, nonproprietary connection to the Web. Users can invoke capabilities from a command line; programmers can integrate ISFS's core components into more complex systems and services. 
Taken together, these features enable a degree of decentralization and distributed ownership that have helped other types of scientific information services succeed in recent years.
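The habitat-suitability workflow described above (load point occurrences, score a geographic region) can be sketched with a minimal BIOCLIM-style climate-envelope model. This is a generic illustration, not ISFS code; the predictor names, values, and quantile thresholds are invented for the example.

```python
import numpy as np

def envelope_suitability(occurrence_values, grid_values, lo_q=0.05, hi_q=0.95):
    """BIOCLIM-style envelope model: a grid cell is scored suitable if every
    predictor falls within the central quantile range of the occurrence samples.
    occurrence_values: (n_points, n_predictors) values at known occurrences.
    grid_values: (n_cells, n_predictors) values at the cells to score."""
    lo = np.quantile(occurrence_values, lo_q, axis=0)
    hi = np.quantile(occurrence_values, hi_q, axis=0)
    inside = (grid_values >= lo) & (grid_values <= hi)
    return inside.all(axis=1).astype(float)  # 1.0 = suitable, 0.0 = not

# Hypothetical predictors: mean temperature (C) and annual rainfall (mm)
occ = np.array([[12.0, 400], [14.0, 500], [13.0, 450], [15.0, 520]])
grid = np.array([[13.5, 480],   # inside the envelope
                 [25.0, 480],   # too warm
                 [13.5, 100]])  # too dry
print(envelope_suitability(occ, grid))
```

A real suitability model would use many more predictors and a statistical fit; the point here is only the shape of the computation: occurrences in, a per-cell suitability score out.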

  14. The expected results method for data verification

    NASA Astrophysics Data System (ADS)

    Monday, Paul

    2016-05-01

The credibility of United States Army analytical experiments using distributed simulation depends on the quality of the simulation, the pedigree of the input data, and the appropriateness of the simulation system to the problem. The second of these factors is best met by using classified performance data from the Army Materiel Systems Analysis Activity (AMSAA) for essential battlefield behaviors, like sensors, weapon fire, and damage assessment. Until recently, using classified data has been a time-consuming and expensive endeavor: the data require significant technical expertise to load, and it is difficult to verify that they work correctly. Fortunately, new capabilities, tools, and processes are available that greatly reduce these costs. This paper will discuss these developments, a new method to verify that all of the components are configured and operate properly, and the application to recent Army Capabilities Integration Center (ARCIC) experiments. Recent developments have focused on improving the process of loading the data. OneSAF has redesigned its input data file formats and structures so that they correspond exactly with the Standard File Format (SFF) defined by AMSAA, ARCIC developed a library of supporting configurations that correlate directly to the AMSAA nomenclature, and the Entity Validation Tool was designed to quickly execute the essential models with a test-jig approach to identify problems with the loaded data. The missing part of the process is provided by the new Expected Results Method. Instead of the usual subjective assessment of quality, e.g., "It looks about right to me", this new approach compares the performance of a combat model with authoritative expectations to quickly verify that the model, data, and simulation are all working correctly.
Integrated together, these developments now make it possible to use AMSAA classified performance data with minimal time and maximum assurance that the experiment's analytical results will be of the highest quality possible.
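The comparison at the heart of the Expected Results Method, checking a model's outputs against authoritative expectations rather than eyeballing them, can be sketched as a simple tolerance test. The metric names and the 10% tolerance below are illustrative assumptions, not AMSAA values.

```python
def verify_against_expectations(observed, expected, rel_tol=0.10):
    """Compare observed model outputs with authoritative expected results.
    Returns (passed, report): a metric passes if it lies within rel_tol
    (fractional) of its expectation. Metric names are purely illustrative."""
    report = {}
    for metric, exp_val in expected.items():
        obs_val = observed.get(metric)
        if obs_val is None:
            report[metric] = "missing"
            continue
        ok = abs(obs_val - exp_val) <= rel_tol * abs(exp_val)
        report[metric] = "pass" if ok else "fail"
    passed = all(v == "pass" for v in report.values())
    return passed, report

expected = {"p_hit_1500m": 0.62, "p_detect_2000m": 0.80}   # hypothetical
observed = {"p_hit_1500m": 0.60, "p_detect_2000m": 0.55}   # model run
ok, report = verify_against_expectations(observed, expected)
print(ok, report)
```

The value of the approach is that a failing metric points directly at the model or data component to investigate, instead of a vague "looks about right".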

  15. ANNA: A Convolutional Neural Network Code for Spectroscopic Analysis

    NASA Astrophysics Data System (ADS)

    Lee-Brown, Donald; Anthony-Twarog, Barbara J.; Twarog, Bruce A.

    2018-01-01

    We present ANNA, a Python-based convolutional neural network code for the automated analysis of stellar spectra. ANNA provides a flexible framework that allows atmospheric parameters such as temperature and metallicity to be determined with accuracies comparable to those of established but less efficient techniques. ANNA performs its parameterization extremely quickly; typically several thousand spectra can be analyzed in less than a second. Additionally, the code incorporates features which greatly speed up the training process necessary for the neural network to measure spectra accurately, resulting in a tool that can easily be run on a single desktop or laptop computer. Thus, ANNA is useful in an era when spectrographs increasingly have the capability to collect dozens to hundreds of spectra each night. This talk will cover the basic features included in ANNA and demonstrate its performance in two use cases: an open cluster abundance analysis involving several hundred spectra, and a metal-rich field star study. Applicability of the code to large survey datasets will also be discussed.
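The abstract does not describe ANNA's architecture; as a sketch of the core operation such a network applies to a spectrum, here is a plain-NumPy 1D convolution sliding a small kernel across a synthetic flux array (a toy illustration, not ANNA's code).

```python
import numpy as np

def conv1d(spectrum, kernel, stride=1):
    """Valid-mode 1D convolution: the building block a spectroscopic CNN
    slides across the flux array to extract local line features."""
    n, k = len(spectrum), len(kernel)
    out = [float(np.dot(spectrum[i:i + k], kernel))
           for i in range(0, n - k + 1, stride)]
    return np.array(out)

def relu(x):
    """Standard rectified-linear activation applied after the convolution."""
    return np.maximum(x, 0.0)

# Toy "spectrum": flat continuum with one absorption line at index 5
flux = np.ones(11)
flux[5] = 0.4
edge_kernel = np.array([-1.0, 0.0, 1.0])  # responds to the line's edges
features = relu(conv1d(flux, edge_kernel))
print(features)
```

Stacks of such convolutions, trained on labeled spectra, are what let a network map flux arrays to parameters like temperature and metallicity at the throughput the abstract quotes.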

  16. SUPIN: A Computational Tool for Supersonic Inlet Design

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2016-01-01

    A computational tool named SUPIN is being developed to design and analyze the aerodynamic performance of supersonic inlets. The inlet types available include the axisymmetric pitot, three-dimensional pitot, axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flow-field is divided into parts to provide a framework for the geometry and aerodynamic modeling. Each part of the inlet is defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick design and analysis. SUPIN provides inlet geometry in the form of coordinates, surface angles, and cross-sectional areas. SUPIN can generate inlet surface grids and three-dimensional, structured volume grids for use with higher-fidelity computational fluid dynamics (CFD) analysis. Capabilities highlighted in this paper include the design and analysis of streamline-traced external-compression inlets, modeling of porous bleed, and the design and analysis of mixed-compression inlets. CFD analyses are used to verify the SUPIN results.
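The analytic low-fidelity methods mentioned typically rest on classic compressible-flow relations. As one illustration (not SUPIN's actual implementation), the total-pressure recovery across a normal shock, a basic ingredient of external-compression inlet performance estimates:

```python
import math

def normal_shock_recovery(M1, gamma=1.4):
    """Total-pressure ratio pt2/pt1 across a normal shock at upstream Mach M1,
    from the Rankine-Hugoniot relations for a calorically perfect gas."""
    if M1 <= 1.0:
        return 1.0  # no shock below Mach 1
    a = ((gamma + 1.0) * M1**2 / ((gamma - 1.0) * M1**2 + 2.0)) ** (gamma / (gamma - 1.0))
    b = ((gamma + 1.0) / (2.0 * gamma * M1**2 - (gamma - 1.0))) ** (1.0 / (gamma - 1.0))
    return a * b

# Recovery degrades quickly with Mach number, which is why supersonic
# inlets stage the compression through oblique shocks.
print(round(normal_shock_recovery(2.0), 4))
```

Chaining such relations over the inlet stations is the kind of quick analytic estimate that precedes the CFD verification the abstract describes.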

  17. Generic trending and analysis system

    NASA Technical Reports Server (NTRS)

    Keehan, Lori; Reese, Jay

    1994-01-01

The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts; then, the systems were supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long-term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.

  18. ISOON + SOLIS: Merging the Data Products

    NASA Astrophysics Data System (ADS)

    Radick, R.; Dalrymple, N.; Mozer, J.; Wiborg, P.; Harvey, J.; Henney, C.; Neidig, D.

    2005-05-01

The combination of AFRL's ISOON and NSO's SOLIS offers significantly greater capability than the individual instruments. We are working toward merging the SOLIS and ISOON data products in a single central facility. The ISOON system currently includes both an observation facility and a remote analysis center (AC). The AC is capable of receiving data from both the ISOON observation facility and external sources. It archives the data and displays corrected images and time-lapse animations. The AC has a large number of digital tools that can be applied to solar images to provide quantitative information quickly and easily. Because of its convenient tools and ready archival capability, the ISOON AC is a natural place to merge products from SOLIS and ISOON. We have completed a preliminary integration of the ISOON and SOLIS data products. Eventually, we intend to distribute viewing stations to various users and academic institutions, install the AC software tools at a number of user locations, and publish ISOON/SOLIS data products jointly on a common web page. In addition, SOLIS data products, separately, are and will continue to be fully available on the NSO's Digital Library and SOLIS web pages, and via the Virtual Solar Observatory. This work is being supported by the National Science Foundation and the Air Force Office of Scientific Research.

  19. Defining Tsunami Magnitude as Measure of Potential Impact

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Tang, L.

    2016-12-01

    The goal of tsunami forecast, as a system for predicting potential impact of a tsunami at coastlines, requires quick estimate of a tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, difficulties of estimating tsunami energy based on available tsunami measurements at coastal sea-level stations has carried significant uncertainties and has been virtually impossible in real time, before tsunami impacts coastlines. The slow process of tsunami magnitude estimates, including collection of vast amount of available coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scales in tsunami warning operations. Uncertainties of estimates made tsunami magnitudes difficult to use as universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy of tsunami impact estimates, since real-time seismic data is available of real-time processing and ample amount of seismic data is available for an elaborate post event analysis. This measure of tsunami impact carries significant uncertainties in quantitative tsunami impact estimates, since the relation between the earthquake and generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing robust tsunami magnitude that will be useful for tsunami warning as a quick estimate for tsunami impact and for post-event analysis as a universal scale for tsunamis inter-comparison. We present a method for estimating the tsunami magnitude based on tsunami energy and present application of the magnitude analysis for several historical events for inter-comparison with existing methods.
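The abstract does not state the magnitude formula. As a purely illustrative sketch, an energy-based magnitude can be defined on a logarithmic scale analogous to moment magnitude; the coefficients below are placeholders, not the authors' calibration.

```python
import math

def tsunami_magnitude(energy_joules, a=2.0 / 3.0, c=3.2):
    """Illustrative energy-based magnitude: Mt = a*log10(E) - c.
    The coefficients a and c are arbitrary placeholders chosen only to
    show the structure of a log-energy scale."""
    return a * math.log10(energy_joules) - c

# On any such scale, a tenfold increase in tsunami energy raises the
# magnitude by a fixed increment (here a, about 0.67 units).
m1 = tsunami_magnitude(1.0e15)
m2 = tsunami_magnitude(1.0e16)
print(round(m2 - m1, 2))
```

The operational point of the paper is that modern deep-ocean measurements and real-time modeling make the energy term computable before the tsunami reaches the coast, which is what makes such a scale usable for warnings.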

  20. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated, optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating-to-mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. Finally, the software is managed in accordance with the CMMI (Capability Maturity Model Integration), where it has been appraised at maturity level 3.

  1. The role of the U.S. Army Medical Department in domestic disaster assistance operations - lessons learned from hurricane Andrew.

    DOT National Transportation Integrated Search

    1996-04-01

    Hurricane Andrew, which struck South Dade County, Florida on the morning of 24 August 1992, was the "worst natural disaster ever to hit the United States..." The capabilities of the local and state governments to respond to the disaster were quickly ...

  2. Quick-Change Ceramic Flame Holder for High-Output Torches

    NASA Technical Reports Server (NTRS)

    Haskin, Henry

    2010-01-01

Researchers at NASA's Langley Research Center have developed a new ceramic flame holder design with a service temperature of 4,000 F (2,204 C). The combination of high strength and high-temperature capability, as well as a twist-lock mounting method to the steel burner, sets this flame holder apart from existing technology.

  3. Chief of Naval Air Training Resource Planning System (RPS).

    ERIC Educational Resources Information Center

    Hodak, Gary W.; And Others

    The Resource Planning System (RPS) provides the Chief of Naval Air Training (CNATRA) with the capability to determine the resources required to produce a specified number of Naval Aviators and Naval Flight Officers (NAs/NFOs) quickly and efficiently. The training of NAs and NFOs is extremely time consuming and complex. It requires extensive…

  4. A Framework for Mobile Apps in Colleges and Universities: Data Mining Perspective

    ERIC Educational Resources Information Center

    Singh, Archana; Ranjan, Jayanthi

    2016-01-01

    The Enterprise mobility communication technology provides easy and quick accessibility to data and information integrated into one single touch point device. This device incorporates or integrates all the processes into small applications or App and thus increases the workforce capability of knowledge workers. "App" which is a small set…

  5. Large scale track analysis for wide area motion imagery surveillance

    NASA Astrophysics Data System (ADS)

    van Leeuwen, C. J.; van Huis, J. R.; Baan, J.

    2016-10-01

Wide Area Motion Imagery (WAMI) enables image-based surveillance of areas that can cover multiple square kilometers. Interpreting and analyzing information from such sources becomes increasingly time consuming as more data are added from newly developed methods for information extraction. Captured from a moving Unmanned Aerial Vehicle (UAV), the high-resolution images allow detection and tracking of moving vehicles, but this is a highly challenging task. By using a chain of computer vision detectors and machine learning techniques, we are capable of producing high-quality track information for more than 40,000 vehicles per five minutes. When faced with such a vast number of vehicular tracks, it is useful for analysts to be able to quickly query information based on region of interest, color, maneuvers, or other high-level types of information, to gain insight and find relevant activities in the flood of information. In this paper we propose a set of tools, combined in a graphical user interface, that allows data analysts to survey vehicles in a large observed area. In order to retrieve (parts of) images from the high-resolution data, we developed a multi-scale tile-based video file format that allows quick retrieval of only a part, or a sub-sampling, of the original high-resolution image. By storing tiles of a still image according to a predefined order, we can quickly retrieve a particular region of the image at any relevant scale by skipping to the correct frames and reconstructing the image. Location-based queries allow a user to select tracks around a particular region of interest such as a landmark, building, or street. By using an integrated search engine, users can quickly select tracks that are in the vicinity of locations of interest. Another time-reducing method when searching for a particular vehicle is to filter on color or color intensity.
Automatic maneuver detection adds information to the tracks that can be used to find vehicles based on their behavior.
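The multi-scale tile store is not specified in detail in the abstract; the tile lookup it implies can be sketched as follows, assuming power-of-two pyramid levels and fixed-size square tiles (both assumptions of this example, not details from the paper).

```python
def tiles_for_region(x0, y0, x1, y1, level, tile=256):
    """Return (tile_col, tile_row) pairs covering the pixel box
    [x0, x1) x [y0, y1), given in full-resolution coordinates, at a pyramid
    level where level 0 is full resolution and each level halves the size.
    Knowing the tile order lets a reader skip directly to these tiles."""
    scale = 2 ** level
    sx0, sy0 = x0 // scale, y0 // scale            # downsampled corners
    sx1, sy1 = (x1 - 1) // scale, (y1 - 1) // scale
    cols = range(sx0 // tile, sx1 // tile + 1)
    rows = range(sy0 // tile, sy1 // tile + 1)
    return [(c, r) for r in rows for c in cols]

# A 600x600-pixel region at level 1 (half resolution) spans 300x300
# downsampled pixels, so it is covered by four 256-pixel tiles here.
print(tiles_for_region(0, 0, 600, 600, level=1))
```

Because only these few tiles are read and decoded, a viewer can pan and zoom over the full WAMI frame without ever touching most of the image data.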

  6. Process monitoring and control with CHEMIN, a miniaturized CCD-based instrument for simultaneous XRD/XRF analysis

    NASA Astrophysics Data System (ADS)

    Vaniman, David T.; Bish, D.; Guthrie, G.; Chipera, S.; Blake, David E.; Collins, S. Andy; Elliott, S. T.; Sarrazin, P.

    1999-10-01

There is a large variety of mining and manufacturing operations where process monitoring and control can benefit from on-site analysis of both chemical and mineralogic constituents. CHEMIN is a CCD-based instrument capable of both X-ray fluorescence (XRF; chemical) and X-ray diffraction (XRD; mineralogic) analysis. Monitoring and control with an instrument like CHEMIN can be applied to feedstocks, intermediate materials, and final products to optimize production. Examples include control of cement feedstock, of ore for smelting, and of minerals that pose inhalation hazards in the workplace. The combined XRD/XRF capability of CHEMIN can be used wherever a desired commodity is associated with unwanted constituents that may be similar in chemistry or structure but not both (e.g., Ca in both gypsum and feldspar, where only the gypsum is desired to make wallboard). In the mining industry, CHEMIN can determine mineral abundances on the spot and enable more economical mining by providing the means to assay what is being mined, quickly and frequently, at minimal cost. In manufacturing, CHEMIN could be used to spot-check the chemical composition and crystalline makeup of a product at any stage of production. Analysis by CHEMIN can be used as feedback in manufacturing processes where rates of heating, process temperature, mixture of feedstocks, and other variables must be adjusted in real time to correct structure and/or chemistry of the product (e.g., prevention of periclase and alkali sulfate coproduction in cement manufacture).
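The XRD side of such an instrument identifies minerals from their lattice spacings via Bragg's law; a small illustration (not CHEMIN's processing code) converting a measured diffraction angle to a d-spacing:

```python
import math

def bragg_d_spacing(two_theta_deg, wavelength_angstrom=1.5406, order=1):
    """Bragg's law n*lambda = 2*d*sin(theta): return the lattice spacing d
    (in angstroms) for a measured diffraction angle 2-theta.
    1.5406 A is the common Cu K-alpha1 X-ray wavelength."""
    theta = math.radians(two_theta_deg / 2.0)
    return order * wavelength_angstrom / (2.0 * math.sin(theta))

# Quartz's strongest line near 2-theta = 26.64 deg corresponds to the
# well-known d ~ 3.34 A spacing of its (101) planes.
print(round(bragg_d_spacing(26.64), 2))
```

Matching a set of such d-spacings (and their intensities) against a reference database is what turns a diffraction pattern into the on-the-spot mineral abundances the abstract describes.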

  7. Interferometer for measuring the dynamic surface topography of a human tear film

    NASA Astrophysics Data System (ADS)

    Primeau, Brian C.; Greivenkamp, John E.

    2012-03-01

    The anterior refracting surface of the eye is the thin tear film that forms on the surface of the cornea. Following a blink, the tear film quickly smoothes and starts to become irregular after 10 seconds. This irregularity can affect comfort and vision quality. An in vivo method of characterizing dynamic tear films has been designed based upon a near-infrared phase-shifting interferometer. This interferometer continuously measures light reflected from the tear film, allowing sub-micron analysis of the dynamic surface topography. Movies showing the tear film behavior can be generated along with quantitative metrics describing changes in the tear film surface. This tear film measurement allows analysis beyond capabilities of typical fluorescein visual inspection or corneal topography and provides better sensitivity and resolution than shearing interferometry methods. The interferometer design is capable of identifying features in the tear film much less than a micron in height with a spatial resolution of about ten microns over a 6 mm diameter. This paper presents the design of the tear film interferometer along with the considerations that must be taken when designing an interferometer for on-eye diagnostics. Discussions include eye movement, design of null optics for a range of ocular geometries, and laser emission limits for on-eye interferometry.
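Phase-shifting interferometry of the kind described recovers the wrapped surface phase from a few intensity frames; a minimal sketch of the standard four-step algorithm follows (a generic textbook relation, not the instrument's actual pipeline).

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Standard four-step phase-shifting algorithm: frames captured at
    reference phase shifts of 0, 90, 180, and 270 degrees give the wrapped
    test-surface phase via atan2."""
    return math.atan2(i4 - i2, i1 - i3)

def frames(phi, bias=2.0, mod=1.0):
    """Synthesize the four interferogram intensities for a known phase phi:
    I_k = bias + mod * cos(phi + k*pi/2)."""
    return [bias + mod * math.cos(phi + k * math.pi / 2.0) for k in range(4)]

phi_true = 0.7
phi_meas = four_step_phase(*frames(phi_true))
print(round(phi_meas, 6))
```

Applied per pixel across each camera frame, this is how a sequence of intensity images becomes the sub-micron dynamic topography movies the abstract mentions; the near-infrared source and null optics handle the on-eye complications.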

  8. Funtools: Fits Users Need Tools for Quick, Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Mandel, Eric; Brederkamp, Joe (Technical Monitor)

    2001-01-01

    The Funtools project arose out of conversations with astronomers about the decline in their software development efforts over the past decade. A stated reason for this decline is that it takes too much effort to master one of the existing FITS libraries simply in order to write a few analysis programs. This problem is exacerbated by the fact that astronomers typically develop new programs only occasionally, and the long interval between coding efforts often necessitates re-learning the FITS interfaces. We therefore set ourselves the goal of developing a minimal buy-in FITS library for researchers who are occasional (but serious) coders. In this case, "minimal buy-in" meant "easy to learn, easy to use, and easy to re-learn next month". Based on conversations with astronomers interested in writing code, we concluded that this goal could be achieved by emphasizing two essential capabilities. The first was the ability to write FITS programs without knowing much about FITS, i.e., without having to deal with the arcane rules for generating a properly formatted FITS file. The second was to support the use of already-familiar C/Unix facilities, especially C structs and Unix stdio. Taken together, these two capabilities would allow researchers to leverage their existing programming expertise while minimizing the need to learn new and complex coding rules.
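The "minimal buy-in" idea can be illustrated with the FITS format itself: a FITS file begins with 2880-byte header blocks made of 80-character keyword cards. A toy parser (far less than Funtools provides, and written here in Python rather than the project's C) shows how little machinery is needed to read one:

```python
def parse_fits_header(header_bytes):
    """Parse FITS 80-character header cards ('KEYWORD = value / comment')
    into a dict, stopping at the END card. Values are kept as raw strings."""
    cards = {}
    for i in range(0, len(header_bytes), 80):
        card = header_bytes[i:i + 80].decode("ascii")
        keyword = card[:8].strip()
        if keyword == "END":
            break
        if card[8:10] == "= ":                       # value indicator
            value = card[10:].split("/", 1)[0].strip()  # drop the comment
            cards[keyword] = value
    return cards

# A minimal synthetic header (cards padded to 80 characters)
raw = b"".join(c.ljust(80).encode("ascii") for c in [
    "SIMPLE  =                    T / conforms to FITS",
    "BITPIX  =                   16",
    "NAXIS   =                    2",
    "END",
])
print(parse_fits_header(raw))
```

Hiding exactly this kind of formatting rule behind familiar structs and stdio-style I/O is what the Funtools project means by letting occasional coders skip the arcana.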

  9. Viewpoints: A High-Performance High-Dimensional Exploratory Data Analysis Tool

    NASA Astrophysics Data System (ADS)

    Gazis, P. R.; Levit, C.; Way, M. J.

    2010-12-01

    Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets, but as the price of commodity graphics hardware has dropped and its capability has increased, it is now possible, in principle, to view large complex data sets on a single workstation. To do this in practice, an investigator will need software that is written to take advantage of the relevant graphics hardware. The Viewpoints visualization package described herein is an example of such software. Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It leverages the capabilities of modern graphics boards (GPUs) to run on a single workstation or laptop. Viewpoints is minimalist: it attempts to do a small set of useful things very well (or at least very quickly) in comparison with similar packages today. Its basic feature set includes linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal. Viewpoints was originally designed for astrophysicists, but it has since been used in a variety of fields that range from astronomy, quantum chemistry, fluid dynamics, machine learning, bioinformatics, and finance to information technology server log mining. In this article, we describe the Viewpoints package and show examples of its usage.
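The linked scatter plots with brushing that the abstract lists reduce, computationally, to a boolean selection mask shared across all views; a sketch with invented data (the GPU work in Viewpoints accelerates the drawing, not this logic):

```python
import numpy as np

def brush(x, y, x_range, y_range):
    """Return the boolean selection mask for points inside the brushed
    rectangle; applying the same mask to other variables 'links' the plots."""
    return (x >= x_range[0]) & (x <= x_range[1]) & \
           (y >= y_range[0]) & (y <= y_range[1])

# Five points with three variables; brush a rectangle in the (a, b) plot,
# then read off the selected values of c, i.e. the linked view.
a = np.array([0.1, 0.4, 0.5, 0.9, 0.3])
b = np.array([0.2, 0.6, 0.5, 0.1, 0.9])
c = np.array([10., 20., 30., 40., 50.])
mask = brush(a, b, (0.25, 0.75), (0.4, 0.8))
print(c[mask])
```

Because the mask is just an array, the same selection drives histograms, outlier removal, and every other linked view at interactive rates.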

  10. Model-based diagnostics for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.

    1991-01-01

An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post-analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid, and a complement to state estimation.
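Constraint suspension, the model-based technique mentioned, suspends each component's behavioral model in turn and keeps as failure candidates those whose suspension leaves the remaining constraints consistent with the observations. A generic sketch on a two-adder chain (not MARPLE itself; component names and values are invented):

```python
def diagnose(components, inputs, observed):
    """Constraint suspension: suspend one component at a time; if the rest
    of the model reproduces the observations, the suspended component is a
    failure candidate. components maps name -> (output_key, predict_fn) and
    must be listed in evaluation (dependency) order."""
    candidates = []
    for suspect in components:
        values = dict(inputs)
        consistent = True
        for name, (out_key, predict) in components.items():
            if name == suspect:
                # Suspended: accept the observed output instead of predicting
                values[out_key] = observed[out_key]
            else:
                values[out_key] = predict(values)
                if out_key in observed and values[out_key] != observed[out_key]:
                    consistent = False
                    break
        if consistent:
            candidates.append(suspect)
    return candidates

# Two adders in series: x = a + b, then y = x + c
components = {
    "ADD1": ("x", lambda v: v["a"] + v["b"]),
    "ADD2": ("y", lambda v: v["x"] + v["c"]),
}
inputs = {"a": 1, "b": 2, "c": 3}
observed = {"x": 3, "y": 7}  # y should be 6, so some component is faulty
print(diagnose(components, inputs, observed))
```

Note that no failure symptoms are encoded anywhere, which is the property the abstract highlights: the diagnosis falls out of the correct-behavior models alone.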

  11. Rasch Model Analysis Gives New Insights Into the Structural Validity of the QuickDASH in Patients With Musculoskeletal Shoulder Pain.

    PubMed

    Jerosch-Herold, Christina; Chester, Rachel; Shepstone, Lee

    2017-09-01

Study Design Cross-sectional secondary analysis of a prospective cohort study. Background The shortened version of the Disabilities of the Arm, Shoulder and Hand questionnaire (QuickDASH) is a widely used outcome measure that has been extensively evaluated using classical test theory. Rasch model analysis can identify strengths and weaknesses of rating scales and goes beyond classical test theory approaches. It uses a mathematical model to test the fit between the observed data and expected responses and converts ordinal-level scores into interval-level measurement. Objective To test the structural validity of the QuickDASH using Rasch analysis. Methods A prospective cohort study of 1030 patients with shoulder pain provided baseline data. Rasch analysis was conducted to (1) assess how the QuickDASH fits the Rasch model, (2) identify sources of misfit, and (3) explore potential solutions to these. Results There was evidence of multidimensionality and significant misfit to the Rasch model (χ² = 331.09, P<.001). Two items had disordered threshold responses with strong floor effects. Response bias was detected in most items for age and sex. Rescoring resulted in ordered thresholds; however, the 11-item scale still did not meet the expectations of the Rasch model. Conclusion Rasch model analysis on the QuickDASH has identified a number of problems that cannot be easily detected using traditional analyses. While revisions to the QuickDASH resulted in better fit, a "shoulder-specific" version is not advocated at present. Caution needs to be exercised when interpreting results of the QuickDASH outcome measure, as it does not meet the criteria for interval-level measurement and shows significant response bias by age and sex. J Orthop Sports Phys Ther 2017;47(9):664-672. Epub 13 Jul 2017. doi:10.2519/jospt.2017.7288.
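The dichotomous Rasch model underlying such analyses (the QuickDASH's rating scale requires a polytomous extension) gives the probability of endorsing an item as a logistic function of person ability minus item difficulty; fit testing compares observed responses with these model expectations.

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model: P(endorse) = exp(theta - b) / (1 + exp(theta - b)),
    where theta is person ability and b is item difficulty (both in logits)."""
    z = ability - difficulty
    return math.exp(z) / (1.0 + math.exp(z))

# When ability equals difficulty the probability is exactly 0.5; a person
# one logit above the item's difficulty endorses it about 73% of the time.
print(rasch_probability(0.0, 0.0), round(rasch_probability(1.0, 0.0), 3))
```

Because the model places persons and items on one shared logit scale, data that fit it support interval-level measurement, which is exactly the property the study found the QuickDASH to lack.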

  12. [Assessment of pragmatics from verbal spoken data].

    PubMed

    Gallardo-Paúls, B

    2009-02-27

Pragmatic assessment is usually complex, long, and sophisticated, especially for professionals who lack specific linguistic education and interact with impaired speakers. Our aim was to design a quick assessment method that provides a general evaluation of the pragmatic effectiveness of neurologically affected speakers. This first filter will allow us to decide whether a detailed analysis of the altered categories should follow. Our starting point was the PerLA (perception, language and aphasia) profile of pragmatic assessment designed for the comprehensive analysis of conversational data in clinical linguistics; this was then converted into a quick questionnaire. A quick protocol of pragmatic assessment is proposed, and the results found in a group of children with attention deficit hyperactivity disorder are discussed.

  13. Characterization of Disulfide-Linked Peptides Using Tandem Mass Spectrometry Coupled with Automated Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Liang, Zhidan; McGuinness, Kenneth N.; Crespo, Alejandro; Zhong, Wendy

    2018-05-01

Disulfide bond formation is critical for maintaining structure stability and function of many peptides and proteins. Mass spectrometry has become an important tool for the elucidation of molecular connectivity. However, the interpretation of the tandem mass spectral data of disulfide-linked peptides has been a major challenge due to the lack of appropriate tools. Developing proper data analysis software is essential to quickly characterize disulfide-linked peptides. A thorough and in-depth understanding of how disulfide-linked peptides fragment in a mass spectrometer is key to developing software to interpret the tandem mass spectra of these peptides. Two model peptides with inter- and intra-chain disulfide linkages were used to study fragmentation behavior in both collisional-activated dissociation (CAD) and electron-based dissociation (ExD) experiments. Fragments generated from CAD and ExD can be categorized into three major types, which result from different S-S and C-S bond cleavage patterns. DiSulFinder is a computer algorithm that was newly developed based on the fragmentation observed in these peptides. The software is vendor neutral and capable of quickly and accurately identifying a variety of fragments generated from disulfide-linked peptides. DiSulFinder identifies peptide backbone fragments with S-S and C-S bond cleavages and, more importantly, can also identify fragments with the S-S bond still intact to aid disulfide linkage determination. With the assistance of this software, more comprehensive disulfide connectivity characterization can be achieved.
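For a linear peptide without the disulfide chemistry this paper addresses, the standard b/y backbone-fragment arithmetic that software like DiSulFinder builds on can be sketched as follows. The example peptide is invented; the check exploits the fact that complementary singly charged b and y ions sum to the protonated peptide mass plus one proton.

```python
# Monoisotopic residue masses (Da) for a few amino acids
RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "V": 99.06841}
PROTON, WATER = 1.00728, 18.01056

def b_ion(seq, i):
    """Singly charged b_i ion: N-terminal fragment of the first i residues."""
    return sum(RESIDUE[r] for r in seq[:i]) + PROTON

def y_ion(seq, i):
    """Singly charged y_i ion: C-terminal fragment of the last i residues."""
    return sum(RESIDUE[r] for r in seq[-i:]) + WATER + PROTON

peptide = "GASV"
mh = sum(RESIDUE[r] for r in peptide) + WATER + PROTON  # [M+H]+
# Complementarity check: b_i + y_(n-i) = [M+H]+ + one proton mass
for i in range(1, len(peptide)):
    assert abs(b_ion(peptide, i) + y_ion(peptide, len(peptide) - i)
               - (mh + PROTON)) < 1e-6
print(round(mh, 3))
```

Disulfide-linked peptides break this simple bookkeeping: S-S and C-S cleavages shift fragment masses and can leave two chains tethered, which is why dedicated software is needed to enumerate and match the candidate fragments.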

  14. Characterization of Disulfide-Linked Peptides Using Tandem Mass Spectrometry Coupled with Automated Data Analysis Software

    NASA Astrophysics Data System (ADS)

    Liang, Zhidan; McGuinness, Kenneth N.; Crespo, Alejandro; Zhong, Wendy

    2018-01-01

Disulfide bond formation is critical for maintaining structure stability and function of many peptides and proteins. Mass spectrometry has become an important tool for the elucidation of molecular connectivity. However, the interpretation of the tandem mass spectral data of disulfide-linked peptides has been a major challenge due to the lack of appropriate tools. Developing proper data analysis software is essential to quickly characterize disulfide-linked peptides. A thorough and in-depth understanding of how disulfide-linked peptides fragment in a mass spectrometer is key to developing software to interpret the tandem mass spectra of these peptides. Two model peptides with inter- and intra-chain disulfide linkages were used to study fragmentation behavior in both collisional-activated dissociation (CAD) and electron-based dissociation (ExD) experiments. Fragments generated from CAD and ExD can be categorized into three major types, which result from different S-S and C-S bond cleavage patterns. DiSulFinder is a computer algorithm that was newly developed based on the fragmentation observed in these peptides. The software is vendor neutral and capable of quickly and accurately identifying a variety of fragments generated from disulfide-linked peptides. DiSulFinder identifies peptide backbone fragments with S-S and C-S bond cleavages and, more importantly, can also identify fragments with the S-S bond still intact to aid disulfide linkage determination. With the assistance of this software, more comprehensive disulfide connectivity characterization can be achieved.

  15. Development of WMS Capabilities to Support NASA Disasters Applications and App Development

    NASA Astrophysics Data System (ADS)

    Bell, J. R.; Burks, J. E.; Molthan, A.; McGrath, K. M.

    2013-12-01

    During the last year several significant disasters occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data using open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed the end user to easily integrate the data into existing Decision Support Systems (DSS). Both Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.
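    The TMS/WMS delivery described above follows the OGC WMS standard; a minimal sketch of building a WMS 1.3.0 GetMap request URL is shown below. The endpoint and layer name are hypothetical, for illustration only.

```python
from urllib.parse import urlencode

def build_getmap_url(base_url, layer, bbox, width, height,
                     crs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    `bbox` is (min_x, min_y, max_x, max_y) in the given CRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = build_getmap_url("https://example.org/wms", "flood_extent",
                       (-80.0, 25.0, -79.0, 26.0), 512, 512)
```

Any WMS-capable Decision Support System issues requests of this shape; serving satellite-derived products through such an interface is what lets end users ingest them without custom tooling.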

  16. Development of WMS Capabilities to Support NASA Disasters Applications and App Development

    NASA Technical Reports Server (NTRS)

    Bell, Jordan R.; Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.

    2013-01-01

    During the last year several significant disasters occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data using open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed the end user to easily integrate the data into existing Decision Support Systems (DSS). Both Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.

  17. Development of Web Mapping Service Capabilities to Support NASA Disasters Applications/App Development

    NASA Technical Reports Server (NTRS)

    Burks, Jason E.; Molthan, Andrew L.; McGrath, Kevin M.

    2014-01-01

    During the last year several significant disasters occurred, such as Superstorm Sandy on the East Coast of the United States and Typhoon Bopha in the Philippines, along with several others. In support of these disasters, NASA's Short-term Prediction Research and Transition (SPoRT) Center delivered various products derived from satellite imagery to help in the assessment of damage and recovery of the affected areas. To better support the decision makers responding to the disasters, SPoRT quickly developed several solutions to provide the data using open Geographic Information System (GIS) formats. Providing the data in open GIS standard formats allowed the end user to easily integrate the data into existing Decision Support Systems (DSS). Both Tile Mapping Service (TMS) and Web Mapping Service (WMS) were leveraged to quickly provide the data to the end user. Development of the delivery methodology allowed quick response to rapidly developing disasters and enabled NASA SPoRT to bring science data to decision makers in a successful research-to-operations transition.

  19. Summary of photovoltaic system performance models

    NASA Technical Reports Server (NTRS)

    Smith, J. H.; Reiter, L. J.

    1984-01-01

    A detailed overview of photovoltaic (PV) performance modeling capabilities developed for analyzing PV system and component design and policy issues is provided. A set of 10 performance models is selected that spans a representative range of capabilities, from generalized first-order calculations to highly specialized electrical network simulations. A set of performance modeling topics and characteristics is defined and used to examine some of the major issues associated with photovoltaic performance modeling. Each of the models is described in the context of these topics and characteristics to assess its purpose, approach, and level of detail. The issues are discussed in terms of the range of model capabilities available and summarized in tabular form for quick reference. The models are grouped into categories to illustrate their purposes and perspectives.

  20. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
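    The core surface-coverage measurement in such an image-analysis pipeline can be sketched as a simple intensity threshold over a stained photograph; the toy 4x4 "image" and threshold value below are illustrative assumptions, not the authors' algorithm.

```python
def coverage_fraction(image, threshold):
    """Fraction of pixels whose stain intensity exceeds `threshold`.

    `image` is a 2-D list of grayscale intensities; here brighter
    pixels are taken to indicate stained (fouled) biomass.
    """
    pixels = [p for row in image for p in row]
    return sum(p > threshold for p in pixels) / len(pixels)

# Synthetic 4x4 "photograph": 4 of the 16 pixels exceed the threshold.
img = [[10, 10, 200, 10],
       [10, 220, 10, 10],
       [10, 10, 10, 250],
       [180, 10, 10, 10]]
frac = coverage_fraction(img, threshold=128)
```

The measured fraction can then be calibrated against independent cell-density counts, as the record describes.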

  1. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; ...

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  2. Non-destructive, high-content analysis of wheat grain traits using X-ray micro computed tomography.

    PubMed

    Hughes, Nathan; Askew, Karen; Scotson, Callum P; Williams, Kevin; Sauze, Colin; Corke, Fiona; Doonan, John H; Nibau, Candida

    2017-01-01

    Wheat is one of the most widely grown crops in temperate climates for food and animal feed. In order to meet the demands of the predicted population increase in an ever-changing climate, wheat production needs to increase dramatically. Spike and grain traits are critical determinants of final yield, and grain uniformity is a commercially desired trait, but their analysis is laborious and often requires destructive harvest. One of the current challenges is to develop an accurate, non-destructive method for spike and grain trait analysis capable of handling large populations. In this study we describe the development of a robust method for the accurate extraction and measurement of spike and grain morphometric parameters from images acquired by X-ray micro-computed tomography (μCT). The image analysis pipeline developed automatically identifies plant material of interest in μCT images, performs image analysis, and extracts morphometric data. As a proof of principle, this integrated methodology was used to analyse the spikes from a population of wheat plants subjected to high temperatures under two different water regimes. Temperature has a negative effect on spike height and grain number, with the middle of the spike being the most affected region. The data also confirmed that increased grain volume was correlated with the decrease in grain number under mild stress. Being able to quickly measure plant phenotypes in a non-destructive manner is crucial to advance our understanding of gene function and the effects of the environment. We report the development of an image analysis pipeline capable of accurately and reliably extracting spike and grain traits from crops without the loss of positional information. This methodology was applied to the analysis of wheat spikes and can be readily applied to other economically important crop species.

  3. Development and Evaluation of a Virtual Campus on Second Life: The Case of SecondDMI

    ERIC Educational Resources Information Center

    De Lucia, Andrea; Francese, Rita; Passero, Ignazio; Tortora, Genoveffa

    2009-01-01

    Video games and new communication metaphors are quickly changing young people's habits. Considering current e-learning scenarios, embedded in a fully technology-enabled environment, it is crucial to take advantage of these kinds of capabilities so that the learning process achieves the best results. This paper presents a virtual campus created using…

  4. A hybrid symbolic/finite-element algorithm for solving nonlinear optimal control problems

    NASA Technical Reports Server (NTRS)

    Bless, Robert R.; Hodges, Dewey H.

    1991-01-01

    The general code described is capable of solving difficult nonlinear optimal control problems by using finite elements and a symbolic manipulator. Quick and accurate solutions are obtained with a minimum of user interaction. Since no user programming is required for most problems, there are tremendous savings to be gained in terms of time and money.

  5. Lightweight Valve Closes Duct Quickly

    NASA Technical Reports Server (NTRS)

    Fournier, Walter L.; Burgy, N. Frank

    1991-01-01

    Expanding balloon serves as lightweight emergency valve to close wide duct. Uninflated balloon stored in housing of duct. Pad resting on burst diaphragm protects balloon from hot gases in duct. Once control system triggers valve, balloon inflates rapidly to block duct. Weighs much less than does conventional butterfly, hot-gas, or poppet valve capable of closing duct of equal diameter.

  6. PARENT Quick Blind Round-Robin Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.

    The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques to find flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that includes large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments, one called Quick-Blind and the other called Blind. The Quick-Blind testing and destructive analysis of the test blocks has been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.

  7. Model-Based Diagnostics for Propellant Loading Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew John; Foygel, Michael; Smelyanskiy, Vadim N.

    2011-01-01

    The loading of spacecraft propellants is a complex, risky operation. Therefore, diagnostic solutions are necessary to quickly identify when a fault occurs, so that recovery actions can be taken or an abort procedure can be initiated. Model-based diagnosis solutions, established using an in-depth analysis and understanding of the underlying physical processes, offer the advanced capability to quickly detect and isolate faults, identify their severity, and predict their effects on system performance. We develop a physics-based model of a cryogenic propellant loading system, which describes the complex dynamics of liquid hydrogen filling from a storage tank to an external vehicle tank, as well as the influence of different faults on this process. The model takes into account the main physical processes such as highly nonequilibrium condensation and evaporation of the hydrogen vapor, pressurization, and also the dynamics of liquid hydrogen and vapor flows inside the system in the presence of helium gas. Since the model incorporates multiple faults in the system, it provides a suitable framework for model-based diagnostics and prognostics algorithms. Using this model, we analyze the effects of faults on the system, derive symbolic fault signatures for the purposes of fault isolation, and perform fault identification using a particle filter approach. We demonstrate the detection, isolation, and identification of a number of faults using simulation-based experiments.
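    A bootstrap particle filter of the kind used here for fault identification can be sketched on a toy problem: estimating an unknown constant fault magnitude from noisy scalar observations. The scalar model, noise levels, and particle count below are illustrative assumptions, not the propellant-loading model itself.

```python
import math, random

random.seed(0)

def particle_filter(observations, n_particles=500, obs_std=0.1, jitter=0.02):
    """Bootstrap particle filter estimating an unknown constant parameter
    (e.g. a fault magnitude) from noisy scalar observations."""
    particles = [random.uniform(0.0, 5.0) for _ in range(n_particles)]
    for y in observations:
        # Importance weights from the Gaussian observation likelihood.
        w = [math.exp(-0.5 * ((y - p) / obs_std) ** 2) for p in particles]
        total = sum(w)
        w = [wi / total for wi in w]
        # Multinomial resampling, then a little jitter to keep diversity.
        particles = random.choices(particles, weights=w, k=n_particles)
        particles = [p + random.gauss(0.0, jitter) for p in particles]
    return sum(particles) / n_particles

true_theta = 2.0  # illustrative "fault magnitude"
obs = [true_theta + random.gauss(0.0, 0.1) for _ in range(20)]
estimate = particle_filter(obs)
```

Each particle is a candidate fault magnitude; weighting by the observation likelihood and resampling concentrates the particle cloud around values consistent with the data.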

  8. MSFR TRU-burning potential and comparison with an SFR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fiorina, C.; Cammi, A.; Franceschini, F.

    2013-07-01

    The objective of this work is to evaluate the Molten Salt Fast Reactor (MSFR) potential benefits in terms of transuranics (TRU) burning through a comparative analysis with a sodium-cooled FR. The comparison is based on TRU- and MA-burning rates, as well as on the in-core evolution of radiotoxicity and decay heat. Solubility issues limit the TRU-burning rate to 1/3 of that achievable in traditional low-CR FRs (low-Conversion-Ratio Fast Reactors). The softer spectrum also determines notable radiotoxicity and decay heat of the equilibrium actinide inventory. On the other hand, the liquid fuel suggests the possibility of using a Pu-free feed composed only of Th and MA (Minor Actinides), thus maximizing the MA burning rate. This is generally not possible in traditional low-CR FRs due to safety deterioration and decay heat of reprocessed fuel. In addition, the high specific power and the lack of out-of-core cooling times foster a quick transition toward equilibrium, which improves the MSFR capability to burn an initial fissile loading and makes the MSFR a promising system for a quick (i.e., in a reactor lifetime) transition from the current U-based fuel cycle to a novel closed Th cycle. (authors)

  9. Synthesis of research on work zone delays and simplified application of QuickZone analysis tool.

    DOT National Transportation Integrated Search

    2010-03-01

    The objectives of this project were to synthesize the latest information on work zone safety and management and identify case studies in which FHWA's decision support tool QuickZone or other appropriate analysis tools could be applied. The results ...

  10. NASA's Space Launch System (SLS) Program: Mars Program Utilization

    NASA Technical Reports Server (NTRS)

    May, Todd A.; Creech, Stephen D.

    2012-01-01

    NASA's Space Launch System is being designed for safe, affordable, and sustainable human and scientific exploration missions beyond Earth's orbit (BEO), as directed by the NASA Authorization Act of 2010 and NASA's 2011 Strategic Plan. This paper describes how the SLS can dramatically change the Mars program's science and human exploration capabilities and objectives. Specifically, through its high-velocity change (delta V) and payload capabilities, SLS enables Mars science missions of unprecedented size and scope. By providing direct trajectories to Mars, SLS eliminates the need for complicated gravity-assist missions around other bodies in the solar system, reducing mission time, complexity, and cost. SLS's large payload capacity also allows for larger, more capable spacecraft or landers with more instruments, which can eliminate the need for complex packaging or "folding" mechanisms. By offering this capability, SLS can enable more science to be done more quickly than would be possible through other delivery mechanisms using longer mission times.

  11. The Renovation and Future Capabilities of the Thacher Observatory

    NASA Astrophysics Data System (ADS)

    O'Neill, Katie; Osuna, Natalie; Edwards, Nick; Klink, Douglas; Swift, Jonathan; Vyhnal, Chris; Meyer, Kurt

    2016-01-01

    The Thacher School is in the process of renovating the campus observatory with a new meter class telescope and full automation capabilities for the purpose of scientific research and education. New equipment on site has provided a preliminary site characterization including seeing and V-band sky brightness measurements. These data, along with commissioning data from the MINERVA project (which uses comparable hardware) are used to estimate the capabilities of the observatory once renovation is complete. Our V-band limiting magnitude is expected to be better than 21.3 for a one minute integration time, and we estimate that milli-magnitude precision photometry will be possible for a V=14.5 point source over approximately 5 min timescales. The quick response, autonomous operation, and multi-band photometric capabilities of the renovated observatory will make it a powerful follow-up science facility for exoplanets, eclipsing binaries, near-Earth objects, stellar variability, and supernovae.

  12. Implementation of the high-order schemes QUICK and LECUSSO in the COMMIX-1C Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakai, K.; Sun, J.G.; Sha, W.T.

    Multidimensional analysis computer programs based on the finite volume method, such as COMMIX-1C, have been commonly used to simulate thermal-hydraulic phenomena in engineering systems such as nuclear reactors. In COMMIX-1C, first-order schemes with respect to both space and time are used. In many situations, however, such as flow recirculations and stratifications with steep gradients of the velocity and temperature fields, high-order difference schemes are necessary for an accurate prediction of the fields. For these reasons, two second-order finite difference numerical schemes, QUICK (Quadratic Upstream Interpolation for Convective Kinematics) and LECUSSO (Local Exact Consistent Upwind Scheme of Second Order), have been implemented in the COMMIX-1C computer code. The formulations were derived for general three-dimensional flows with nonuniform grid sizes. Numerical oscillation analyses for QUICK and LECUSSO were performed. To damp the unphysical oscillations which occur in calculations with high-order schemes at high mesh Reynolds numbers, a new FRAM (Filtering Remedy and Methodology) scheme was developed and implemented. To be consistent with the high-order schemes, the pressure equation and the boundary conditions for all the conservation equations were also modified to be of second order. The new capabilities in the code are listed. Test calculations were performed to validate the implementation of the high-order schemes. They include tests of the one-dimensional nonlinear Burgers equation, two-dimensional scalar transport in two impinging streams, von Kármán vortex shedding, shear-driven cavity flow, Couette flow, and circular pipe flow. The calculated results were compared with available data; the agreement is good.
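    For reference, the QUICK face interpolation fits a quadratic through two upstream nodes (UU, U) and one downstream node (D); the sketch below assumes a uniform grid and flow from U toward D.

```python
def quick_face(phi_uu, phi_u, phi_d):
    """QUICK interpolation of the convected value at a cell face on a
    uniform grid (flow from node U toward node D):

        phi_face = 6/8 * phi_U + 3/8 * phi_D - 1/8 * phi_UU
    """
    return 0.75 * phi_u + 0.375 * phi_d - 0.125 * phi_uu

# The quadratic interpolant reproduces a quadratic profile exactly:
# phi(x) = x**2 at nodes x = 0, 1, 2 gives the face value at x = 1.5.
face = quick_face(0.0 ** 2, 1.0 ** 2, 2.0 ** 2)
```

Exactness on quadratics is the source of the scheme's second-order spatial accuracy, and also of the unbounded oscillations at high mesh Reynolds numbers that FRAM-type filters are designed to damp.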

  13. Methodology for conceptual remote sensing spacecraft technology: insertion analysis balancing performance, cost, and risk

    NASA Astrophysics Data System (ADS)

    Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.

    1997-12-01

    Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance and design impacts of advanced technology insertion for both state-of-the-art systems as well as systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost- effective manner. 
This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.

  14. Low Emissions RQL Flametube Combustor Component Test Results

    NASA Technical Reports Server (NTRS)

    Holdeman, James D.; Chang, Clarence T.

    2001-01-01

    This report describes and summarizes elements of the High Speed Research (HSR) Low Emissions Rich burn/Quick mix/Lean burn (RQL) flame tube combustor test program, performed at NASA Glenn Research Center circa 1992. The overall objective of this test program was to demonstrate and evaluate the capability of the RQL combustor concept for High Speed Civil Transport (HSCT) applications, with the goal of achieving NOx emission index levels of 5 g/kg-fuel at representative HSCT supersonic cruise conditions. The specific objectives of the tests reported herein were to investigate component performance of the RQL combustor concept for use in the evolution of ultra-low NOx combustor design tools. Test results indicated that RQL combustor emissions and performance at simulated supersonic cruise conditions were predominantly sensitive to quick mixer subcomponent performance and not to fuel injector performance. Test results also indicated that the mixing section configuration employing a single row of circular holes was the lowest-NOx mixer tested, probably due to the initial fast mixing characteristics of this mixing section. However, other quick-mix orifice configurations, such as the slanted slot mixer, produced substantially lower levels of carbon monoxide emissions, most likely due to the enhanced circumferential dispersion of the air addition. Test results also suggested that an optimum momentum-flux ratio exists for a given quick-mix configuration; test conditions with momentum-flux ratios below or above the optimum value produce undesirable jet under- or over-penetration. Tests conducted to assess the effect of quick-mix flow area indicated that reducing the quick-mix flow area produced lower NOx emissions at reduced residence time, but had no effect on NOx emissions measured at similar residence time for the configurations tested.
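    The momentum-flux ratio referred to above is the standard jet-in-crossflow penetration parameter J; a minimal sketch, with illustrative values rather than test data:

```python
def momentum_flux_ratio(rho_jet, v_jet, rho_main, v_main):
    """Jet-to-mainstream momentum-flux ratio
    J = (rho_j * v_j**2) / (rho_m * v_m**2),
    the parameter governing jet penetration in a quick-mix section."""
    return (rho_jet * v_jet ** 2) / (rho_main * v_main ** 2)

# Illustrative values only: a dense, fast jet entering a slower mainstream.
J = momentum_flux_ratio(rho_jet=2.0, v_jet=100.0, rho_main=1.0, v_main=50.0)
```

Below the optimum J the jets under-penetrate and hug the wall; above it they over-penetrate and impinge on the opposite wall, which is why an optimum exists for each orifice configuration.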

  15. The quick wins paradox.

    PubMed

    Van Buren, Mark E; Safferstone, Todd

    2009-01-01

    Many leaders taking on new roles try to prove themselves early on by going after quick wins--fresh, visible contributions to the business. But in the pursuit of early results, those leaders often fall into traps that prevent them from benefiting from their achievements. To succeed in their new positions, leaders must realize that the teams they have inherited are also experiencing change. Instead of focusing on an individual accomplishment, leaders need to work with team members on a collective quick win. In a study of more than 5,400 new leaders, the authors found that those who were struggling tended to exhibit five behaviors characteristic of people overly intent on securing a quick win. They focused too much on details, reacted negatively to criticism, intimidated others, jumped to conclusions, and micromanaged their direct reports. Some managed to eke out a win anyway, but the fallout was often toxic. The leaders who were thriving in their new roles, by contrast, shared not only a strong focus on results--necessary for early successes--but also excellent change-management skills. They communicated a clear vision, developed constructive relationships, and built team capabilities. They seemed to realize that the lasting value of their accomplishment would be the way they managed their teams through the transition. Collective quick wins established credibility and prepared them to lead their teams to harder-won victories. The authors provide a diagnostic tool for identifying opportunities for collective quick wins, and they share some advice for organizations: When grooming new leaders, don't just shore up their domain knowledge and technical skills; help them develop the change-management skills they will need as they settle in with their new teams.

  16. Real-time face and gesture analysis for human-robot interaction

    NASA Astrophysics Data System (ADS)

    Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd

    2010-05-01

    Human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, or gestures. Facial expressions and gestures are among the main nonverbal communication mechanisms and pass large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, real-time capable processing and recognition of facial expressions and hand and head gestures are of great importance. We present a system that tackles these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Applying this model, different kinds of information are extracted from the image data and afterwards handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head- and facial-related features, low-level image features regarding the human hand (optical flow, Hu moments) are also stored in the RTDB for the evaluation of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed via different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. For the facial expressions, on the other hand, classical decision trees or more sophisticated support vector machines are used for the classification process. The results of the classification processes are again handed over to the RTDB, where other processes (like a Dialog Management Unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.
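    HMM-based gesture classification of the kind described can be sketched with the scaled forward algorithm: each gesture class gets its own model, and the class whose model assigns the observation sequence the highest log-likelihood wins. The tiny one-state discrete models below are illustrative, not the system's actual gesture models.

```python
import math

def forward_log_likelihood(obs, init, trans, emit):
    """Log-likelihood log P(obs | model) of a discrete observation
    sequence under an HMM, using the scaled forward algorithm."""
    n = len(init)
    # Initialization with the first symbol.
    alpha = [init[s] * emit[s][obs[0]] for s in range(n)]
    c = sum(alpha)
    log_l = math.log(c)
    alpha = [a / c for a in alpha]
    # Induction over the remaining symbols, rescaling to avoid underflow.
    for o in obs[1:]:
        alpha = [emit[s][o] * sum(alpha[sp] * trans[sp][s] for sp in range(n))
                 for s in range(n)]
        c = sum(alpha)
        log_l += math.log(c)
        alpha = [a / c for a in alpha]
    return log_l

# Two illustrative "gesture" models over symbols {0, 1}: one favors 0s,
# the other favors 1s; classification picks the higher log-likelihood.
wave = {"init": [1.0], "trans": [[1.0]], "emit": [[0.8, 0.2]]}
nod = {"init": [1.0], "trans": [[1.0]], "emit": [[0.2, 0.8]]}
seq = [0, 0, 1, 0]
scores = {name: forward_log_likelihood(seq, **m)
          for name, m in (("wave", wave), ("nod", nod))}
best = max(scores, key=scores.get)
```

The per-step rescaling is what keeps this real-time friendly for long feature streams: likelihoods shrink exponentially with sequence length, and the running log of the scale factors recovers the total.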

  17. Biomimetic Self-Healing

    DTIC Science & Technology

    2015-07-21

    typically degrade quickly and are not capable of forming new bonds. In the 1930s it was already found that vulcanized rubber could self-heal in the...To overcome this limitation, Diesendruck et al. demonstrated Scheme 1. Mechanochemical scission and self-healing in vulcanized rubber. Long-lived...effective autonomic self-healing for soft materials. Cordier et al. prepared supramolecular rubbers based on hydrogen bonding between urea-functionalized

  18. CHIPS. Volume 29, Issue 2, April - June 2011

    DTIC Science & Technology

    2011-06-01

    CHIPS www.chips.navy.mil Dedicated to Sharing Information - Technology - Experience In an orchestra, each musician produces exquisite music ... Development Command, talks about the capabilities of the Navy Center for Advanced Modeling and Simulation, its value to naval, joint and coalition...Strategic Communications The Seawater Antenna By Holly Quick Developing a New Model for Maritime Tactical Information Dominance By Capt. Danelle

  19. The Effect of High School Junior Reserve Officers' Training Corps (JROTC) on Civic Knowledge, Skills, and Attitudes of Hispanic Cadets

    ERIC Educational Resources Information Center

    Loving, Kirk Anthony

    2017-01-01

    As students continue to experience low test scores on national civics assessments, it is important to identify curriculum which can increase their civic capabilities. This is especially true for the quickly growing Hispanic population, which suffers a civic achievement gap. The purpose of this quantitative quasi-experimental nonequivalent…

  20. Next Generation MODTRAN for Improved Atmospheric Correction of Spectral Imagery

    DTIC Science & Technology

    2016-01-29

    DoD operational and research sensor and data processing systems, particularly those involving the removal of atmospheric effects, commonly referred...atmospheric correction process. Given the ever increasing capabilities of spectral sensors to quickly generate enormous quantities of data, combined...many DoD operational and research sensor and data processing systems, particularly those involving the removal of atmospheric effects, commonly

  1. The composite load spectra project

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.; Kurth, R. E.

    1990-01-01

    Probabilistic methods and generic load models capable of simulating the load spectra that are induced in space propulsion system components are being developed. Four engine component types (the transfer ducts, the turbine blades, the liquid oxygen posts and the turbopump oxidizer discharge duct) were selected as representative hardware examples. The composite load spectra that simulate the probabilistic loads for these components are typically used as the input loads for a probabilistic structural analysis. The knowledge-based system approach used for the composite load spectra project provides an ideal environment for incremental development. The intelligent database paradigm employed in developing the expert system provides a smooth coupling between the numerical processing and the symbolic (information) processing. Large volumes of engine load information and engineering data are stored in database format and managed by a database management system. Numerical procedures for probabilistic load simulation and database management functions are controlled by rule modules. Rules were hard-wired as decision trees into rule modules to perform process control tasks. There are modules to retrieve load information and models. There are modules to select loads and models to carry out quick load calculations or make an input file for full duty-cycle time dependent load simulation. The composite load spectra load expert system implemented today is capable of performing intelligent rocket engine load spectra simulation. Further development of the expert system will provide tutorial capability for users to learn from it.

  2. Quick Attach Docking Interface for Lunar Electric Rover

    NASA Technical Reports Server (NTRS)

    Schuler, Jason M.; Nick, Andrew J.; Immer, Christopher; Mueller, Robert P.

    2010-01-01

    The NASA Lunar Electric Rover (LER) has been developed at Johnson Space Center as a next-generation mobility platform. Based upon a twelve-wheel omni-directional chassis with active suspension, the LER introduces a number of novel capabilities for lunar exploration in both manned and unmanned scenarios. Besides being the primary vehicle for astronauts on the lunar surface, LER will perform tasks such as lunar regolith handling (including dozing, grading, and excavation), equipment transport, and science operations. In an effort to support these additional tasks, a team at the Kennedy Space Center has produced a universal attachment interface for LER known as the Quick Attach. The Quick Attach is a compact system that has been retrofitted to the rear of the LER, giving it the ability to dock and undock on the fly with various implements. The Quick Attach utilizes a two-stage docking approach: the first is a mechanical mate which aligns and latches a passive set of hooks on an implement with an actuated cam surface on LER. The mechanical stage is tolerant to misalignment between the implement and the LER during docking, and once the implement is captured a preload is applied to ensure a positive lock. The second stage is an umbilical connection consisting of a dust-resistant enclosure housing a compliant mechanism that is optionally actuated to mate electrical and fluid connections for suitable implements. The Quick Attach system was designed with the largest foreseen input loads considered, including excavation operations and large-mass utility attachments. The Quick Attach system was demonstrated at the Desert Research And Technology Studies (D-RATS) field test in Flagstaff, AZ, along with the lightweight dozer blade LANCE. The LANCE blade is the first implement to utilize the Quick Attach interface and demonstrated the tolerance, speed, and strength of the system in a lunar analog environment.

  3. Optimisation of nasal swab analysis by liquid scintillation counting.

    PubMed

    Dai, Xiongxin; Liblong, Aaron; Kramer-Tremblay, Sheila; Priest, Nicholas; Li, Chunsheng

    2012-06-01

    When responding to an emergency radiological incident, rapid methods are needed to provide the physicians and radiation protection personnel with an early estimation of possible internal dose resulting from the inhalation of radionuclides. This information is needed so that appropriate medical treatment and radiological protection control procedures can be implemented. Nasal swab analysis, which employs swabs swiped inside a nostril followed by liquid scintillation counting of alpha and beta activity on the swab, could provide valuable information to quickly identify contamination of the affected population. In this study, various parameters (such as alpha/beta discrimination, swab materials, counting time, and volume of scintillation cocktail) were evaluated in order to optimise the effectiveness of the nasal swab analysis method. An improved nasal swab procedure was developed by replacing cotton swabs with polyurethane-tipped swabs. Liquid scintillation counting was performed using a Hidex 300SL counter with alpha/beta pulse shape discrimination capability. Results show that the new method is more reliable than existing methods using cotton swabs and effectively meets the analysis requirements for screening personnel in an emergency situation. This swab analysis procedure is also applicable to wipe tests of surface contamination to minimise the source self-absorption effect on liquid scintillation counting.

  4. The IUE Science Operations Ground System

    NASA Technical Reports Server (NTRS)

    Pitts, Ronald E.; Arquilla, Richard

    1994-01-01

    The International Ultraviolet Explorer (IUE) Science Operations System provides full realtime operations capabilities and support to the operations staff and astronomer users. The components of this very diverse and extremely flexible hardware and software system have played a major role in maintaining the scientific efficiency and productivity of the IUE. The software provides the staff and user with all the tools necessary for pre-visit and real-time planning and operations analysis for any day of the year. Examples of such tools include the effects of spacecraft constraints on target availability, maneuver times between targets, availability of guide stars, target identification, coordinate transforms, e-mail transfer of Observatory forms and messages, and quick-look analysis of image data. Most of this extensive software package can also be accessed remotely by individual users for information, scheduling of shifts, pre-visit planning, and actual observing program execution. Astronomers, with a modest investment in hardware and software, may establish remote observing sites. We currently have over 20 such sites in our remote observers' network.

  5. A Model for Dynamic Simulation and Analysis of Tether Momentum Exchange

    NASA Technical Reports Server (NTRS)

    Canfield, Stephen; Johnson, David; Sorensen, Kirk; Welzyn, Ken; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    Momentum-exchange/electrodynamic reboost (MXER) tether systems may enable high-energy missions to the Moon, Mars, and beyond by serving as an 'upper stage in space'. Existing rockets that use an MXER tether station could double their capability to launch communications satellites and help improve US competitiveness. A MXER tether station would boost spacecraft from low Earth orbit to a high-energy orbit quickly, like a high-thrust rocket. Then, using the same principles that make an electric motor work, it would slowly rebuild its orbital momentum by pushing against the Earth's magnetic field, without using any propellant. One of the significant challenges in developing a momentum-exchange/electrodynamic reboost tether system is the analysis and design of the capture mechanism and its effects on the overall dynamics of the system. This paper presents a model for a momentum-exchange tether system that can simulate and evaluate the performance and requirements of such a system.

  6. Nose-to-tail analysis of an airbreathing hypersonic vehicle using an in-house simplified tool

    NASA Astrophysics Data System (ADS)

    Piscitelli, Filomena; Cutrone, Luigi; Pezzella, Giuseppe; Roncioni, Pietro; Marini, Marco

    2017-07-01

    SPREAD (Scramjet PREliminary Aerothermodynamic Design) is a simplified, in-house method developed by CIRA (Italian Aerospace Research Centre), able to provide a preliminary estimation of the performance of engine/aeroshape for airbreathing configurations. It is especially useful for scramjet engines, for which the strong coupling between the aerothermodynamic (external) and propulsive (internal) flow fields requires real-time screening of several engine/aeroshape configurations and the identification of the most promising one/s with respect to user-defined constraints and requirements. The outcome of this tool defines the base-line configuration for further design analyses with more accurate tools, e.g., CFD simulations and wind tunnel testing. The SPREAD tool has been used to perform the nose-to-tail analysis of the LAPCAT-II Mach 8 MR2.4 vehicle configuration. The numerical results demonstrate SPREAD's capability to quickly predict reliable values of aero-propulsive balance (i.e., net thrust) and aerodynamic efficiency in a pre-design phase.

  7. Detection of enteropathogenic Escherichia coli by microchip capillary electrophoresis.

    PubMed

    Law, Wai S; Li, Sam F Y; Kricka, Larry J

    2009-01-01

    There is always a need to detect the presence of microorganisms, either as contaminants in the food and pharmaceutical industries or as bioindicators for disease diagnosis. Hence, it is important to develop efficient, rapid, and simple methods to detect microorganisms. The traditional culturing method is unsatisfactory due to its long incubation time. Molecular methods, although capable of providing a high degree of specificity, are not always useful for providing quick tests of the presence or absence of microorganisms. Microchip electrophoresis has recently been employed to address problems associated with the detection of microorganisms due to its high versatility, selectivity, sensitivity, and short analysis times. In this work, the potential of PDMS-based microchip electrophoresis in the identification and characterization of microorganisms was evaluated. Enteropathogenic E. coli (EPEC) was selected as the model microorganism. To obtain repeatable separations, sample pretreatment was found to be essential. Microchip electrophoresis with laser-induced fluorescence detection could potentially revolutionize certain aspects of microbiology involving diagnosis, profiling of pathogens, environmental analysis, and many other areas of study.

  8. Design and analysis of the trapeziform and flat acoustic cloaks with controllable invisibility performance in a quasi-space

    NASA Astrophysics Data System (ADS)

    Zhu, Jian; Chen, Tianning; Liang, Qingxuan; Wang, Xiaopeng; Xiong, Jie; Jiang, Ping

    2015-07-01

    We present the design, implementation, and detailed performance analysis for a class of trapeziform and flat acoustic cloaks. An effective large invisible area is obtained compared with the traditional carpet cloak. The cloaks are realized with homogeneous metamaterials which are made of periodic arrangements of subwavelength unit cells composed of steel embedded in air. The microstructures and effective parameters of the cloaks are determined quickly and precisely over a broadband frequency range by using effective medium theory and the proposed parameter-optimization method. The invisibility capability of the cloaks can be controlled by the variation of the key design parameters and scale factor, which are shown to have more influence on performance in the near field than in the far field. Different designs are suitable for different application situations. Good cloaking performance demonstrates that such a device can be physically realized with natural materials, which will greatly promote the real applications of invisibility cloaks.

  9. The X-33 range Operations Control Center

    NASA Technical Reports Server (NTRS)

    Shy, Karla S.; Norman, Cynthia L.

    1998-01-01

    This paper describes the capabilities and features of the X-33 Range Operations Center at NASA Dryden Flight Research Center. All the unprocessed data will be collected and transmitted over fiber optic lines to the Lockheed Operations Control Center for real-time flight monitoring of the X-33 vehicle. By using the existing capabilities of the Western Aeronautical Test Range, the Range Operations Center will provide the ability to monitor all down-range tracking sites for the Extended Test Range systems. In addition to radar tracking and aircraft telemetry data, the Telemetry and Radar Acquisition and Processing System is being enhanced to acquire vehicle command data, differential Global Positioning System corrections and telemetry receiver signal level status. The Telemetry and Radar Acquisition Processing System provides the flexibility to satisfy all X-33 data processing requirements quickly and efficiently. Additionally, the Telemetry and Radar Acquisition Processing System will run a real-time link margin analysis program. The results of this model will be compared in real-time with actual flight data. The hardware and software concepts presented in this paper describe a method of merging all types of data into a common database for real-time display in the Range Operations Center in support of the X-33 program. All types of data will be processed for real-time analysis and display of the range system status to ensure public safety.

  10. Drogue tracking using 3D flash lidar for autonomous aerial refueling

    NASA Astrophysics Data System (ADS)

    Chen, Chao-I.; Stettner, Roger

    2011-06-01

    Autonomous aerial refueling (AAR) is an important capability for an unmanned aerial vehicle (UAV) to increase its flying range and endurance without increasing its size. This paper presents a novel tracking method that utilizes both 2D intensity and 3D point-cloud data acquired with a 3D Flash LIDAR sensor to establish relative position and orientation between the receiver vehicle and drogue during an aerial refueling process. Unlike classic vision-based sensors, a 3D Flash LIDAR sensor can provide 3D point-cloud data in real time without motion blur, day or night, and is capable of imaging through fog and clouds. The proposed method segments out the drogue through 2D analysis and estimates the center of the drogue from 3D point-cloud data for flight trajectory determination. A level-set front propagation routine is first employed to identify the target of interest and establish its silhouette information. Sufficient domain knowledge, such as the size of the drogue and the expected operable distance, is integrated into our approach to quickly eliminate unlikely target candidates. A statistical analysis along with a random sample consensus (RANSAC) is performed on the target to reduce noise and estimate the center of the drogue after all 3D points on the drogue are identified. The estimated center and drogue silhouette serve as the seed points to efficiently locate the target in the next frame.
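
    The RANSAC center-estimation step follows the usual sample-fit-score pattern: repeatedly fit a model to a minimal random sample and keep the fit with the most inliers. The 2D circle fit, synthetic rim points, and thresholds below are invented for illustration (the paper works on 3D point clouds), so this is a sketch of the technique rather than the paper's implementation.

```python
import numpy as np

def circle_from_3(p):
    # Three points determine a circle: subtracting |x - c|^2 = r^2 pairwise
    # gives a 2x2 linear system in the center c.
    (x1, y1), (x2, y2), (x3, y3) = p
    A = 2.0 * np.array([[x2 - x1, y2 - y1], [x3 - x1, y3 - y1]])
    b = np.array([x2**2 - x1**2 + y2**2 - y1**2,
                  x3**2 - x1**2 + y3**2 - y1**2])
    c = np.linalg.solve(A, b)
    r = np.hypot(*(p[0] - c))
    return c, r

def ransac_circle(points, iters=200, tol=0.05, seed=0):
    rng = np.random.default_rng(seed)
    best, best_inliers = None, -1
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        try:
            c, r = circle_from_3(sample)
        except np.linalg.LinAlgError:
            continue                      # degenerate (collinear) sample
        resid = np.abs(np.hypot(points[:, 0] - c[0],
                                points[:, 1] - c[1]) - r)
        inliers = int((resid < tol).sum())
        if inliers > best_inliers:        # keep the largest consensus set
            best, best_inliers = (c, r), inliers
    return best

# Synthetic drogue rim (center (2, 3), radius 1) plus a few outliers:
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
rim = np.c_[2 + np.cos(theta), 3 + np.sin(theta)]
cloud = np.vstack([rim, [[10.0, 10.0], [0.0, -5.0], [7.0, 1.0]]])
center, radius = ransac_circle(cloud)
```

    Because the consensus count, not a least-squares residual, selects the model, the few outlier points have no influence on the recovered center.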

  11. Automation of fluorescent differential display with digital readout.

    PubMed

    Meade, Jonathan D; Cho, Yong-Jig; Fisher, Jeffrey S; Walden, Jamie C; Guo, Zhen; Liang, Peng

    2006-01-01

    Since its invention in 1992, differential display (DD) has become the most commonly used technique for identifying differentially expressed genes because of its many advantages over competing technologies such as DNA microarray, serial analysis of gene expression (SAGE), and subtractive hybridization. Despite the great impact of the method on biomedical research, there has been a lack of automation of DD technology to increase its throughput and accuracy for systematic gene expression analysis. Most previous DD work has taken a "shot-gun" approach of identifying one gene at a time, with a limited number of polymerase chain reaction (PCR) reactions set up manually, giving DD a low-tech and low-throughput image. We have optimized the DD process with a new platform that incorporates fluorescent digital readout, automated liquid handling, and large-format gels capable of running entire 96-well plates. The resulting streamlined fluorescent DD (FDD) technology offers unprecedented accuracy, sensitivity, and throughput in comprehensive and quantitative analysis of gene expression. These major improvements will allow researchers to find differentially expressed genes of interest, both known and novel, quickly and easily.

  12. Time Series Analysis of the Quasar PKS 1749+096

    NASA Astrophysics Data System (ADS)

    Lam, Michael T.; Balonek, T. J.

    2011-01-01

    Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.
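
    For evenly sampled data, the Autocorrelation Function mentioned above reduces to a normalized self-correlation, and the lag of its first secondary peak recovers the variability timescale. The sinusoidal test signal below is illustrative only; real quasar light curves are irregularly sampled, which tools such as QUI must handle differently.

```python
import numpy as np

def acf(x):
    """Normalized autocorrelation of an evenly sampled series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                              # remove the mean level
    n = len(x)
    r = np.correlate(x, x, mode="full")[n - 1:]   # keep lags 0 .. n-1
    return r / r[0]                               # normalize to ACF(0) = 1

# A signal with a known 20-sample periodicity:
t = np.arange(200)
a = acf(np.sin(2 * np.pi * t / 20))
peak_lag = 10 + int(np.argmax(a[10:31]))          # first non-trivial ACF peak
```

    The Structure Function and Power Spectrum provide complementary views of the same underlying trends, which is why such packages typically compute all three.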

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enders, Alexander L.; Lousteau, Angela L.

    The Desktop Analysis Reporting Tool (DART) is a software package that allows users to easily view and analyze daily files that span long periods. DART gives users the capability to quickly determine the state of health of a radiation portal monitor (RPM), troubleshoot and diagnose problems, and view data in various time frames to perform trend analysis. In short, it converts the data strings written in the daily files into meaningful tables and plots. The standalone version of DART (“soloDART”) utilizes a database engine that is included with the application; no additional installations are necessary. There is also a networked version of DART (“polyDART”) that is designed to maximize the benefit of a centralized data repository while distributing the workload to individual desktop machines. This networked approach requires a more complex database manager, Structured Query Language (SQL) Server; however, SQL Server is not currently provided with DART. Regardless of which version is used, DART will import daily files from RPMs, store the relevant data in its database, and produce reports for status, trend analysis, and reporting purposes.
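
    At its core, a tool like this parses count strings from daily files into per-detector series and flags anomalies for state-of-health review. The two-field line format and the median-based alarm rule below are invented for illustration; real RPM daily files have a much richer layout that this sketch does not attempt to reproduce.

```python
import statistics
from collections import defaultdict

# Hypothetical daily-file lines: "<panel>,<count>" per polling interval.
raw = ["GA,1210", "GB,1198", "GA,1305", "GB,1250", "GA,2900", "GB,1260"]

# Group counts by detector panel, as a table/plot front end would:
counts = defaultdict(list)
for line in raw:
    panel, value = line.split(",")
    counts[panel].append(int(value))

# Flag any panel whose latest reading is far above its running median:
alarms = {p: v[-1] > 2 * statistics.median(v) for p, v in counts.items()}
```

    Once the daily strings are structured this way, trend analysis over arbitrary time frames is just a query over the accumulated per-panel series.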

  14. Laser penetration spike welding: a welding tool enabling novel process and design opportunities

    NASA Astrophysics Data System (ADS)

    Dijken, Durandus K.; Hoving, Willem; De Hosson, J. Th. M.

    2002-06-01

    A novel method for laser welding of sheet metal is presented. This laser spike welding method is capable of bridging large gaps between sheet metal plates. Novel constructions can be designed and manufactured; examples are lightweight metal-epoxy multi-layers and constructions having additional strength with respect to rigidity and impact resistance. The capability to bridge large gaps allows higher dimensional tolerances in production. The required laser systems are commercially available and are easily implemented in existing production lines. The lasers are highly reliable, the resulting spike welds are quickly realized, and the cost price per weld is very low.

  15. Optical frequency standards for gravitational wave detection using satellite velocimetry

    NASA Astrophysics Data System (ADS)

    Vutha, Amar

    2015-04-01

    Satellite Doppler velocimetry, building on the work of Kaufmann and of Estabrook and Wahlquist, is a complementary technique to interferometric methods of gravitational wave detection. This method is based on the fact that the gravitational wave amplitude appears in the apparent Doppler shift of photons propagating from an emitter to a receiver. This apparent Doppler shift can be resolved provided that a frequency standard, capable of quickly averaging down to a high stability, is available. We present a design for a space-capable optical atomic frequency standard, and analyze the sensitivity of satellite Doppler velocimetry for gravitational wave astronomy in the millihertz frequency band.

  16. Development of a simple, self-contained flight test data acquisition system

    NASA Technical Reports Server (NTRS)

    Clarke, R.; Shane, D.; Roskam, J.; Rummer, D. I.

    1982-01-01

    The flight test system described combines state-of-the-art microprocessor technology and high accuracy instrumentation with parameter identification technology which minimize data and flight time requirements. The system was designed to avoid permanent modifications of the test airplane and allow quick installation. It is capable of longitudinal and lateral-directional stability and control derivative estimation. Details of this system, calibration and flight test procedures, and the results of the Cessna 172 flight test program are presented. The system proved easy to install, simple to operate, and capable of accurate estimation of stability and control parameters in the Cessna 172 flight tests.

  17. Aerospace Toolbox--a flight vehicle design, analysis, simulation, and software development environment II: an in-depth overview

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.

    2002-07-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics covered in Part I included flight vehicle models and algorithms and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series takes a more in-depth look at the analysis and simulation capability and provides an update on the toolbox enhancements. It also addresses how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  18. dataMares - An online platform for the fast, effective dissemination of science

    NASA Astrophysics Data System (ADS)

    Johnson, A. F.; Aburto-Oropeza, O.; Moreno-Báez, M.; Giron-Nava, A.; Lopez-Sagástegui, R.; Lopez-Sagástegui, C.

    2016-02-01

    One of the current challenges in public policy development, especially related to natural resource management and conservation, is that there are very few tools that help easily identify and incorporate relevant scientific findings and data into public policy. This can also lead to a repetition of research efforts and the collection of information that in some cases might already exist. The key to addressing this challenge is to develop collaborative research tools which can be used by different sectors of society, including key stakeholder groups, managers, policy makers, and the public. Here we present an "open science" platform capable of handling large data sets and disseminating results to a wide audience quickly. dataMares uses business intelligence software to allow the dynamic presentation of data to a range of users online. In a nutshell, dataMares provides robust and up-to-date scientific information for decision-makers, resource managers, conservation practitioners, fishers, and community members at the regional and national levels. It can also be used in the training of young scientists, and it allows quick and open connections with the journalism industry.

  19. Development of an Immunochromatography Assay (QuickNavi-Ebola) to Detect Multiple Species of Ebolaviruses.

    PubMed

    Yoshida, Reiko; Muramatsu, Shino; Akita, Hiroshi; Saito, Yuji; Kuwahara, Miwa; Kato, Daisuke; Changula, Katendi; Miyamoto, Hiroko; Kajihara, Masahiro; Manzoor, Rashid; Furuyama, Wakako; Marzi, Andrea; Feldmann, Heinz; Mweene, Aaron; Masumu, Justin; Kapeteshi, Jimmy; Muyembe-Tamfum, Jean-Jacques; Takada, Ayato

    2016-10-15

    The latest outbreak of Ebola virus disease (EVD) in West Africa has highlighted the urgent need for the development of rapid and reliable diagnostic assays. We used monoclonal antibodies specific to the ebolavirus nucleoprotein to develop an immunochromatography (IC) assay (QuickNavi-Ebola) for rapid diagnosis of EVD. The IC assay was first evaluated with tissue culture supernatants of infected Vero E6 cells and found to be capable of detecting 10³-10⁴ focus-forming units/mL of ebolaviruses. Using serum samples from experimentally infected nonhuman primates, we confirmed that the assay could detect the viral antigen shortly after disease onset. It was also noted that multiple species of ebolaviruses could be detected by the IC assay. Owing to the simplicity of the assay procedure and absence of requirements for special equipment and training, QuickNavi-Ebola is expected to be a useful tool for rapid diagnosis of EVD. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  20. Online image classification under monotonic decision boundary constraint

    NASA Astrophysics Data System (ADS)

    Lu, Cheng; Allebach, Jan; Wagner, Jerry; Pitta, Brandi; Larson, David; Guo, Yandong

    2015-01-01

    Image classification is a prerequisite for copy quality enhancement in an all-in-one (AIO) device that comprises a printer and a scanner and can be used to scan, copy, and print. Different processing pipelines are provided in an AIO printer. Each of the processing pipelines is designed specifically for one type of input image to achieve the optimal output image quality. A typical approach to this problem is to apply a Support Vector Machine (SVM) to classify the input image and feed it to its corresponding processing pipeline. Online SVM training can help improve classification performance as input images accumulate. At the same time, we want to make a quick decision on the input image to speed up classification, meaning the AIO device sometimes does not need to scan the entire image before making a final decision. These two constraints, online SVM training and quick decision, raise questions regarding: 1) what features are suitable for classification; and 2) how the decision boundary should be controlled in online SVM training. This paper discusses the compatibility of online SVM training and quick decision capability.
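
    Online SVM training of the kind described can be sketched with a Pegasos-style stochastic update on the hinge loss. The 2-D toy features below stand in for whatever image features are actually chosen, and the margin threshold used for the "quick decision" is an illustrative assumption, not the paper's decision rule.

```python
import numpy as np

class OnlineLinearSVM:
    """Pegasos-style online linear SVM (hinge loss, L2 regularization)."""
    def __init__(self, dim, lam=1e-3):
        self.w = np.zeros(dim)
        self.b = 0.0
        self.lam = lam
        self.t = 0

    def partial_fit(self, x, y):             # y in {-1, +1}
        self.t += 1
        eta = 1.0 / (self.lam * self.t)      # decaying learning rate
        self.w *= (1.0 - eta * self.lam)     # regularization shrinkage
        if y * (self.w @ x + self.b) < 1:    # inside the margin: update
            self.w += eta * y * x
            self.b += eta * y

    def decision(self, x):
        return self.w @ x + self.b

clf = OnlineLinearSVM(dim=2)
# Toy stream: one image class near (2, 2), the other near (-2, -2).
for _ in range(5):
    clf.partial_fit(np.array([2.0, 2.0]), +1)
    clf.partial_fit(np.array([-2.0, -2.0]), -1)

# Quick decision: stop scanning further image strips once the running
# score clears a confidence threshold (value chosen for illustration).
score = clf.decision(np.array([3.0, 3.0]))
confident = abs(score) > 1.0
```

    Because each `partial_fit` touches one sample, the classifier can keep learning from user-corrected copies, while the margin test lets the device commit early on easy inputs.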

  1. UT/CSR analysis of earth rotation from Lageos SLR data

    NASA Technical Reports Server (NTRS)

    Tapley, B. D.; Eanes, R. J.; Schutz, B. E.

    1986-01-01

    The 1983-1984 data collected by NASA and stations participating in the Crustal Dynamics Project from satellite laser ranging (SLR) systems are used to generate solutions for the earth polar motion. Solutions obtained using the MERIT Lageos standard data set are compared to operational results based on quick-look data and generated in near real-time, and the capability of Lageos SLR for the determination of earth orientation parameters (EOP) with high temporal resolution is investigated. Finally, the sensitivity of the MERIT campaign results to the number of tracking stations and to changes in the MERIT standard model is evaluated. It is concluded that the departures from the IAU/IUGG MERIT standards do not significantly change the solution and that solutions accurate at the 2 milliarcsec level can be maintained with a network of fewer than 10 appropriately selected stations.

  2. Analysis Of AVIRIS Data From LEO-15 Using Tafkaa Atmospheric Correction

    NASA Technical Reports Server (NTRS)

    Montes, Marcos J.; Gao, Bo-Cai; Davis, Curtiss O.; Moline, Mark

    2004-01-01

    We previously developed an algorithm named Tafkaa for atmospheric correction of remote sensing ocean color data from aircraft and satellite platforms. The algorithm allows quick atmospheric correction of hyperspectral data using lookup tables generated with a modified version of Ahmad & Fraser's vector radiative transfer code. During the past few years we have extended the capabilities of the code. Current modifications include the ability to account for within-scene variation in solar geometry (important for very long scenes) and view geometry (important for wide fields of view). Additionally, versions of Tafkaa have been made for a variety of multi-spectral sensors, including SeaWiFS and MODIS. In this proceeding we present some initial results of atmospheric correction of AVIRIS data from the July 2001 Hyperspectral Coastal Ocean Dynamics Experiment (HyCODE) at LEO-15.
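
    The lookup-table approach amounts to replacing per-pixel radiative-transfer runs with interpolation into precomputed tables. The one-dimensional table, the numerical values, and the simplified forward model below are invented to show the idea; Tafkaa's actual tables span several geometric and atmospheric dimensions.

```python
import numpy as np

# Hypothetical 1-D slice of a precomputed table: path radiance and diffuse
# transmittance versus aerosol optical depth (AOD), from an offline RT code.
aod_grid      = np.array([0.05, 0.10, 0.20, 0.40])
path_radiance = np.array([1.2, 1.9, 3.1, 5.0])    # arbitrary units
transmittance = np.array([0.95, 0.90, 0.82, 0.70])

def water_leaving_radiance(toa, aod):
    """Invert the simplified forward model Ltoa = Lpath + t * Lw
    by interpolating the table instead of re-running the RT code."""
    lpath = np.interp(aod, aod_grid, path_radiance)
    t = np.interp(aod, aod_grid, transmittance)
    return (toa - lpath) / t
```

    Interpolating a table is orders of magnitude cheaper than a radiative-transfer run, which is what makes scene-scale hyperspectral correction quick.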

  3. Solid polymer membrane program

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results of a solid polymer electrolyte fuel cell development program are presented. A failure mechanism was identified, and its resolution, as experienced in small stack testing, was demonstrated. The effort included laboratory analysis and evaluation of a matrix of configurations and operational variables for their effects on the degree of hydrogen fluoride released from the cell and on the degree of blistering/delamination occurring in the reactant inlet areas of the cell, and correlation of these conditions with cell life capabilities. The laboratory evaluation tests were run at conditions intended to accelerate the degradation of the solid polymer electrolyte in order to obtain relative evaluations as quickly as possible. Evaluation of the resolutions for the identified failure mechanism in space shuttle configuration cell assemblies was achieved with the fabrication and life testing of two small stack buildups of four cell assemblies and eight cells each.

  4. The QDP/PLT user's guide

    NASA Technical Reports Server (NTRS)

    Tennant, Allyn F.

    1991-01-01

PLT is a high-level plotting package. A programmer can create a default plot suited for the data being displayed. At run time, users can then interact with the plot, overriding any or all of these defaults. The user is also provided the capability to fit functions to the displayed data. This ability to display, interact with, and fit the data makes PLT a useful tool in the analysis of data. The Quick and Dandy Plotter (QDP) program reads ASCII text files that contain PLT commands and data. Thus, QDP provides an easy way to use the PLT software, and QDP files provide a convenient way to exchange data. The QDP/PLT software is written in standard FORTRAN 77 and has been ported to VAX VMS, SUN UNIX, IBM AIX, NeXT NextStep, and MS-DOS systems.
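The QDP idea of one ASCII file carrying both PLT commands and numeric data can be illustrated with a toy parser. This is a hypothetical sketch (not the actual QDP reader): any line whose fields all parse as numbers is treated as data, and everything else as a command.

```python
def split_qdp(lines):
    """Split a QDP-style file into command lines and numeric data rows.

    Toy heuristic: a line is data if every whitespace-separated field
    parses as a float; otherwise it is treated as a PLT command.
    """
    commands, data = [], []
    for line in lines:
        fields = line.split()
        if not fields or line.lstrip().startswith("!"):
            continue  # skip blank lines and comments
        try:
            data.append([float(f) for f in fields])
        except ValueError:
            commands.append(line.strip())
    return commands, data

sample = [
    "READ SERR 1",       # command: column 1 carries symmetric errors
    "LABEL X Time (s)",  # command: axis label
    "1.0 0.1 10.0",      # data row
    "2.0 0.1 12.5",      # data row
]
cmds, rows = split_qdp(sample)
```

Mixing commands and data in one file is what lets a QDP file serve both as a plot script and as a data-exchange format.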

  5. Virtual Observatory Science Applications

    NASA Technical Reports Server (NTRS)

    McGlynn, Tom

    2005-01-01

    Many Virtual-Observatory-based applications are now available to astronomers for use in their research. These span data discovery, access, visualization and analysis. Tools can quickly gather and organize information from sites around the world to help in planning a response to a gamma-ray burst, help users pick filters to isolate a desired feature, make an average template for z=2 AGN, select sources based upon information in many catalogs, or correlate massive distributed databases. Using VO protocols, the reach of existing software tools and packages can be greatly extended, allowing users to find and access remote information almost as conveniently as local data. The talk highlights just a few of the tools available to scientists, describes how both large and small scale projects can use existing tools, and previews some of the new capabilities that will be available in the next few years.

  6. The Hubble Spectroscopic Legacy Archive

    NASA Astrophysics Data System (ADS)

    Peeples, Molly S.; Tumlinson, Jason; Fox, Andrew; Aloisi, Alessandra; Ayres, Thomas R.; Danforth, Charles; Fleming, Scott W.; Jenkins, Edward B.; Jedrzejewski, Robert I.; Keeney, Brian A.; Oliveira, Cristina M.

    2016-01-01

    With no future space ultraviolet instruments currently planned, the data from the UV spectrographs aboard the Hubble Space Telescope have a legacy value beyond their initial science goals. The Hubble Spectroscopic Legacy Archive will provide to the community new science-grade combined spectra for all publicly available data obtained by the Cosmic Origins Spectrograph (COS) and the Space Telescope Imaging Spectrograph (STIS). These data will be packaged into "smart archives" according to target type and scientific themes to facilitate the construction of archival samples for common science uses. A new "quick look" capability will make the data easy for users to quickly access, assess the quality of, and download for archival science starting in Cycle 24, with the first generation of these products for the FUV modes of COS available online via MAST in early 2016.

  7. Simulating the Composite Propellant Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Williamson, Suzanne; Love, Gregory

    2000-01-01

    There is a strategic interest in understanding how the propellant manufacturing process contributes to military capabilities outside the United States. The paper will discuss how system dynamics (SD) has been applied to rapidly assess the capabilities and vulnerabilities of a specific composite propellant production complex. These facilities produce a commonly used solid propellant with military applications. The authors will explain how an SD model can be configured to match a specific production facility followed by a series of scenarios designed to analyze operational vulnerabilities. By using the simulation model to rapidly analyze operational risks, the analyst gains a better understanding of production complexities. There are several benefits of developing SD models to simulate chemical production. SD is an effective tool for characterizing complex problems, especially the production process where the cascading effect of outages quickly taxes common understanding. By programming expert knowledge into an SD application, these tools are transformed into a knowledge management resource that facilitates rapid learning without requiring years of experience in production operations. It also permits the analyst to rapidly respond to crisis situations and other time-sensitive missions. Most importantly, the quantitative understanding gained from applying the SD model lends itself to strategic analysis and planning.
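The cascading effect of outages that the abstract describes is visible even in a toy stock-and-flow model of one production step. The parameters below are hypothetical illustrations, not drawn from the paper's SD model:

```python
def simulate(days, capacity, outage_days, demand):
    """Toy stock-and-flow model of a single production step.

    The backlog is the stock; daily demand flows in and daily production
    flows out. Hypothetical parameters, chosen to show how an outage
    cascades into a backlog that lingers long after production resumes.
    """
    backlog, history = 0.0, []
    for day in range(days):
        produced = 0.0 if day in outage_days else capacity
        backlog = max(0.0, backlog + demand - produced)
        history.append(backlog)
    return history

# At full utilization (capacity == demand), a 3-day outage leaves a
# backlog that never clears -- there is no slack to catch up with.
h = simulate(days=10, capacity=5.0, outage_days={3, 4, 5}, demand=5.0)
```

A full SD model chains many such stocks (feedstock, intermediates, finished propellant), which is exactly where the cascade "quickly taxes common understanding."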

  8. Interactive visual comparison of multimedia data through type-specific views

    NASA Astrophysics Data System (ADS)

    Burtner, Russ; Bohn, Shawn; Payne, Debbie

    2013-01-01

    Analysts who work with collections of multimedia to perform information foraging understand how difficult it is to connect information across diverse sets of mixed media. The wealth of information from blogs, social media, and news sites often can provide actionable intelligence; however, many of the tools used on these sources of content are not capable of multimedia analysis because they only analyze a single media type. As such, analysts are taxed to keep a mental model of the relationships among each of the media types when generating the broader content picture. To address this need, we have developed Canopy, a novel visual analytic tool for analyzing multimedia. Canopy provides insight into the multimedia data relationships by exploiting the linkages found in text, images, and video co-occurring in the same document and across the collection. Canopy connects derived and explicit linkages and relationships through multiple connected visualizations to aid analysts in quickly summarizing, searching, and browsing collected information to explore relationships and align content. In this paper, we will discuss the features and capabilities of the Canopy system and walk through a scenario illustrating how this system might be used in an operational environment.

  9. Interactive visual comparison of multimedia data through type-specific views

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burtner, Edwin R.; Bohn, Shawn J.; Payne, Deborah A.

    2013-02-05

Analysts who work with collections of multimedia to perform information foraging understand how difficult it is to connect information across diverse sets of mixed media. The wealth of information from blogs, social media, and news sites often can provide actionable intelligence; however, many of the tools used on these sources of content are not capable of multimedia analysis because they only analyze a single media type. As such, analysts are taxed to keep a mental model of the relationships among each of the media types when generating the broader content picture. To address this need, we have developed Canopy, a novel visual analytic tool for analyzing multimedia. Canopy provides insight into the multimedia data relationships by exploiting the linkages found in text, images, and video co-occurring in the same document and across the collection. Canopy connects derived and explicit linkages and relationships through multiple connected visualizations to aid analysts in quickly summarizing, searching, and browsing collected information to explore relationships and align content. In this paper, we will discuss the features and capabilities of the Canopy system and walk through a scenario illustrating how this system might be used in an operational environment. Keywords: Multimedia (Image/Video/Music) Visualization.

  10. Development of 3-Year Roadmap to Transform the Discipline of Systems Engineering

    DTIC Science & Technology

    2010-03-31

    quickly humans could physically construct them. Indeed, magnetic core memory was entirely constructed by human hands until it was superseded by...For their mainframe computers, IBM develops the applications, operating system, computer hardware and microprocessors (off the shelf standard memory ...processor developers work on potential computational and memory pipelines to support the required performance capabilities and use the available transistors

  11. Defence Industrial Strategy

    DTIC Science & Technology

    2005-12-01

    for early clarity, we needed to act quickly. There are three levels to this strategy :  promoting an overall business environment which is attractive...and that the level of influence and attractiveness of MOD business varies by sector and by type of company. But the UK provides a unique environment...defence business environment in a particular country, and at the specific level , to achieve defined outcomes in particular capability or technology

  12. A Randomized Rounding Approach for Optimization of Test Sheet Composing and Exposure Rate Control in Computer-Assisted Testing

    ERIC Educational Resources Information Center

    Wang, Chu-Fu; Lin, Chih-Lung; Deng, Jien-Han

    2012-01-01

    Testing is an important stage of teaching as it can assist teachers in auditing students' learning results. A good test is able to accurately reflect the capability of a learner. Nowadays, Computer-Assisted Testing (CAT) is greatly improving traditional testing, since computers can automatically and quickly compose a proper test sheet to meet user…

  13. A COMPARISON OF AIRFLOW PATTERNS FROM THE QUIC MODEL AND AN ATMOSPHERIC WIND TUNNEL FOR A TWO-DIMENSIONAL BUILDING ARRAY AND A MULTI-CITY BLOCK REGION NEAR THE WORLD TRADE CENTER SITE

    EPA Science Inventory

    Dispersion of pollutants in densely populated urban areas is a research area of clear importance. Currently, few numerical tools exist capable of describing airflow and dispersion patterns in these complex regions in a time efficient manner. (QUIC), Quick Urban & Industrial C...

  14. Modular System to Enable Extravehicular Activity

    NASA Technical Reports Server (NTRS)

    Sargusingh, Miriam J.

    2011-01-01

The ability to perform extravehicular activity (EVA), both human and robotic, has been identified as a key component of space missions to support such operations as assembly and maintenance of space systems (e.g., construction and maintenance of the International Space Station), and unscheduled activities to repair an element of the transportation and habitation systems that can only be accessed externally and via unpressurized areas. In order to make human transportation beyond low Earth orbit (BLEO) practical, efficiencies must be incorporated into the integrated transportation systems to reduce system mass and operational complexity. Affordability is also a key aspect to be considered in space system development; this could be achieved through commonality, modularity and component reuse. Another key aspect identified for the EVA system was the ability to produce flight-worthy hardware quickly to support early missions and near-Earth technology demonstrations. This paper details a conceptual architecture for a modular extravehicular activity system (MEVAS) that would meet these stated needs for EVA capability that is affordable, and that could be produced relatively quickly. Operational concepts were developed to elaborate on the defined needs and define the key capabilities, operational and design constraints, and general timelines. The operational concepts led to a high-level design concept for a module that interfaces with various space transportation elements and contains the hardware and systems required to support human and telerobotic EVA; the module would not be self-propelled and would rely on an interfacing element for consumable resources.
The conceptual architecture was then compared to the EVA systems used on the Space Shuttle Orbiter and the International Space Station to develop high-level design concepts that incorporate opportunities for cost savings through hardware reuse, and quick production through the use of existing technologies and hardware designs. An upgrade option was included to make use of the developing suitport technologies.

  15. Field comparison of OraQuick® ADVANCE Rapid HIV-1/2 antibody test and two blood-based rapid HIV antibody tests in Zambia

    PubMed Central

    2012-01-01

Background Zambia’s national HIV testing algorithm specifies use of two rapid blood-based antibody assays, Determine®HIV-1/2 (Inverness Medical) and, if positive, then Uni-GoldTM Recombigen HIV-1/2 (Trinity Biotech). Little is known about the performance of oral fluid-based HIV testing in Zambia. The aims of this study are two-fold: 1) to compare the diagnostic accuracy (sensitivity and specificity) under field conditions of the OraQuick® ADVANCE® Rapid HIV-1/2 (OraSure Technologies, Inc.) to two blood-based rapid antibody tests currently in use in the Zambia National Algorithm, and 2) to perform a cost analysis of large-scale field testing employing the OraQuick®. Methods This was an operational retrospective analysis of HIV testing and questionnaire data collected in 2010 as part of the ZAMSTAR (Zambia South Africa TB and AIDS reduction) study. Randomly sampled individuals in twelve communities were tested consecutively with the OraQuick® test using oral fluid and with two blood-based rapid HIV tests, Determine® and Uni-GoldTM. A cost analysis of four algorithms was performed from a health systems perspective: 1) Determine® and if positive, then Uni-GoldTM (Determine®/Uni-GoldTM); based on current algorithm, 2) Determine® and if positive, then OraQuick® (Determine®/OraQuick®), 3) OraQuick® and if positive, then Determine® (OraQuick®/Determine®), 4) OraQuick® and if positive, then Uni-GoldTM (OraQuick®/Uni-GoldTM). This information was then used to construct a model using a hypothetical population of 5,000 persons with varying prevalence of HIV infection from 1–30%. Results 4,458 participants received both a Determine® and OraQuick® test. The sensitivity and specificity of the OraQuick® test were 98.7 (95%CI, 97.5–99.4) and 99.8 (95%CI, 99.6–99.9), respectively, when compared to HIV-positive serostatus.
The average unit costs per algorithm were US$3.76, US$4.03, US$7.35, and US$7.67 for Determine®/Uni-GoldTM, Determine®/OraQuick®, OraQuick®/Determine®, and OraQuick®/Uni-GoldTM, respectively, for an HIV prevalence of 15%. Conclusions An alternative HIV testing algorithm could include the OraQuick® test, which had high sensitivity and specificity. The current Determine®/Uni-GoldTM testing algorithm is the least expensive when compared to Determine®/OraQuick®, OraQuick®/Determine®, and OraQuick®/Uni-GoldTM in the Zambian setting. From our field experience, oral fluid-based testing offers many advantages over blood-based testing, especially with self-testing on the horizon. PMID:22871032
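The cost ranking reported above follows from the structure of a serial algorithm: the confirmatory test is run only on initial positives, so the first test's unit cost dominates the average. A sketch of that expected-cost model with illustrative unit costs (not the study's figures):

```python
def serial_cost(cost_first, cost_second, prevalence, specificity_first=0.998):
    """Expected per-person cost of a 'screen, then confirm positives' algorithm.

    Simplified sketch: the first test is assumed to catch every true case
    (sensitivity ~1), so the confirmatory test is run on true positives
    plus the false positives allowed by the first test's specificity.
    """
    p_positive = prevalence + (1.0 - prevalence) * (1.0 - specificity_first)
    return cost_first + p_positive * cost_second

# Illustrative unit costs at 15% prevalence (hypothetical numbers):
cheap_screen_first = serial_cost(2.0, 3.0, 0.15)   # cheaper test screens
costly_screen_first = serial_cost(5.0, 2.0, 0.15)  # costlier test screens
```

Because most people test negative and never reach the second test, putting the cheaper test first wins, which matches the study's finding that the Determine®-first algorithms cost less than the OraQuick®-first ones.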

  16. HiRel - Reliability/availability integrated workstation tool

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Dugan, Joanne B.

    1992-01-01

    The HiRel software tool is described and demonstrated by application to the mission avionics subsystem of the Advanced System Integration Demonstrations (ASID) system that utilizes the PAVE PILLAR approach. HiRel marks another accomplishment toward the goal of producing a totally integrated computer-aided design (CAD) workstation design capability. Since a reliability engineer generally represents a reliability model graphically before it can be solved, the use of a graphical input description language increases productivity and decreases the incidence of error. The graphical postprocessor module HARPO makes it possible for reliability engineers to quickly analyze huge amounts of reliability/availability data to observe trends due to exploratory design changes. The addition of several powerful HARP modeling engines provides the user with a reliability/availability modeling capability for a wide range of system applications all integrated under a common interactive graphical input-output capability.

  17. An engineering and economic evaluation of quick germ-quick fiber process for dry-grind ethanol facilities: analysis.

    PubMed

    Rodríguez, Luis F; Li, Changying; Khanna, Madhu; Spaulding, Aslihan D; Lin, Tao; Eckhoff, Steven R

    2010-07-01

An engineering economic model, which is mass balanced and compositionally driven, was developed to compare the conventional corn dry-grind process and the pre-fractionation process called quick germ-quick fiber (QQ). In this model, documented in a companion article, the distillers dried grains with solubles (DDGS) price was linked with its protein and fiber content as well as with the long-term average relationship with the corn price. The detailed economic analysis showed that the QQ plant retrofitted from a conventional dry-grind ethanol plant reduces the manufacturing cost of ethanol by 13.5 cents/gallon and has a net present value nearly $4 million greater than that of the conventional dry-grind plant at an interest rate of 4% over 15 years. Ethanol and feedstock price sensitivity analysis showed that the QQ plant gains more from ethanol price increases than the conventional dry-grind ethanol plant. An optimistic analysis of the QQ process suggests that the greater value of the modified DDGS would provide greater resistance to fluctuations in corn price for QQ facilities. This model can be used to provide decision support for ethanol producers. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
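The retrofit comparison rests on a standard net present value calculation at the stated 4% rate over 15 years. A sketch of that calculation; the retrofit cost and annual gain below are placeholders, not the study's numbers:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at year 0 (e.g. the retrofit cost)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative figures only (not the study's): an up-front retrofit cost
# followed by 15 years of incremental gains from the lower manufacturing
# cost of ethanol and the higher-value modified DDGS.
retrofit_cost = -8_000_000.0
annual_gain = 1_000_000.0
delta_npv = npv(0.04, [retrofit_cost] + [annual_gain] * 15)
```

At 4% the 15-year annuity factor is about 11.1, so this illustrative retrofit clears roughly $3.1 million of value; substituting a plant's own cash flows yields the kind of comparison the study reports.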

  18. Influence of eating quickly and eating until full on anthropometric gains in girls: A population-based, longitudinal study.

    PubMed

    Ochiai, H; Shirasawa, T; Nanri, H; Nishimura, R; Hoshino, H; Kokaze, A

    2017-11-01

In examining childhood overweight/obesity, there is a need to consider both eating quickly and eating until full. This longitudinal study investigated the influence of eating quickly and/or eating until full on anthropometric variables and becoming overweight/obese among Japanese schoolgirls. Study participants were fourth-grade schoolgirls (aged 9 or 10 years) in Ina Town, Japan. Physical examinations and a questionnaire survey were performed at baseline (fourth grade) and after 3 years (seventh grade). Height, weight, and waist circumference were measured in the physical examinations, while the data on eating quickly and eating until full were collected in the questionnaire survey. Analysis of variance and analysis of covariance were used to compare the differences in each anthropometric variable between fourth and seventh grade among groups. Data on 425 non-overweight/obese schoolgirls in fourth grade were analyzed. Gains in anthropometric variables (body mass index, waist circumference, and waist-to-height ratio) from fourth to seventh grade were significantly larger in the "eating quickly and eating until full" group than in the "not eating quickly and not eating until full" group. In contrast, there were no significant differences in the gains between the "eating quickly or eating until full" group and the "not eating quickly and not eating until full" group. The proportion of overweight/obese girls in seventh grade was higher in the "eating quickly and eating until full" group than in the other groups. Eating quickly and eating until full had a substantial impact on excess gains in anthropometric variables among schoolgirls, suggesting that modifying these eating behaviors may help prevent excess anthropometric gains among non-overweight/obese girls.
Accordingly, school health programs need to focus on not eating quickly and/or not eating until full to prevent overweight/obesity; it is necessary to emphasize "the risk of overweight/obesity associated with these eating behaviors" in schools. © 2017 The Authors. Child: Care, Health and Development Published by John Wiley & Sons Ltd.

  19. [Better performance of Western blotting: quick vs slow protein transfer, blotting membranes and the visualization methods].

    PubMed

    Kong, Ling-Quan; Pu, Ying-Hui; Ma, Shi-Kun

    2008-01-01

To study how the choice of quick vs slow protein transfer, blotting membrane, and visualization method influences the performance of Western blotting. The cellular proteins were extracted from the human breast cell line MDA-MB-231 for analysis with Western blotting using quick (2 h) and slow (overnight) protein transfer, different blotting membranes (nitrocellulose, PVDF and nylon membranes) and different visualization methods (ECL and DAB). In Western blotting with slow and quick protein transfer, the prestained marker presented more distinct bands on the nitrocellulose membrane than on the nylon and PVDF membranes, and the latter two also showed clear bands on the back of the membrane, likely causing confusion, which did not occur with the nitrocellulose membrane. The PVDF membrane allowed slightly clearer visualization of the proteins with the DAB method as compared with the nitrocellulose and nylon membranes, and on the latter two membranes, quick protein transfer was likely to result in somewhat irregular bands in comparison with slow protein transfer. With slow protein transfer and chemiluminescence for visualization, all 3 membranes showed a clear background, while with quick protein transfer, the nylon membrane gave rise to obvious background noise but the other two membranes did not. Different membranes should be selected for immunoblotting according to the actual needs of the experiment. Slow transfer of the proteins onto the membranes often has better effect than quick transfer, and enhanced chemiluminescence is superior to DAB for protein visualization and allows highly specific and sensitive analysis of the protein expressions.

  20. The need for high-quality whole-genome sequence databases in microbial forensics.

    PubMed

    Sjödin, Andreas; Broman, Tina; Melefors, Öjar; Andersson, Gunnar; Rasmusson, Birgitta; Knutsson, Rickard; Forsman, Mats

    2013-09-01

    Microbial forensics is an important part of a strengthened capability to respond to biocrime and bioterrorism incidents to aid in the complex task of distinguishing between natural outbreaks and deliberate acts. The goal of a microbial forensic investigation is to identify and criminally prosecute those responsible for a biological attack, and it involves a detailed analysis of the weapon--that is, the pathogen. The recent development of next-generation sequencing (NGS) technologies has greatly increased the resolution that can be achieved in microbial forensic analyses. It is now possible to identify, quickly and in an unbiased manner, previously undetectable genome differences between closely related isolates. This development is particularly relevant for the most deadly bacterial diseases that are caused by bacterial lineages with extremely low levels of genetic diversity. Whole-genome analysis of pathogens is envisaged to be increasingly essential for this purpose. In a microbial forensic context, whole-genome sequence analysis is the ultimate method for strain comparisons as it is informative during identification, characterization, and attribution--all 3 major stages of the investigation--and at all levels of microbial strain identity resolution (ie, it resolves the full spectrum from family to isolate). Given these capabilities, one bottleneck in microbial forensics investigations is the availability of high-quality reference databases of bacterial whole-genome sequences. To be of high quality, databases need to be curated and accurate in terms of sequences, metadata, and genetic diversity coverage. The development of whole-genome sequence databases will be instrumental in successfully tracing pathogens in the future.

  1. INCA- INTERACTIVE CONTROLS ANALYSIS

    NASA Technical Reports Server (NTRS)

    Bauer, F. H.

    1994-01-01

The Interactive Controls Analysis (INCA) program was developed to provide a user-friendly environment for the design and analysis of linear control systems, primarily feedback control systems. INCA is designed for use with both small- and large-order systems. Using the interactive graphics capability, the INCA user can quickly plot a root locus, frequency response, or time response of either a continuous-time system or a sampled-data system. The system configuration and parameters can be easily changed, allowing the INCA user to design compensation networks and perform sensitivity analysis in a very convenient manner. A journal file capability is included. This stores an entire sequence of commands generated during an INCA session into a file, which can be accessed later. Also included in INCA are a context-sensitive help library, a screen editor, and plot windows. INCA is robust to VAX-specific overflow problems. The transfer function is the basic unit of INCA. Transfer functions are automatically saved and are available to the INCA user at any time. A powerful, user-friendly transfer function manipulation and editing capability is built into the INCA program. The user can do all transfer function manipulations and plotting without leaving INCA, although provisions are made to input transfer functions from data files. By using a small set of commands, the user may compute and edit transfer functions, and then examine these functions by using the ROOT_LOCUS, FREQUENCY_RESPONSE, and TIME_RESPONSE capabilities. Basic input data, including gains, are handled as single-input single-output transfer functions. These functions can be developed using the function editor or by using FORTRAN-like arithmetic expressions.
In addition to the arithmetic functions, special functions are available to 1) compute step, ramp, and sinusoid functions, 2) compute closed loop transfer functions, 3) convert from S plane to Z plane with optional advanced Z transform, and 4) convert from Z plane to W plane and back. These capabilities allow the INCA user to perform block diagram algebraic manipulations quickly for functions in the S, Z, and W domains. Additionally, a versatile digital control capability has been included in INCA. Special plane transformations allow the user to easily convert functions from one domain to another. Other digital control capabilities include: 1) totally independent open loop frequency response analyses on a continuous plant, discrete control system with a delay, 2) advanced Z-transform capability for systems with delays, and 3) multirate sampling analyses. The current version of INCA includes Dynamic Functions (which change when a parameter changes), standard filter generation, PD and PID controller generation, incorporation of the QZ-algorithm (function addition, inverse Laplace), and describing functions that allow the user to calculate the gain and phase characteristics of a nonlinear device. The INCA graphic modes provide the user with a convenient means to document and study frequency response, time response, and root locus analyses. General graphics features include: 1) zooming and dezooming, 2) plot documentation, 3) a table of analytic computation results, 4) multiple curves on the same plot, and 5) displaying frequency and gain information for a specific point on a curve. Additional capabilities in the frequency response mode include: 1) a full complement of graphical methods: Bode magnitude, Bode phase, Bode combined magnitude and phase, Bode strip plots, root contour plots, Nyquist, Nichols, and Popov plots; 2) user selected plot scaling; and 3) gain and phase margin calculation and display.
In the time response mode, additional capabilities include: 1) support for inverse Laplace and inverse Z transforms, 2) support for various input functions, 3) closed loop response evaluation, 4) loop gain sensitivity analyses, 5) intersample time response for discrete systems using the advanced Z transform, and 6) closed loop time response using mixed plane (S, Z, W) operations with delay. A Graphics mode command was added to the current version of INCA, version 3.13, to produce Metafiles (graphic files) of the currently displayed plot. The metafile can be displayed and edited using the QPLOT Graphics Editor and Replotter for Metafiles (GERM) program included with the INCA package. The INCA program is written in Pascal and FORTRAN for interactive or batch execution and has been implemented on a DEC VAX series computer under VMS. Both source code and executable code are supplied for INCA. Full INCA graphics capabilities are supported for various Tektronix 40xx and 41xx terminals; DEC VT graphics terminals; many PC and Macintosh terminal emulators; TEK014 hardcopy devices such as the LN03 Laserprinter; and bit map graphics external hardcopy devices. Also included for the TEK4510 rasterizer users are a multiple copy feature, a wide line feature, and additional graphics fonts. The INCA program was developed in 1985, Version 2.04 was released in 1986, Version 3.00 was released in 1988, and Version 3.13 was released in 1989. An INCA version 2.0X conversion program is included.
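INCA's root-locus capability amounts to tracking closed-loop pole locations as a gain is swept. The idea can be sketched for a simple unity-feedback loop; this is an illustrative Python analogue, not INCA's FORTRAN/Pascal implementation:

```python
import cmath

def closed_loop_poles(gain, a):
    """Closed-loop poles of G(s) = K / (s (s + a)) under unity feedback.

    The characteristic equation is s^2 + a s + K = 0; solving it for one
    value of K gives a single point on the root locus that a tool like
    INCA sweeps out as the gain varies.
    """
    disc = cmath.sqrt(a * a - 4.0 * gain)
    return ((-a + disc) / 2.0, (-a - disc) / 2.0)

# Sweeping the gain moves the poles: real and distinct at low gain
# (overdamped response), breaking away into a complex pair at high gain.
low = closed_loop_poles(1.0, 4.0)   # both poles real
high = closed_loop_poles(8.0, 4.0)  # complex pair at -2 +/- 2j
```

Plotting such pole pairs over a range of gains reproduces the root locus; the same poles drive the frequency and time responses INCA plots in its other modes.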

  2. The EBI search engine: EBI search as a service—making biological data accessible for all

    PubMed Central

    Park, Young M.; Squizzato, Silvano; Buso, Nicola; Gur, Tamer

    2017-01-01

    Abstract We present an update of the EBI Search engine, an easy-to-use fast text search and indexing system with powerful data navigation and retrieval capabilities. The interconnectivity that exists between data resources at EMBL–EBI provides easy, quick and precise navigation and a better understanding of the relationship between different data types that include nucleotide and protein sequences, genes, gene products, proteins, protein domains, protein families, enzymes and macromolecular structures, as well as the life science literature. EBI Search provides a powerful RESTful API that enables its integration into third-party portals, thus providing ‘Search as a Service’ capabilities, which are the main topic of this article. PMID:28472374

  3. Technical Study on Improvement of Endurance Capability of Limit Short-circuit Current of Charge Control SMART Meter

    NASA Astrophysics Data System (ADS)

    Li, W. W.; Du, Z. Z.; Yuan, R. m.; Xiong, D. Z.; Shi, E. W.; Lu, G. N.; Dai, Z. Y.; Chen, X. Q.; Jiang, Z. Y.; Lv, Y. G.

    2017-10-01

The smart meter represents the future development direction of the energy-saving smart grid. The load switch, one of the core parts of a smart meter, must offer high reliability, safety, and the capability to endure the limit short-circuit current. To this end, this paper discusses quick, iteration-free simulation of the relationship between the attraction and counterforce of the load switch, establishes a dual response surface model of attraction and counterforce, and optimizes the design scheme of the load switch for the charge-control smart meter, thus increasing the electromagnetic attraction and spring counterforce. In this way, the paper puts forward a method to improve the withstand capability against the limit short-circuit current.
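A response-surface model of the kind described replaces repeated simulation with a cheap fitted polynomial that can be evaluated instantly during optimization. A one-variable sketch with hypothetical attraction data (the paper's dual surfaces cover both attraction and spring counterforce, over more design variables):

```python
import numpy as np

# Hypothetical sampled data: armature gap (mm) vs electromagnetic
# attraction force (N), as if from a handful of simulation runs.
gap = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
force = np.array([9.8, 6.3, 4.4, 3.2, 2.5])

# Quadratic response surface (1-D here for brevity): F ~ c2 g^2 + c1 g + c0,
# fitted by least squares.
coeffs = np.polyfit(gap, force, deg=2)
predict = np.poly1d(coeffs)

# Once fitted, the surface stands in for the simulator: any gap value can
# be evaluated instantly when balancing attraction against counterforce.
f_mid = float(predict(0.5))
```

An optimizer can then search the fitted surfaces directly, which is what makes the "without iteration" design loop in the abstract practical.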

  4. Advancing NASA's Satellite Control Capabilities: More than Just Better Technology

    NASA Technical Reports Server (NTRS)

    Smith, Danford

    2008-01-01

This viewgraph presentation reviews the work of the Goddard Mission Services Evolution Center (GMSEC) in the development of NASA's satellite control capabilities. The purpose of the presentation is to provide a quick overview of NASA's Goddard Space Flight Center and its approach to coordinating ground system resources and development activities across many different missions. NASA Goddard's work in developing and managing current and future space exploration missions is highlighted. The GMSEC was established to coordinate ground and flight data systems development and services, to create a new standard ground system for many missions, and to reflect the reality that business reengineering and mindset were just as important.

  5. The environment workbench: A design tool for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Rankin, Thomas V.; Wilcox, Katherine G.; Roche, James C.

    1991-01-01

    The environment workbench (EWB) is being developed for NASA by S-CUBED to provide a standard tool that can be used by the Space Station Freedom (SSF) design and user community for requirements verification. The desktop tool will predict and analyze the interactions of SSF with its natural and self-generated environments. A brief review of the EWB design and capabilities is presented. Calculations using a prototype EWB of the on-orbit floating potentials and contaminant environment of SSF are also presented. Both the positive and negative grounding configurations for the solar arrays are examined to demonstrate the capability of the EWB to provide quick estimates of environments, interactions, and system effects.

  6. China’s Near Seas Combat Capabilities (China Maritime Study, Number 11)

    DTIC Science & Technology

    2014-02-01

    Chinese writings stress preemptive attacks on key U.S. power-projection capabilities—including aircraft carriers—prior to or quickly following formal...attack craft consistently stress covert, long-range attacks taking advantage of stealth, surprise, and standoff ranges. A final factor that supports...craft tenders, but this is speculative.56 The 2010 Chinese defense white paper does stress, however, that sea-based

  7. The LHC Experiments

    ScienceCinema

    Lincoln, Don

    2018-01-16

    The Large Hadron Collider or LHC is the world’s biggest particle accelerator, but it can only get particles moving very quickly. To make measurements, scientists must employ particle detectors. There are four big detectors at the LHC: ALICE, ATLAS, CMS, and LHCb. In this video, Fermilab’s Dr. Don Lincoln introduces us to these detectors and gives us an idea of each one’s capabilities.

  8. Military Alliances and Coalitions: Going to War without France

    DTIC Science & Technology

    2008-03-26

    to drive Saddam Hussein’s army from Kuwait, the formal alliance language simply did not exist. The 9/11 attacks highlighted the limitations of static... language does not exist. They have been credited with quickly building purposeful and capable military forces beyond traditional structured alliance... labeled unilateralist for the mostly-American strike against the Taliban in Afghanistan in 2001 and the 2003 regime change in Iraq. 10 The United

  9. Defense Acquisition Research Journal. Volume 21, Number 2, Issue 69

    DTIC Science & Technology

    2014-04-01

    that quickly meets their needs, not a slow and lumbering bureaucracy better suited to the last century. As important, our military men and women...resolution of urgent needs/ONS. Joint organizations and other military services, however, are not included in this table. As reflected in Table 2, multiple...urgent capability shortfall, the process endures. Materiel release is required for all nonexpendable materiel; high-density military expendables

  10. 6th Annual National Small Business Conference

    DTIC Science & Technology

    2009-06-03

    Extension Partnership – MIT Lean Advancement Initiative – Customers • Lean Tools – Value Stream Mapping – Kaizen Events Center for Management...Blue denotes kaizen events Most suppliers did not have in-house lean capability therefore the OEM and customer facilitated the events 36 Center for...Management & Economic Research 37 Kaizen Events • Kaizen is the process of: – Identifying & eliminating waste – as quickly as possible – at the

  11. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2012-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB).
By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team. These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. 
The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.

  12. ImageX: new and improved image explorer for astronomical images and beyond

    NASA Astrophysics Data System (ADS)

    Hayashi, Soichi; Gopu, Arvind; Kotulla, Ralf; Young, Michael D.

    2016-08-01

    The One Degree Imager - Portal, Pipeline, and Archive (ODI-PPA) has included the Image Explorer interactive image visualization tool since it went operational. Portal users were able to quickly open up several ODI images within any HTML5 capable web browser, adjust the scaling, apply color maps, and perform other basic image visualization steps typically done on a desktop client like DS9. However, the original design of the Image Explorer required lossless PNG tiles to be generated and stored for all raw and reduced ODI images thereby taking up tens of TB of spinning disk space even though a small fraction of those images were being accessed by portal users at any given time. It also caused significant overhead on the portal web application and the Apache webserver used by ODI-PPA. We found it hard to merge in improvements made to a similar deployment in another project's portal. To address these concerns, we re-architected Image Explorer from scratch and came up with ImageX, a set of microservices that are part of the IU Trident project software suite, with rapid interactive visualization capabilities useful for ODI data and beyond. We generate a full resolution JPEG image for each raw and reduced ODI FITS image before producing a JPG tileset, one that can be rendered using the ImageX frontend code at various locations as appropriate within a web portal (for example: on tabular image listings, views allowing quick perusal of a set of thumbnails or other image sifting activities). The new design has decreased spinning disk requirements, uses AngularJS for the client side Model/View code (instead of depending on backend PHP Model/View/Controller code previously used), OpenSeaDragon to render the tile images, and uses nginx and a lightweight NodeJS application to serve tile images thereby significantly decreasing the Time To First Byte latency by a few orders of magnitude. 
We plan to extend ImageX for non-FITS images including electron microscopy and radiology scan images, and its featureset to include basic functions like image overlay and colormaps. Users needing more advanced visualization and analysis capabilities could use a desktop tool like DS9+IRAF on another IU Trident project called StarDock, without having to download Gigabytes of FITS image data.
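    The tile-serving arithmetic behind such a viewer can be sketched as a Deep Zoom style pyramid, the layout OpenSeadragon commonly consumes; whether ImageX uses exactly this scheme is an assumption, and the numbers below are purely illustrative.

```python
# Illustrative sketch of Deep Zoom pyramid arithmetic: each level halves
# the image dimensions until reaching 1x1, and each level is cut into
# fixed-size tiles fetched on demand by the viewer.
import math

def pyramid_levels(width, height, tile_size=256):
    """Return (level, w, h, tiles_x, tiles_y) from full resolution down to 1x1."""
    max_level = math.ceil(math.log2(max(width, height)))
    levels = []
    for level in range(max_level, -1, -1):
        scale = 2 ** (max_level - level)
        w = max(1, math.ceil(width / scale))
        h = max(1, math.ceil(height / scale))
        levels.append((level, w, h,
                       math.ceil(w / tile_size), math.ceil(h / tile_size)))
    return levels

for row in pyramid_levels(4096, 4096):
    print(row)
```

    Because only the tiles in view are requested, the server never touches most of the pyramid for a casual pan/zoom session, which is what keeps the NodeJS/nginx tile service lightweight.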

  13. Assessment of IRI-2012, NeQuick-2 and IRI-Plas 2015 models with observed equatorial ionization anomaly in Africa during 2009 sudden stratospheric warming event

    NASA Astrophysics Data System (ADS)

    Bolaji, O. S.; Oyeyemi, E. O.; Adewale, A. O.; Wu, Q.; Okoh, D.; Doherty, P. H.; Kaka, R. O.; Abbas, M.; Owolabi, C.; Jidele, P. A.

    2017-11-01

    In Africa, we assessed the performance of all three options of the International Reference Ionosphere 2012, IRI-2012 (i.e. IRI-2001, IRI-2001COR and IRI-NeQuick), together with NeQuick-2 and IRI-Plas 2015, prior to and during the 2009 sudden stratospheric warming (SSW) event, in predicting equatorial ionization anomaly (EIA) crest locations and magnitudes using total electron content (TEC) from experimental Global Positioning System (GPS) records. We confirmed that the apparent superiority of IRI-Plas 2015 over the other models in predicting the EIA crest locations in the northern hemisphere of Africa is due to discontinuities in the GPS data between ∼8° N and 22° N. None of the models correctly predicted the EIA crest magnitudes or the location of the EIA crests in the southern hemisphere of Africa. The NeQuick-2 model could not predict the EIA crest location in either the northern or the southern hemisphere. The SSW modified the single EIA crest in the northern hemisphere into pre-noon and post-noon crests during the SSW peak phase and significantly reduced the GPS TEC magnitudes over both hemispheres. These SSW effects, and the associated delays in plasma transport to higher latitudes seen in GPS TEC, were absent from all the models. For future improvements of the IRI-2012, NeQuick-2 and IRI-Plas 2015 models, SSW conditions should be included in order to characterize the effect of the lower atmosphere on the ionosphere. EIA trough modeling is present only in the IRI-2001COR and IRI-NeQuick options. In the middle latitudes, none of the models could predict the location of the highest TEC magnitudes, found at RBAY (Richards Bay, South Africa).

  14. VIPER: Visualization Pipeline for RNA-seq, a Snakemake workflow for efficient and complete RNA-seq analysis.

    PubMed

    Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W

    2018-04-12

    RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand its capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Regions) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively with a built-in capacity for customization and expansion.

  15. Hardening surveillance illumination using aircraft antennas

    NASA Astrophysics Data System (ADS)

    Donohoe, J. P.; Taylor, C. D.

    1990-06-01

    Aircraft maintenance depots and main operating bases need to be able to perform quick checks of the electromagnetic pulse (EMP) hardness of their systems without removing them from service for any length of time. Preliminary tests have shown that the onboard HF antennas of the EMP Test-Bed Aircraft (EMPTAC) may be capable of providing the HF excitation required to effectively monitor the EMP hardness of aircraft systems. The surface current and charge distributions on the EMPTAC which result from swept frequency excitation of the HF radio antennas are computed over a range of 0.5 to 100 MHz using various antenna drive configurations. The computational analysis is performed by using two separate frequency-dependent techniques: the method-of-moments technique and the physical optics approximation. These calculations are then compared with the excitation provided from an overhead plane wave and with measured data from EMPTAC tests.

  16. Simulation studies of plasma waves in the electron foreshock - The generation of downshifted oscillations

    NASA Technical Reports Server (NTRS)

    Dum, C. T.

    1990-01-01

    The generation of waves with frequencies downshifted from the plasma frequency, as observed in the electron foreshock, is analyzed by particle simulation. Wave excitation differs fundamentally from the familiar excitation of the plasma eigenmodes by a gentle bump-on-tail electron distribution. Beam modes are destabilized by resonant interaction with bulk electrons, provided the beam velocity spread is very small. These modes are stabilized, starting with the higher frequencies, as the beam is broadened and slowed down by the interaction with the wave spectrum. Initially a very cold beam is also capable of exciting frequencies considerably above the plasma frequency, but such oscillations are quickly stabilized. Low-frequency modes persist for a long time, until the bump in the electron distribution is completely 'ironed' out. This diffusion process also is quite different from the familiar case of well-separated beam and bulk electrons. A quantitative analysis of these processes is carried out.

  17. Development of Moire machine vision

    NASA Technical Reports Server (NTRS)

    Harding, Kevin G.

    1987-01-01

    Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation is developed to be able to make full field range measurement and three dimensional scene analysis.

  18. Microcontroller-based real-time QRS detection.

    PubMed

    Sun, Y; Suppappola, S; Wrublewski, T A

    1992-01-01

    The authors describe the design of a system for real-time detection of QRS complexes in the electrocardiogram based on a single-chip microcontroller (Motorola 68HC811). A systematic analysis of the instrumentation requirements for QRS detection and of the various design techniques is also given. Detection algorithms using different nonlinear transforms for the enhancement of QRS complexes are evaluated by using the ECG database of the American Heart Association. The results show that the nonlinear transform involving multiplication of three adjacent, sign-consistent differences in the time domain gives a good performance and a quick response. When implemented with an appropriate sampling rate, this algorithm is also capable of rejecting pacemaker spikes. The eight-bit single-chip microcontroller provides sufficient throughput and shows a satisfactory performance. Implementation of multiple detection algorithms in the same system improves flexibility and reliability. The low chip count in the design also favors maintainability and cost-effectiveness.
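    The best-performing nonlinear transform described above, multiplying three adjacent sign-consistent first differences, can be sketched directly. The signal values and the edge example below are synthetic, not taken from the AHA database the authors used.

```python
# Sketch of the QRS-enhancing transform: take first differences of the
# signal, and where three adjacent differences share the same sign,
# output their product; otherwise output zero.
def qrs_transform(x):
    d = [x[i] - x[i - 1] for i in range(1, len(x))]
    y = []
    for i in range(2, len(d)):
        a, b, c = d[i - 2], d[i - 1], d[i]
        if (a > 0 and b > 0 and c > 0) or (a < 0 and b < 0 and c < 0):
            y.append(a * b * c)   # steep monotonic slope: large response
        else:
            y.append(0)           # inconsistent signs: suppressed
    return y

# A steep QRS-like edge produces large cubic-scaled output values.
edge = [0, 0, 1, 4, 9, 16, 9, 4, 1, 0, 0]
print(qrs_transform(edge))
```

    An isolated one-sample pacemaker spike produces two large differences of opposite sign, so no run of three sign-consistent differences exists and its output is zero, consistent with the spike-rejection behavior reported above.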

  19. Cascade Distillation System Design for Safety and Mission Assurance

    NASA Technical Reports Server (NTRS)

    Sargusingh, Miriam J.; Callahan, Michael R.

    2015-01-01

    Per the NASA Human Health, Life Support and Habitation System Technology Area 06 report, "crewed missions venturing beyond Low-Earth Orbit (LEO) will require technologies with improved reliability, reduced mass, self-sufficiency, and minimal logistical needs as an emergency or quick-return option will not be feasible." To meet this need, the development team of the second generation Cascade Distillation System (CDS 2.0) opted for a development approach that explicitly incorporates consideration of safety, mission assurance, and autonomy. The CDS 2.0 preliminary design focused on establishing a functional baseline that meets the CDS core capabilities and performance. The critical design phase is now focused on incorporating features through a deliberative process of establishing the system's failure modes and effects, identifying mitigative strategies, and evaluating the merit of the proposed actions through analysis and test. This paper details results of this effort on the CDS 2.0 design.

  20. Reprint of: Combining theory and experiment for X-ray absorption spectroscopy and resonant X-ray scattering characterization of polymers

    DOE PAGES

    Su, Gregory M.; Cordova, Isvar A.; Brady, Michael A.; ...

    2016-11-01

    An improved understanding of fundamental chemistry, electronic structure, morphology, and dynamics in polymers and soft materials requires advanced characterization techniques that are amenable to in situ and operando studies. Soft X-ray methods are especially useful in their ability to non-destructively provide information on specific materials or chemical moieties. Analysis of these experiments, which can be very dependent on X-ray energy and polarization, can quickly become complex. Complementary modeling and predictive capabilities are required to properly probe these critical features. Here in this paper, we present relevant background on this emerging suite of techniques. We focus on how the combination of theory and experiment has been applied and can be further developed to drive our understanding of how these methods probe relevant chemistry, structure, and dynamics in soft materials.

  1. Combining theory and experiment for X-ray absorption spectroscopy and resonant X-ray scattering characterization of polymers

    DOE PAGES

    Su, Gregory M.; Cordova, Isvar A.; Brady, Michael A.; ...

    2016-07-04

    An improved understanding of fundamental chemistry, electronic structure, morphology, and dynamics in polymers and soft materials requires advanced characterization techniques that are amenable to in situ and operando studies. Soft X-ray methods are especially useful in their ability to non-destructively provide information on specific materials or chemical moieties. Analysis of these experiments, which can be very dependent on X-ray energy and polarization, can quickly become complex. Complementary modeling and predictive capabilities are required to properly probe these critical features. Here, we present relevant background on this emerging suite of techniques. Finally, we focus on how the combination of theory and experiment has been applied and can be further developed to drive our understanding of how these methods probe relevant chemistry, structure, and dynamics in soft materials.

  2. Cascade Distillation System Design for Safety and Mission Assurance

    NASA Technical Reports Server (NTRS)

    Sarguisingh, Miriam; Callahan, Michael R.; Okon, Shira

    2015-01-01

    Per the NASA Human Health, Life Support and Habitation System Technology Area 06 report, "crewed missions venturing beyond Low-Earth Orbit (LEO) will require technologies with improved reliability, reduced mass, self-sufficiency, and minimal logistical needs as an emergency or quick-return option will not be feasible".1 To meet this need, the development team of the second generation Cascade Distillation System (CDS 2.0) chose a development approach that explicitly incorporates consideration of safety, mission assurance, and autonomy. The CDS 2.0 preliminary design focused on establishing a functional baseline that meets the CDS core capabilities and performance. The critical design phase is now focused on incorporating features through a deliberative process of establishing the system's failure modes and effects, identifying mitigation strategies, and evaluating the merit of the proposed actions through analysis and test. This paper details results of this effort on the CDS 2.0 design.

  3. Development of Moire machine vision

    NASA Astrophysics Data System (ADS)

    Harding, Kevin G.

    1987-10-01

    Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation is developed to be able to make full field range measurement and three dimensional scene analysis.

  4. Massive problem reports mining and analysis based parallelism for similar search

    NASA Astrophysics Data System (ADS)

    Zhou, Ya; Hu, Cailin; Xiong, Han; Wei, Xiafei; Li, Ling

    2017-05-01

    Massive problem reports and solutions, accumulated over time and continuously collected in XML Spreadsheet (XMLSS) format from enterprises and organizations, record comprehensive descriptions of problems that can help technicians trace problems and their solutions. Effectively managing and analyzing these massive semi-structured data, so as to provide similar problem solutions, support decisions on the immediate problem, and assist product optimization during hardware and software maintenance, is a significant and challenging issue. For this purpose, we built a data management system to manage, mine and analyze these data; its search results are categorized and organized into several categories so that users can quickly find where the results of interest are located. Experimental results demonstrate that this system greatly outperforms a traditional centralized management system in performance and in its adaptive capability for heterogeneous data. Moreover, because topics are re-extracted, each cluster can be described more precisely and reasonably.
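    A similar-search step like the one described can be sketched with plain TF-IDF weighting and cosine similarity; the paper's actual parallel, XMLSS-based pipeline is considerably more elaborate, and the report texts below are invented for illustration.

```python
# Hedged sketch: rank stored problem reports by cosine similarity of
# TF-IDF vectors against a new problem description.
import math
from collections import Counter

def tfidf_vectors(token_lists):
    """Return one sparse TF-IDF dict per tokenized document."""
    n = len(token_lists)
    df = Counter(t for toks in token_lists for t in set(toks))
    return [{t: tf * math.log(n / df[t]) for t, tf in Counter(toks).items()}
            for toks in token_lists]

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def most_similar(query, docs):
    """Index of the stored document most similar to the query string."""
    token_lists = [d.lower().split() for d in docs] + [query.lower().split()]
    vecs = tfidf_vectors(token_lists)
    qv, doc_vecs = vecs[-1], vecs[:-1]
    scores = [cosine(qv, v) for v in doc_vecs]
    return max(range(len(docs)), key=scores.__getitem__)

reports = ["disk failure on server node",
           "printer paper jam error",
           "server disk full warning"]
print(most_similar("server disk failure", reports))
```

    Clustering the stored vectors, and re-extracting topic terms per cluster as the paper describes, would then give each result category a more precise description.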

  5. THE ROLE OF THE CONSEQUENCE MANAGEMENT HOME TEAM IN THE FUKUSHIMA DAIICHI RESPONSE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pemberton, Wendy; Mena, RaJah; Beal, William

    The Consequence Management Home Team is a U.S. Department of Energy/National Nuclear Security Administration asset. It assists a variety of response organizations with modeling; radiological operations planning; field monitoring techniques; and the analysis, interpretation, and distribution of radiological data. These reach-back capabilities are activated quickly to support public safety and minimize the social and economic impact of a nuclear or radiological incident. In the Fukushima Daiichi response, the Consequence Management Home Team grew to include a more broad range of support than was historically planned. From the early days of the response to the continuing involvement in supporting late phase efforts, each stage of the Consequence Management Home Team support had distinct characteristics in terms of management of incoming data streams as well as creation of products. Regardless of stage, the Consequence Management Home Team played a critical role in the Fukushima Daiichi response effort.

  6. LKB1 promotes cell survival by modulating TIF-IA-mediated pre-ribosomal RNA synthesis under uridine downregulated conditions

    PubMed Central

    Liu, Xiuju; Huang, Henry; Wilkinson, Scott C.; Zhong, Diansheng; Khuri, Fadlo R.; Fu, Haian; Marcus, Adam; He, Yulong; Zhou, Wei

    2016-01-01

    We analyzed the mechanism underlying 5-aminoimidazole-4-carboxamide riboside (AICAR) mediated apoptosis in LKB1-null non-small cell lung cancer (NSCLC) cells. Metabolic profile analysis revealed depletion of the intracellular pyrimidine pool after AICAR treatment, but uridine was the only nucleotide precursor capable of rescuing this apoptosis, suggesting the involvement of RNA metabolism. Because half of RNA transcription in cancer is for pre-ribosomal RNA (rRNA) synthesis, which is suppressed by over 90% after AICAR treatment, we evaluated the role of TIF-IA-mediated rRNA synthesis. While the depletion of TIF-IA by RNAi alone promoted apoptosis in LKB1-null cells, the overexpression of a wild-type or a S636A TIF-IA mutant, but not a S636D mutant, attenuated AICAR-induced apoptosis. In LKB1-null H157 cells, pre-rRNA synthesis was not suppressed by AICAR when wild-type LKB1 was present, and cellular fractionation analysis indicated that TIF-IA quickly accumulated in the nucleus in the presence of a wild-type LKB1 but not a kinase-dead mutant. Furthermore, ectopic expression of LKB1 was capable of attenuating AICAR-induced death in AMPK-null cells. Because LKB1 promotes cell survival by modulating TIF-IA-mediated pre-rRNA synthesis, this discovery suggested that targeted depletion of uridine related metabolites may be exploited in the clinic to eliminate LKB1-null cancer cells. PMID:26506235

  7. Environmental impact analysis with the airspace concept evaluation system

    NASA Technical Reports Server (NTRS)

    Augustine, Stephen; Capozzi, Brian; DiFelici, John; Graham, Michael; Thompson, Terry; Miraflor, Raymond M. C.

    2005-01-01

    The National Aeronautics and Space Administration (NASA) Ames Research Center has developed the Airspace Concept Evaluation System (ACES), which is a fast-time simulation tool for evaluating Air Traffic Management (ATM) systems. This paper describes linking to ACES a capability that can analyze the environmental impact of proposed future ATM systems. This provides the ability to quickly evaluate metrics associated with environmental impacts of aviation for inclusion in multi-dimensional cost-benefit analysis of concepts for evolution of the National Airspace System (NAS) over the next several decades. The methodology used here may be summarized as follows: 1) Standard Federal Aviation Administration (FAA) noise and emissions-inventory models, the Noise Impact Routing System (NIRS) and the Emissions and Dispersion Modeling System (EDMS), respectively, are linked to ACES simulation outputs; 2) appropriate modifications are made to ACES outputs to incorporate all information needed by the environmental models (e.g., specific airframe and engine data); 3) noise and emissions calculations are performed for all traffic and airports in the study area for each of several scenarios, as simulated by ACES; and 4) impacts of future scenarios are compared to the current NAS baseline scenario. This paper also provides the results of initial end-to-end, proof-of-concept runs of the integrated ACES and environmental-modeling capability. These preliminary results suggest that growth of the NAS is likely to be impeded by significant environmental impacts that could negatively affect communities throughout the nation.
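    Step 4 of the methodology, comparing each future scenario against the current-NAS baseline, reduces to a per-airport difference for each environmental metric. A minimal sketch follows; the airport codes and emissions values are invented for illustration, not ACES/EDMS outputs.

```python
# Illustrative sketch of a scenario-vs-baseline comparison for one
# emissions metric, keyed by airport. Positive deltas indicate the
# future scenario worsens the metric relative to today's NAS.
def scenario_deltas(baseline, scenario):
    """Per-airport change (scenario minus baseline) for a single metric."""
    return {apt: scenario.get(apt, 0.0) - base
            for apt, base in baseline.items()}

baseline_nox = {"ORD": 120.0, "ATL": 150.0, "DFW": 90.0}  # tons/year, made up
future_nox   = {"ORD": 135.0, "ATL": 140.0, "DFW": 95.0}
print(scenario_deltas(baseline_nox, future_nox))
```

    The same comparison, repeated per metric (noise contours, NOx, CO, etc.) and per scenario, is what feeds the multi-dimensional cost-benefit analysis mentioned above.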

  8. LKB1 promotes cell survival by modulating TIF-IA-mediated pre-ribosomal RNA synthesis under uridine downregulated conditions.

    PubMed

    Liu, Fakeng; Jin, Rui; Liu, Xiuju; Huang, Henry; Wilkinson, Scott C; Zhong, Diansheng; Khuri, Fadlo R; Fu, Haian; Marcus, Adam; He, Yulong; Zhou, Wei

    2016-01-19

    We analyzed the mechanism underlying 5-aminoimidazole-4-carboxamide riboside (AICAR) mediated apoptosis in LKB1-null non-small cell lung cancer (NSCLC) cells. Metabolic profile analysis revealed depletion of the intracellular pyrimidine pool after AICAR treatment, but uridine was the only nucleotide precursor capable of rescuing this apoptosis, suggesting the involvement of RNA metabolism. Because half of RNA transcription in cancer is for pre-ribosomal RNA (rRNA) synthesis, which is suppressed by over 90% after AICAR treatment, we evaluated the role of TIF-IA-mediated rRNA synthesis. While the depletion of TIF-IA by RNAi alone promoted apoptosis in LKB1-null cells, the overexpression of a wild-type or a S636A TIF-IA mutant, but not a S636D mutant, attenuated AICAR-induced apoptosis. In LKB1-null H157 cells, pre-rRNA synthesis was not suppressed by AICAR when wild-type LKB1 was present, and cellular fractionation analysis indicated that TIF-IA quickly accumulated in the nucleus in the presence of a wild-type LKB1 but not a kinase-dead mutant. Furthermore, ectopic expression of LKB1 was capable of attenuating AICAR-induced death in AMPK-null cells. Because LKB1 promotes cell survival by modulating TIF-IA-mediated pre-rRNA synthesis, this discovery suggested that targeted depletion of uridine related metabolites may be exploited in the clinic to eliminate LKB1-null cancer cells.

  9. Processing of on-board recorded data for quick analysis of aircraft performance. [rotor systems research aircraft

    NASA Technical Reports Server (NTRS)

    Michaud, N. H.

    1979-01-01

    A system of independent computer programs for the processing of digitized pulse code modulated (PCM) and frequency modulated (FM) data is described. Information is stored in a set of random files and accessed to produce both statistical and graphical output. The software system is designed primarily to present these reports within a twenty-four hour period for quick analysis of the helicopter's performance.

  10. Novel architecture for data management and control for small satellite

    NASA Astrophysics Data System (ADS)

    Adami, G.; Fossati, D.; Turri, M.

    1995-12-01

    The paper introduces an innovative architecture for the on-board units that provide the data interface, control, and processing capability normally allocated to separate electronics boxes in the data-handling subsystem of a space system. A new solution for spacecraft attitude control has been studied and developed, and this technological growth, in particular concerning the GPS receiver, motivates the novel architecture. The new approach also extends to the small-satellite ground segment as a dedicated development effort. Small and medium satellites are considered an attractive solution for low-cost scientific experimentation, communication, or remote sensing. The functional and performance capability of the studied on-board units and ground segment are assessed in tight conjunction with the evolution of the European and US markets. The design of these units is based on a few simple driving requirements, directly derived from the new scenario: (1) the limited budgets available for space systems; (2) quick mission data return, i.e., low development time achieved through specific, tailored system development tools. Quick availability of data to scientists and users is required without jeopardizing the maximum guaranteed scientific or commercial return. The proposed system is therefore conceived around an architecture with a high degree of modularity (and reuse of an existing library of modules), which keeps costs down and speeds up time to market. The design ground rules are established to achieve the following: (1) the capability to adapt the system interfaces with few impacts, in particular for attitude sensors and actuators, which are tightly mission dependent; (2) easy adaptation of on-board computational performance and memory capacity (including mass memory storage); (3) a hierarchical and modular software design, for the same rationale given for the hardware.

  11. Designed to Win: An Agile Approach to Air Force Command and Control of Cyberspace

    DTIC Science & Technology

    2010-06-01

    capabilities and limitations of technology with a level of control that synchronizes operations, yet allows independent action to take advantage of...was during the Roman Empire. With the exception of the semaphore telegraph and an improved road network, the same methods of communication used by...48 To provide information quickly to the ground commanders, the Aviation Section of the US Signal Corps installed primitive wireless radio sets in

  12. A general-purpose balloon-borne pointing system for solar scientific instruments

    NASA Technical Reports Server (NTRS)

    Polites, M. E.

    1990-01-01

    A general purpose balloonborne pointing system for accommodating a wide variety of solar scientific instruments is described. It is designed for precise pointing, low cost, and quick launch. It offers the option of three-axis control, pitch-yaw-roll, or two-axis control, pitch-yaw, depending on the needs of the solar instrument. Simulation results are presented that indicate good pointing capability at Sun elevation angles ranging from 10 to 80 deg.

  13. High Performance Object-Oriented Scientific Programming in Fortran 90

    NASA Technical Reports Server (NTRS)

    Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.

    1997-01-01

    We illustrate how Fortran 90 supports object-oriented concepts by example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.

  14. On the intrinsic sterility of 3D printing

    PubMed Central

    Flynn, Kaitlin J.; Zaman, Luis; Tung, Emily; Pudlo, Nicholas

    2016-01-01

    3D printers that build objects using extruded thermoplastic are quickly becoming commonplace tools in laboratories. We demonstrate that with appropriate handling, these devices are capable of producing sterile components from a non-sterile feedstock of thermoplastic without any treatment after fabrication. The fabrication process itself results in sterilization of the material. The resulting 3D printed components are suitable for a wide variety of applications, including experiments with bacteria and cell culture. PMID:27920950

  15. Aerospace Expeditionary Force Implementation and the Effect on Team Cohesion

    DTIC Science & Technology

    2002-03-01

    quickly load UTCs during real world conflicts, deployments, or exercises. The TPFDD is the Joint Operation Planning and Execution System data base... system in mission capable status in an efficient manner. Time is saved by not having to establish work/personal relationships and new work processes...there wasn’t enough time for the system to process backfills for shortfalls that were, in reality, not shortfalls at all but a rainbow package. While

  16. Local Observability Analysis of Star Sensor Installation Errors in a SINS/CNS Integration System for Near-Earth Flight Vehicles.

    PubMed

    Yang, Yanqiang; Zhang, Chunxi; Lu, Jiazhen

    2017-01-16

    Strapdown inertial navigation system/celestial navigation system (SINS/CNS) integrated navigation is a fully autonomous, high-precision method that has been widely used to improve the hitting accuracy and quick-reaction capability of near-Earth flight vehicles. The installation errors between the SINS and the star sensors are one of the main factors restricting the actual accuracy of SINS/CNS. In this paper, an integration algorithm based on star vector observations is derived that accounts for the star sensor installation error. The installation error is then accurately estimated by Kalman filtering (KF). Meanwhile, a local observability analysis is performed on the rank of the observability matrix obtained from the linearized observation equation, and the observability conditions are presented and validated: the number of star vectors should be greater than or equal to 2, and the number of attitude adjustments should also be greater than or equal to 2. Simulations indicate that the star sensor installation error is readily observable under the maneuvering condition; moreover, the attitude errors of the SINS are less than 7 arc-seconds. This analysis method and its conclusions are useful in the ballistic trajectory design of near-Earth flight vehicles.
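
    The observability condition described above reduces to a rank test on the stacked, linearized observation matrices collected over the maneuver. The sketch below is a generic numerical check of that kind, not the paper's specific SINS/CNS model; the function names and the toy observation rows are hypothetical:

```python
def matrix_rank(rows, tol=1e-9):
    """Numerical rank via Gaussian elimination with partial pivoting."""
    m = [list(r) for r in rows]
    cols = len(m[0]) if m else 0
    rank = 0
    for col in range(cols):
        # Find a pivot row with a non-negligible entry in this column.
        pivot = None
        for r in range(rank, len(m)):
            if abs(m[r][col]) > tol:
                pivot = r
                break
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        pr = m[rank]
        # Eliminate this column from all rows below the pivot.
        for r in range(rank + 1, len(m)):
            f = m[r][col] / pr[col]
            for c in range(col, cols):
                m[r][c] -= f * pr[c]
        rank += 1
    return rank

def is_locally_observable(H_stacked, n_states):
    """A linearized system is locally observable iff the stacked
    observation matrix spans the full n_states-dimensional state space."""
    return matrix_rank(H_stacked) == n_states
```

    With a 3-state toy system, two independent observation rows leave the state space under-determined, while a third independent row makes the system observable, mirroring the "at least 2 star vectors, at least 2 attitude adjustments" condition in spirit.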

  17. Influence of weld-induced residual stresses on the hysteretic behavior of a girth-welded circular stainless steel tube

    NASA Astrophysics Data System (ADS)

    Lee, Chin-Hyung; Nguyen Van Do, Vuong; Chang, Kyong-Ho; Jeon, Jun-Tai; Um, Tae-Hwan

    2018-04-01

    The present study attempts to characterize the relevance of welding residual stresses to the hysteretic behaviour of a girth-welded circular stainless steel tube under cyclic mechanical loadings. Finite element (FE) thermal simulation of the girth butt welding process is first performed to identify the weld-induced residual stresses by using the one-way coupled three-dimensional (3-D) thermo-mechanical FE analysis method. 3-D elastic-plastic FE analysis equipped with the cyclic plasticity constitutive model capable of describing the cyclic response is next carried out to scrutinize the effects that the residual stresses have on the hysteretic performance of the girth-welded steel tube exposed to cyclic axial loading, which takes the residual stresses and plastic strains calculated from the preceding thermo-mechanical analysis as the initial condition. The analytical results demonstrate that the residual stresses bring about premature yielding and deterioration of the load carrying capacity in the elastic and the transition load ranges, whilst the residual stress effect is wiped out quickly in the plastic load domain since the residual stresses are nearly wholly relaxed after application of the cyclic plastic loading.

  18. Life without water: cross-resistance of anhydrobiotic cell line to abiotic stresses

    NASA Astrophysics Data System (ADS)

    Gusev, Oleg

    2016-07-01

    Anhydrobiosis is an intriguing phenomenon: the natural ability of some organisms to resist water loss. The larvae of Polypedilum vanderplanki, the sleeping chironomid, are the largest and most complex anhydrobionts known to date. The larvae can survive a variety of abiotic stresses, including the outer space environment. Recently a cell line (Pv11), derived from the embryonic mass of the chironomid, was established. The cells, initially sensitive to desiccation, are capable of "induced" anhydrobiosis: resistance to desiccation can be developed by pre-treating the cells with trehalose followed by quick desiccation. We have further conducted a complex analysis of the whole-genome transcriptional response of Pv11 cells to different abiotic stresses, including oxidative stress and irradiation. Comparative analysis showed that the gene set responsible for the formation of desiccation resistance (ARID regions in the genome) is also activated in response to other types of stress and likely contributes to a general enhancement of the cells' resistance to harsh environments. We have further demonstrated that the cells are able to protect recombinant proteins from the harmful effects of desiccation.

  19. C3: A Command-line Catalogue Cross-matching tool for modern astrophysical survey data

    NASA Astrophysics Data System (ADS)

    Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio

    2017-06-01

    In the current data-driven science era, data analysis techniques must evolve quickly to cope with data whose dimensions have grown to the Petabyte scale. In particular, because modern astrophysics is based on multi-wavelength data organized into large catalogues, astronomical catalogue cross-matching methods, whose cost depends strongly on catalogue size, must ensure efficiency, reliability, and scalability. Furthermore, multi-band data are archived and reduced in different ways, so the resulting catalogues may differ from each other in format, resolution, data structure, etc., requiring the greatest possible generality of cross-matching features. We present C3 (Command-line Catalogue Cross-match), a multi-platform application designed to efficiently cross-match massive catalogues from modern surveys. Conceived as a stand-alone command-line process or a module within a generic data reduction/analysis pipeline, it provides maximum flexibility in terms of portability, configuration, coordinate systems, and cross-matching types, ensuring high performance by using a multi-core parallel processing paradigm and a sky-partitioning algorithm.
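
    The core operation such a tool accelerates, matching each source in one catalogue to its nearest neighbour in another within an angular radius, can be sketched as follows. The declination-strip partitioning here is a simplified stand-in for C3's actual sky-partitioning algorithm (and it ignores RA wrap-around); all names are hypothetical:

```python
import math
from collections import defaultdict

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees via the haversine formula."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    h = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(h)))

def cross_match(cat_a, cat_b, radius_deg):
    """Match each (ra, dec) source in cat_a to its nearest cat_b source
    within radius_deg. Partitioning cat_b into declination strips of width
    radius_deg limits each query to three adjacent strips."""
    strip = max(radius_deg, 1e-6)
    strips = defaultdict(list)
    for j, (ra, dec) in enumerate(cat_b):
        strips[int(dec // strip)].append(j)
    matches = {}
    for i, (ra, dec) in enumerate(cat_a):
        band = int(dec // strip)
        best, best_sep = None, radius_deg
        for k in (band - 1, band, band + 1):  # only nearby strips
            for j in strips.get(k, []):
                sep = angular_sep_deg(ra, dec, *cat_b[j])
                if sep <= best_sep:
                    best, best_sep = j, sep
        if best is not None:
            matches[i] = best
    return matches
```

    The partitioning turns an all-pairs comparison into a near-linear scan for realistic sky densities, which is the same design pressure (catalogue-size dependence) the abstract identifies.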

  20. Space rescue system definition (system performance analysis and trades)

    NASA Astrophysics Data System (ADS)

    Housten, Sam; Elsner, Tim; Redler, Ken; Svendsen, Hal; Wenzel, Sheri

    This paper addresses key technical issues involved in the system definition of the Assured Crew Return Vehicle (ACRV). The perspective on these issues is that of a prospective ACRV contractor, performing system analysis and trade studies. The objective of these analyses and trade studies is to develop the recovery vehicle system concept and top level requirements. The starting point for this work is the definition of the set of design missions for the ACRV. This set of missions encompasses three classes of contingency/emergency (crew illness/injury, space station catastrophe/failure, transportation element catastrophe/failure). The need is to provide a system to return Space Station crew to Earth quickly (less than 24 hours) in response to randomly occurring contingency events over an extended period of time (30 years of planned Space Station life). The main topics addressed and characterized in this paper include the following: Key Recovery (Rescue) Site Access Considerations; Rescue Site Locations and Distribution; Vehicle Cross Range vs Site Access; On-orbit Loiter Capability and Vehicle Design; and Water vs. Land Recovery.

  1. The emergence of spatial cyberinfrastructure.

    PubMed

    Wright, Dawn J; Wang, Shaowen

    2011-04-05

    Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge.

  2. The emergence of spatial cyberinfrastructure

    PubMed Central

    Wright, Dawn J.; Wang, Shaowen

    2011-01-01

    Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge. PMID:21467227

  3. Exploitation of multi-temporal Earth Observation imagery for monitoring land cover change in mining sites

    NASA Astrophysics Data System (ADS)

    Petropoulos, G.; Partsinevelos, P.; Mitraka, Z.

    2012-04-01

    Surface mining has been shown to cause intensive environmental degradation in terms of landscape, vegetation and biological communities. Nowadays, the commercial availability of remote sensing imagery at high spatiotemporal scales has dramatically improved our ability to monitor surface mining activity and evaluate its impact on the environment and society. In this study we investigate the potential use of Landsat TM imagery combined with diverse classification techniques, namely artificial neural networks and support vector machines, for delineating mining exploration and assessing its effect on vegetation in various surface mining sites on the Greek island of Milos. Assessment of the mining impact in the study area is validated through the analysis of available QuickBird imagery acquired nearly concurrently with the TM overpasses. Results indicate the capability of the TM sensor, combined with the image analysis applied herein, to serve as an economically viable solution for providing rapid, regular information on mining activity and its impact on the local environment. Keywords: mining environmental impact, remote sensing, image classification, change detection, land reclamation, support vector machines, neural networks.

  4. Rapid screening of basic colorants in processed vegetables through mass spectrometry using an interchangeable thermal desorption electrospray ionization source.

    PubMed

    Chao, Yu-Ying; Chen, Yen-Ling; Lin, Hong-Yi; Huang, Yeou-Lih

    2018-06-20

    Thermal desorption electrospray ionization/mass spectrometry (TD-ESI-MS) employing a quickly interchangeable ionization source is a relatively new ambient ionization mass spectrometric technique that has had, to date, only a limited number of applications related to food safety control. With reallocation of resources, this direct-analysis technique has had wider use in food analysis when operated in dual-working mode (pretreatment-free qualitative screening and conventional quantitative confirmation) after switching to an ambient ionization source from a traditional atmospheric pressure ionization source. Herein, we describe the benefits and challenges associated with the use of a TD-ESI source to detect adulterants in processed vegetables (PVs), as a proof-of-concept for the detection of basic colorants. While TD-ESI can offer direct qualitative screening analyses for PVs with detection capabilities lower than those provided with liquid chromatography/UV detection within 30 s, the use of TD-ESI for semi-quantification is applicable only for homogeneous food matrices. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Distributed collaborative environments for predictive battlespace awareness

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Situational assessment is crucial in understanding the battlespace. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulation, and analysis tools. Decision support technologies can semi-automate activities, such as analysis and planning, that have a reasonably well-defined process, and provide machine-level interfaces to refine the myriad of information that the commander must fuse. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for sharing and exchanging data, information, knowledge, and actions. This paper describes ongoing AFRL research efforts in applying distributed collaborative environments to predictive battlespace awareness.

  6. Velo and REXAN - Integrated Data Management and High Speed Analysis for Experimental Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleese van Dam, Kerstin; Carson, James P.; Corrigan, Abigail L.

    2013-01-10

    The Chemical Imaging Initiative at the Pacific Northwest National Laboratory (PNNL) is creating a 'Rapid Experimental Analysis' (REXAN) Framework, based on the concept of reusable component libraries. REXAN allows developers to quickly compose and customize high throughput analysis pipelines for a range of experiments, as well as supporting the creation of multi-modal analysis pipelines. In addition, PNNL has coupled REXAN with its collaborative data management and analysis environment Velo to create easy to use data management and analysis environments for experimental facilities. This paper discusses the benefits of Velo and REXAN in the context of three examples: (1) PNNL high-resolution mass spectrometry: reducing analysis times from hours to seconds, while enabling the analysis of much larger data samples (100 KB to 40 GB); (2) ALS X-ray tomography: reducing analysis times of combined STXM and EM data collected at the ALS from weeks to minutes, decreasing manual work and increasing the data volumes that can be analysed in a single step; (3) multi-modal nano-scale analysis of STXM and TEM data: providing a semi-automated process for particle detection. The creation of REXAN has significantly shortened the development time for these analysis pipelines. The integration of Velo and REXAN has significantly increased the scientific productivity of the instruments and their users by creating easy to use data management and analysis environments with greatly reduced analysis times and improved analysis capabilities.

  7. Quick analysis of optical spectra to quantify epidermal melanin and papillary dermal blood content of skin.

    PubMed

    Jacques, Steven L

    2015-04-01

    This paper presents a practical approach for assessing the melanin and blood content of the skin from total diffuse reflectance spectra, R(λ), where λ is wavelength. A quick spectral analysis using just three wavelengths (585 nm, 700 nm and 800 nm) is presented, based on the 1985 work of Kollias and Baquer, who documented epidermal melanin in skin using the slope of optical density (OD) between 620 nm and 720 nm. The paper describes the non-rectilinear character of such a quick analysis, and shows that almost any choice of two wavelengths in the 600-900 nm range can characterize melanin. Extrapolation of the melanin slope to 585 nm serves as a baseline for subtraction from the OD (585 nm) to yield a blood perfusion score. Monte Carlo simulations created spectral data for a skin model with epidermis, papillary dermis and reticular dermis to illustrate the analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
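
    The three-wavelength scheme described in the abstract can be sketched as follows. This is an illustrative reading of the method, not the paper's exact formulas: the function name is hypothetical, and the melanin slope is taken here between 700 nm and 800 nm (the abstract notes almost any pair in the 600-900 nm range works):

```python
import math

def quick_skin_scores(R585, R700, R800):
    """Illustrative three-wavelength analysis of total diffuse reflectance.

    Converts reflectance to optical density OD = -log10(R), estimates the
    melanin slope from OD(700 nm) and OD(800 nm), extrapolates that slope
    back to 585 nm as a melanin-only baseline, and scores blood perfusion
    as the excess of the measured OD(585 nm) over that baseline.
    """
    od585 = -math.log10(R585)
    od700 = -math.log10(R700)
    od800 = -math.log10(R800)
    # Melanin slope (OD per nm) over the 700-800 nm span,
    # where melanin dominates the absorption.
    melanin_slope = (od700 - od800) / (800.0 - 700.0)
    # Extrapolate the melanin line from 700 nm back to 585 nm.
    od585_baseline = od700 + melanin_slope * (700.0 - 585.0)
    blood_score = od585 - od585_baseline
    return melanin_slope, blood_score
```

    A skin site with extra blood absorption at 585 nm then yields a positive blood score, while the melanin slope is largely insensitive to it.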

  8. The on-site quality-assurance system for Hyper Suprime-Cam: OSQAH

    NASA Astrophysics Data System (ADS)

    Furusawa, Hisanori; Koike, Michitaro; Takata, Tadafumi; Okura, Yuki; Miyatake, Hironao; Lupton, Robert H.; Bickerton, Steven; Price, Paul A.; Bosch, James; Yasuda, Naoki; Mineo, Sogo; Yamada, Yoshihiko; Miyazaki, Satoshi; Nakata, Fumiaki; Koshida, Shintaro; Komiyama, Yutaka; Utsumi, Yousuke; Kawanomoto, Satoshi; Jeschke, Eric; Noumaru, Junichi; Schubert, Kiaina; Iwata, Ikuru; Finet, Francois; Fujiyoshi, Takuya; Tajitsu, Akito; Terai, Tsuyoshi; Lee, Chien-Hsiu

    2018-01-01

    We have developed an automated quick data analysis system for data quality assurance (QA) for Hyper Suprime-Cam (HSC). The system was commissioned in 2012-2014, and has been offered for general observations, including the HSC Subaru Strategic Program, since 2014 March. The system provides observers with data quality information, such as seeing, sky background level, and sky transparency, based on quick analysis as data are acquired. Quick-look images and validation of image focus are also provided through an interactive web application. The system is responsible for the automatic extraction of QA information from acquired raw data into a database, to assist with observation planning, assess progress of all observing programs, and monitor long-term efficiency variations of the instrument and telescope. Enhancements of the system are being planned to facilitate final data analysis, to improve the HSC archive, and to provide legacy products for astronomical communities.

  9. AstroImageJ: Image Processing and Photometric Extraction for Ultra-precise Astronomical Light Curves

    NASA Astrophysics Data System (ADS)

    Collins, Karen A.; Kielkopf, John F.; Stassun, Keivan G.; Hessman, Frederic V.

    2017-02-01

    ImageJ is a graphical user interface (GUI) driven, public domain, Java-based, software package for general image processing traditionally used mainly in life sciences fields. The image processing capabilities of ImageJ are useful and extendable to other scientific fields. Here we present AstroImageJ (AIJ), which provides an astronomy specific image display environment and tools for astronomy specific image calibration and data reduction. Although AIJ maintains the general purpose image processing capabilities of ImageJ, AIJ is streamlined for time-series differential photometry, light curve detrending and fitting, and light curve plotting, especially for applications requiring ultra-precise light curves (e.g., exoplanet transits). AIJ reads and writes standard Flexible Image Transport System (FITS) files, as well as other common image formats, provides FITS header viewing and editing, and is World Coordinate System aware, including an automated interface to the astrometry.net web portal for plate solving images. AIJ provides research grade image calibration and analysis tools with a GUI driven approach, and easily installed cross-platform compatibility. It enables new users, even at the level of undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data with one tightly integrated software package.

  10. Noninvasive ultrasonic examination technology in support of counter-terrorism and drug interdiction activities: the acoustic inspection device (AID)

    NASA Astrophysics Data System (ADS)

    Diaz, Aaron A.; Burghard, Brion J.; Skorpik, James R.; Shepard, Chester L.; Samuel, Todd J.; Pappas, Richard A.

    2003-07-01

    The Pacific Northwest National Laboratory (PNNL) has developed a portable, battery-operated, handheld ultrasonic device that provides non-invasive container interrogation and material identification capabilities. The acoustic inspection device (AID) works by launching ultrasonic pulses (0.1 to 5 MHz) into a container or material. The return echoes from these pulses are analyzed in terms of time-of-flight and frequency content to extract physical property measurements (the acoustic velocity and attenuation coefficient) of the material under test. The AID performs an automated analysis of the return echoes to identify the material and detect contraband in the form of submerged packages and concealed compartments in liquid-filled containers and solid-form commodities. An inspector can quickly interrogate outwardly innocuous commodity items such as shipping barrels, tanker trucks, and metal ingots. The AID can interrogate containers ranging from approximately 6 inches to over 96 inches in diameter and allows the inspector to sort liquid and material types into groups of like and unlike; a powerful method for discovering corrupted materials or mis-marked containers co-mingled in large shipments. This manuscript describes the functionality, capabilities and measurement methodology of the technology as it relates to homeland security applications.
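
    The property extraction the abstract describes, velocity from time-of-flight and attenuation from echo decay, can be sketched as below. This is a textbook pulse-echo calculation, not PNNL's implementation; the function names and the tiny reference table are hypothetical:

```python
import math

def extract_properties(diameter_m, round_trip_time_s, echo1_amp, echo2_amp):
    """Illustrative pulse-echo property extraction.

    Acoustic velocity follows from the round trip across a container of
    known inner diameter; the attenuation coefficient follows from the
    amplitude decay between two successive back-wall echoes, the second
    of which has travelled one extra round trip (2 * diameter).
    """
    velocity = 2.0 * diameter_m / round_trip_time_s           # m/s
    extra_path = 2.0 * diameter_m                             # m
    attenuation = -math.log(echo2_amp / echo1_amp) / extra_path  # Np/m
    return velocity, attenuation

# Hypothetical reference signatures: (velocity m/s, attenuation Np/m).
REFERENCE = {"water": (1480.0, 0.03), "ethanol": (1160.0, 0.05)}

def classify(velocity, attenuation, tol=50.0):
    """Return the reference material whose velocity lies within tol m/s."""
    for name, (v_ref, _a_ref) in REFERENCE.items():
        if abs(velocity - v_ref) <= tol:
            return name
    return "unknown"
```

    Sorting containers into "like and unlike" groups, as the abstract describes, then amounts to clustering these (velocity, attenuation) pairs rather than matching against a fixed table.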

  11. Using MCBEND for neutron or gamma-ray deterministic calculations

    NASA Astrophysics Data System (ADS)

    Geoff, Dobson; Adam, Bird; Brendan, Tollit; Paul, Smith

    2017-09-01

    MCBEND 11 is the latest version of the general radiation transport Monte Carlo code from AMEC Foster Wheeler's ANSWERS® Software Service. MCBEND is well established in the UK shielding community for radiation shielding and dosimetry assessments. MCBEND supports a number of acceleration techniques, for example the use of an importance map in conjunction with Splitting/Russian Roulette. MCBEND has a well established automated tool to generate this importance map, commonly referred to as the MAGIC module using a diffusion adjoint solution. This method is fully integrated with the MCBEND geometry and material specification, and can easily be run as part of a normal MCBEND calculation. An often overlooked feature of MCBEND is the ability to use this method for forward scoping calculations, which can be run as a very quick deterministic method. Additionally, the development of the Visual Workshop environment for results display provides new capabilities for the use of the forward calculation as a productivity tool. In this paper, we illustrate the use of the combination of the old and new in order to provide an enhanced analysis capability. We also explore the use of more advanced deterministic methods for scoping calculations used in conjunction with MCBEND, with a view to providing a suite of methods to accompany the main Monte Carlo solver.

  12. Performance monitoring algorithm for optimizing electrical power generated by using photovoltaic system

    NASA Astrophysics Data System (ADS)

    Pradeep, M. V. K.; Balbir, S. M. S.; Norani, M. M.

    2016-11-01

    Demand for electricity in Malaysia has seen a substantial hike in light of the nation's rapid economic development. The current method of generating electricity is through the combustion of fossil fuels, which has detrimental effects on the environment besides causing social and economic strain due to its highly volatile prices. Thus the need for a sustainable energy source is paramount, and one that is quickly gaining acceptance is solar energy. However, due to the various environmental and geographical factors that affect the generation of solar electricity, solar electricity generating systems (SEGS) are unable to compete with the high conversion efficiencies of conventional energy sources. In order to effectively monitor an SEGS, this study proposes a performance monitoring system that is capable of detecting drops in the system's performance across parallel networks through a diagnostic mechanism. The performance monitoring system consists of a microcontroller connected to relevant sensors for data acquisition. The acquired data are transferred to a microcomputer for software-based monitoring and analysis. In order to enhance the interception of sunlight by the SEGS, a sensor-based sun tracking system is interfaced to the same controller to allow the PV panel to maneuver itself autonomously to the angle of maximum sunlight exposure.

  13. The Hubble Spectroscopic Legacy Archive

    NASA Astrophysics Data System (ADS)

    Peeples, M.; Tumlinson, J.; Fox, A.; Aloisi, A.; Fleming, S.; Jedrzejewski, R.; Oliveira, C.; Ayres, T.; Danforth, C.; Keeney, B.; Jenkins, E.

    2017-04-01

    With no future space ultraviolet instruments currently planned, the data from the UV spectrographs aboard the Hubble Space Telescope have a legacy value beyond their initial science goals. The goal of the Hubble Spectroscopic Legacy Archive (HSLA) is to provide to the community new science-grade combined spectra for all publicly available data obtained by the Cosmic Origins Spectrograph (COS) and the Space Telescope Imaging Spectrograph (STIS). These data are packaged into "smart archives" according to target type and scientific themes to facilitate the construction of archival samples for common science uses. A new "quick look" capability makes the data easy for users to quickly access, assess the quality of, and download for archival science. The first generation of these products for the far-ultraviolet (FUV) modes of COS was made available online via the Mikulski Archive for Space Telescopes (MAST) in early 2016 and updated in early 2017; future releases will include COS/NUV and STIS/UV data.

  14. Review of nanostructured devices for thermoelectric applications

    PubMed Central

    2014-01-01

    A big research effort is currently dedicated to the development of thermoelectric devices capable of direct thermal-to-electrical energy conversion, aiming at efficiencies as high as possible. These devices are very attractive for many applications in the fields of energy recovery and green energy harvesting. In this paper, after a quick summary of the fundamental principles of thermoelectricity, the main characteristics of materials needed for high-efficiency thermoelectric conversion will be discussed, and a quick review of the most promising materials currently under development will be given. This review paper will put a particular emphasis on nanostructured silicon, which represents a valid compromise between good thermoelectric properties on one side and material availability, sustainability, and technological feasibility on the other. The most important bottom-up and top-down nanofabrication techniques for large-area silicon nanowire arrays, to be used for high-efficiency thermoelectric devices, will be presented and discussed. PMID:25247111

  15. LTCP 2D Graphical User Interface. Application Description and User's Guide

    NASA Technical Reports Server (NTRS)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  16. Thioaptamer Diagnostic System (TDS)

    NASA Technical Reports Server (NTRS)

    Yang, Xianbin

    2015-01-01

    AM Biotechnologies, LLC, in partnership with Sandia National Laboratories, has developed a diagnostic device that quickly detects sampled biomarkers. The TDS quickly quantifies clinically relevant biomarkers using only microliters of a single sample. The system combines ambient-stable, long shelf-life affinity assays with handheld, microfluidic gel electrophoresis affinity assay quantification technology. The TDS is easy to use, operates in microgravity, and permits simultaneous quantification of 32 biomarkers. In Phase I of the project, the partners demonstrated that a thioaptamer assay used in the microfluidic instrument could quantify a specific biomarker in serum in the low nanomolar range. The team also identified novel affinity agents to bone-specific alkaline phosphatase (BAP) and demonstrated their ability to detect BAP with the microfluidic instrument. In Phase II, AM Biotech expanded the number of ambient affinity agents and demonstrated a TDS prototype. In the long term, the clinical version of the TDS will provide a robust, flight-tested diagnostic capability for space exploration missions.

  17. Building complex simulations rapidly using MATRIX(x): The Space Station redesign

    NASA Technical Reports Server (NTRS)

    Carrington, C. K.

    1994-01-01

    MSFC's quick response to the Space Station redesign effort last year required the development of a computer simulation to model the attitude and station-keeping dynamics of a complex body with rotating solar arrays in orbit around the Earth. The simulation was written using a rapid-prototyping graphical simulation and design tool called MATRIX(x) and provided the capability to quickly remodel complex configuration changes by icon manipulation using a mouse. The simulation determines time-dependent inertia properties, and models forces and torques from gravity-gradient, solar radiation, and aerodynamic disturbances. Surface models are easily built from a selection of beams, plates, tetrahedrons, and cylinders. An optimization scheme was written to determine the torque equilibrium attitudes that balance gravity-gradient and aerodynamic torques over an orbit, and propellant-usage estimates were determined. The simulation has been adapted to model the attitude dynamics for small spacecraft.

  18. Study on the adjustment capability of the excitation system located inside superconducting machine electromagnetic shield

    NASA Astrophysics Data System (ADS)

    Xia, D.; Xia, Z.

    2017-12-01

    The ability of the excitation system to adjust quickly plays a very important role in maintaining the normal operation of superconducting machines and power systems. However, the eddy currents in the electromagnetic shield of superconducting machines hinder changes in the exciting magnetic field and weaken the adjustment capability of the excitation system. To analyze this problem, a finite element calculation model for the transient electromagnetic field with moving parts is established. The effects of three different electromagnetic shields on the exciting magnetic field are analyzed using the finite element method. The results show that the electromagnetic shield hinders the field changes significantly: the better its conductivity, the greater the effect on the superconducting machine excitation.

  19. In situ precise electrospinning of medical glue fibers as nonsuture dural repair with high sealing capability and flexibility.

    PubMed

    Lv, Fu-Yan; Dong, Rui-Hua; Li, Zhao-Jian; Qin, Chong-Chong; Yan, Xu; He, Xiao-Xiao; Zhou, Yu; Yan, Shi-Ying; Long, Yun-Ze

    In this work, we propose an in situ precise electrospinning of medical glue fibers onto dural wound for improving sealing capability, avoiding tissue adhesion, and saving time in dural repair. N-octyl-2-cyanoacrylate, a commercial tissue adhesive (medical glue), can be electrospun into ultrathin fibrous film with precise and homogeneous deposition by a gas-assisted electrospinning device. The self-assembled N-octyl-2-cyanoacrylate film shows high compactness and flexibility owing to its fibrous structure. Simulation experiments on egg membranes and goat meninges demonstrated that this technology can repair small membrane defects quickly and efficiently. This method may have potential application in dural repair, for example, working as an effective supplementary technique for conventional dura suture.

  20. A summary of existing and planned experiment hardware for low-gravity fluids research

    NASA Technical Reports Server (NTRS)

    Hill, Myron E.; Omalley, Terence F.

    1991-01-01

    An overview is presented of (1) existing ground-based, low-gravity research facilities, with examples of hardware capabilities, and (2) existing and planned space-based research facilities, with examples of current and past flight hardware. Low-gravity, ground-based facilities, such as drop towers and aircraft, provide the experimenter with quick turnaround time, easy access to equipment, gravity levels ranging from 10^-2 to 10^-6 g, and low-gravity durations ranging from 2 to 30 sec. Currently, the only operational space-based facility is the Space Shuttle. The Shuttle's payload bay and middeck facilities are described. Existing and planned low-gravity fluids research facilities are also described with examples of experiments and hardware capabilities.

  1. X-40A on runway after Free Flight #2A

    NASA Image and Video Library

    2001-04-12

    Second free-flight of the X-40A at the NASA Dryden Flight Research Center, on Edwards AFB, Calif., was made on Apr. 12, 2001. The unpowered X-40A, an 85 percent scale risk reduction version of the proposed X-37, is proving the capability of an autonomous flight control and landing system in a series of glide flights at Edwards. The April 12 flight introduced complex vehicle maneuvers during the landing sequence. The X-40A was released from an Army Chinook helicopter flying 15,050 feet overhead. Ultimately, the unpiloted X-37 is intended as an orbital testbed and technology demonstrator, capable of landing like an airplane and being quickly serviced for a follow-up mission.

  2. Analytic programming with FMRI data: a quick-start guide for statisticians using R.

    PubMed

    Eloyan, Ani; Li, Shanshan; Muschelli, John; Pekar, Jim J; Mostofsky, Stewart H; Caffo, Brian S

    2014-01-01

    Functional magnetic resonance imaging (fMRI) is a thriving field that plays an important role in medical imaging analysis, biological and neuroscience research and practice. This manuscript gives a didactic introduction to the statistical analysis of fMRI data using the R project, along with the relevant R code. The goal is to give statisticians who would like to pursue research in this area a quick tutorial for programming with fMRI data. References of relevant packages and papers are provided for those interested in more advanced analysis.

  3. Pathway Activity Profiling (PAPi): from the metabolite profile to the metabolic pathway activity.

    PubMed

    Aggio, Raphael B M; Ruggiero, Katya; Villas-Bôas, Silas Granato

    2010-12-01

    Metabolomics is one of the most recent omics-technologies and uses robust analytical techniques to screen low molecular mass metabolites in biological samples. It has evolved very quickly during the last decade. However, metabolomics datasets are considered highly complex when used to relate metabolite levels to metabolic pathway activity. Despite recent developments in bioinformatics, which have improved the quality of metabolomics data, there is still no straightforward method capable of correlating metabolite level to the activity of different metabolic pathways operating within the cells. Thus, this kind of analysis still depends on extremely laborious and time-consuming processes. Here, we present a new algorithm, Pathway Activity Profiling (PAPi), with which we are able to compare metabolic pathway activities from metabolite profiles. The applicability and potential of PAPi was demonstrated using previously published data from the yeast Saccharomyces cerevisiae. PAPi was able to support the biological interpretations of the previously published observations and, in addition, generated new hypotheses in a straightforward manner. However, the PAPi algorithm is time-consuming to perform manually. Thus, we also present here a new R software package (PAPi) which implements the PAPi algorithm and facilitates its usage to quickly compare metabolic pathway activities between different experimental conditions. Using the identified metabolites and their respective abundances as input, the PAPi package calculates pathway Activity Scores, which represent the potential metabolic pathway activities and allow their comparison between conditions. PAPi also performs principal components analysis and analysis of variance or t-tests to investigate differences in activity level between experimental conditions. In addition, PAPi generates comparative graphs highlighting up- and down-regulated pathway activity.
    These datasets are available at http://www.4shared.com/file/hTWyndYU/extra.html and http://www.4shared.com/file/VbQIIDeu/intra.html. The PAPi package is available at http://www.4shared.com/file/s0uIYWIg/PAPi_10.html. Contact: s.villas-boas@auckland.ac.nz. Supplementary data are available at Bioinformatics online.
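
    As a rough illustration of the idea behind an activity score (not the published PAPi algorithm, which uses KEGG pathway information and a more elaborate weighting), one can aggregate the abundances of identified metabolites per pathway:

```python
from statistics import mean

def activity_scores(abundances, pathway_map):
    """Toy pathway 'activity score': mean abundance of the identified
    metabolites assigned to each pathway. A deliberate simplification for
    illustration only; inputs are {metabolite: abundance} and
    {pathway: [metabolites]} dictionaries (assumed shapes)."""
    scores = {}
    for pathway, metabolites in pathway_map.items():
        found = [abundances[m] for m in metabolites if m in abundances]
        if found:  # pathways with no identified metabolite get no score
            scores[pathway] = mean(found)
    return scores
```

    Scores computed per experimental condition could then be compared with a t-test, mirroring the package's comparison of Activity Scores between conditions.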

  4. The Difference Between Countermovement and Squat Jump Performances: A Review of Underlying Mechanisms With Practical Applications.

    PubMed

    Van Hooren, Bas; Zolotarjova, Julia

    2017-07-01

    Van Hooren, B and Zolotarjova, J. The difference between countermovement and squat jump performances: a review of underlying mechanisms with practical applications. J Strength Cond Res 31(7): 2011-2020, 2017-Two movements that are widely used to monitor athletic performance are the countermovement jump (CMJ) and squat jump (SJ). Countermovement jump performance is almost always better than SJ performance, and the difference in performance is thought to reflect an effective utilization of the stretch-shortening cycle. However, the mechanisms responsible for the performance-enhancing effect of the stretch-shortening cycle are frequently undefined. Uncovering and understanding these mechanisms is essential to make an inference regarding the difference between the jumps. Therefore, we will review the potential mechanisms that explain the better performance in a CMJ as compared with a SJ. It is concluded that the difference in performance may primarily be related to the greater uptake of muscle slack and the buildup of stimulation during the countermovement in a CMJ. Elastic energy may also have a small contribution to an enhanced CMJ performance. Therefore, a larger difference between the jumps is not necessarily a better indicator of high-intensity sports performance. Although a larger difference may reflect the utilization of elastic energy in a small-amplitude CMJ as a result of a well-developed capability to co-activate muscles and quickly build up stimulation, a larger difference may also reflect a poor capability to reduce the degree of muscle slack and build up stimulation in the SJ. Because the capability to reduce the degree of muscle slack and quickly build up stimulation in the SJ may be especially important to high-intensity sports performance, training protocols might concentrate on attaining a smaller difference between the jumps.

  5. Brahman

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, D. B.

    2015-01-30

    The Adversary & Interdiction Methods (AIM) program provides training and capability assessment services to government agencies around the country. Interdisciplinary teams equipped with gear and radioactive sources are repeatedly fielded to offsite events to collaborate with law enforcement agencies at all levels of government. AIM has grown rapidly over the past three years. A knowledge management system has evolved along with the program, but it has failed to keep pace. A new system is needed. The new system must comply with the cybersecurity and information technology solutions already in place at an institutional level. The offsite nature of AIM activities must also be accommodated. Cost and schedule preclude the commissioning of new software and the procurement of expensive hardware. The new system must exploit in-house capabilities and be established quickly. A novel system is proposed. This solution centers on a recently introduced institutional file sharing capability called Syncplicity. AIM-authored software will be combined with a dedicated institutional account to vastly extend the capability of this resource. The new knowledge management system will reduce error and increase efficiency through automation and be accessible offsite via mobile devices.

  6. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    PubMed

    Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E

    2016-01-01

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR, and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise and ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by and validate previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  7. Site characterization and analysis penetrometer system

    NASA Astrophysics Data System (ADS)

    Heath, Jeff

    1995-04-01

    The site characterization and analysis penetrometer system (SCAPS) with laser-induced fluorescence (LIF) sensors is being demonstrated as a quick field screening technique to determine the physical and chemical characteristics of subsurface soil and contaminants at hazardous waste sites. SCAPS is a collaborative development effort of the Navy, Army, and Air Force under the Tri-Service SCAPS Program. The current SCAPS configuration is designed to quickly and cost-effectively distinguish areas contaminated with petroleum products (hydrocarbons) from unaffected areas.

  8. Relationships between eating quickly and weight gain in Japanese university students: a longitudinal study.

    PubMed

    Yamane, Mayu; Ekuni, Daisuke; Mizutani, Shinsuke; Kataoka, Kota; Sakumoto-Kataoka, Masami; Kawabata, Yuya; Omori, Chie; Azuma, Tetsuji; Tomofuji, Takaaki; Iwasaki, Yoshiaki; Morita, Manabu

    2014-10-01

    Many cross-sectional studies have reported a relationship between overweight/obesity and eating quickly, but there have been few longitudinal studies to address this relationship in younger populations. The purpose of this prospective longitudinal study was to investigate whether eating quickly was related to being overweight in Japanese university students. Of 1,396 students who underwent a general examination and completed questionnaires at the start of university and before graduation, 1,314 students (676 male and 638 female) of normal body composition [body mass index (BMI) < 25 kg/m^2] at baseline were included in the analysis. The questionnaires included speed of eating and other lifestyle factors. After a 3-year follow-up, the students whose BMIs were ≥ 25 kg/m^2 were defined as overweight. In this study, 38 participants (2.9%) became overweight. In the logistic regression analysis, the risk of being overweight was increased in males [adjusted odds ratio (OR): 2.77; 95% confidence interval (CI): 1.33-5.79; P < 0.01] and in those who ate quickly at baseline (OR: 4.40; 95% CI: 2.22-8.75; P < 0.001). Eating quickly may predict risk of being overweight in Japanese university students. Copyright © 2014 The Obesity Society.
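
    For readers unfamiliar with the reported statistics, a crude (unadjusted) odds ratio with a Wald 95% CI can be computed from a 2x2 table as below. The study's ORs are adjusted via logistic regression, and the cell counts in the usage example are hypothetical:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    (Simpler than the study's adjusted logistic-regression ORs.)"""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi
```

    For example, with hypothetical counts `odds_ratio_ci(20, 180, 10, 390)` gives an OR of about 4.33 with a CI well above 1, i.e. a statistically significant association at the 5% level.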

  9. Strategies for responding to RAC requests electronically.

    PubMed

    Schramm, Michael

    2012-04-01

    Providers that would like to respond to complex RAC reviews electronically should consider three strategies: Invest in an EHR software package or a high-powered scanner that can quickly scan large amounts of paper. Implement an audit software platform that will allow providers to manage the entire audit process in one place. Use a CONNECT-compatible gateway capable of accessing the Nationwide Health Information Network (the network on which the electronic submission of medical documentation program runs).

  10. Video instrumentation for radionuclide angiocardiography.

    NASA Technical Reports Server (NTRS)

    Kriss, J. P.

    1973-01-01

    Two types of videoscintiscopes for performing radioisotopic angiocardiography with a scintillation camera are described, and use of these instruments in performing clinical studies is illustrated. Radionuclide angiocardiography is a simple, quick and accurate procedure recommended as a screening test for patients with a variety of congenital and acquired cardiovascular lesions. When performed in conjunction with coronary arterial catheterization, dynamic radionuclide angiography may provide useful information about regional myocardial perfusion. Quantitative capabilities greatly enhance the potential of this diagnostic tool.

  11. Testing and numerical modeling of hypervelocity impact damaged Space Station multilayer insulation

    NASA Technical Reports Server (NTRS)

    Rule, William K.

    1992-01-01

    Results are presented of experiments measuring the degradation of the insulating capabilities of the multilayer insulation (MLI) of the Space Station Freedom, when subjected to hypervelocity impact damage. A simple numerical model was developed for use in an engineering design environment for quick assessment of thermal effect of the impact. The model was validated using results from thermal vacuum tests on MLI with simulated damage. The numerical model results agreed with experimental data.

  12. Using location tracking data to assess efficiency in established clinical workflows.

    PubMed

    Meyer, Mark; Fairbrother, Pamela; Egan, Marie; Chueh, Henry; Sandberg, Warren S

    2006-01-01

    Location tracking systems are becoming more prevalent in clinical settings, yet applications are still not common. We have designed a system to aid in the assessment of clinical workflow efficiency. Location data are captured from active RFID tags and processed into usable form. These data are stored and presented visually with trending capability over time. The system allows quick assessments of the impact of process changes on workflow, and isolates areas for improvement.
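
    A minimal sketch of turning tag reads into dwell-time summaries, the basic quantity behind such workflow assessments (data shapes are assumed; a deployed system must also handle missed reads and antenna hand-off):

```python
def dwell_times(events):
    """Total seconds spent at each location, from a time-ordered list of
    (timestamp_seconds, location) tag reads for one tag. Each interval is
    attributed to the location where it began."""
    totals = {}
    for (t0, loc), (t1, _) in zip(events, events[1:]):
        totals[loc] = totals.get(loc, 0) + (t1 - t0)
    return totals
```

    Aggregating these per-tag totals over many patients, and comparing the distributions before and after a process change, is the kind of quick impact assessment the system enables.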

  13. Vision for Future Buildings

    ScienceCinema

    None

    2018-01-16

    In the last 10 years our lives have changed so quickly, and drastically, that it's hard to imagine what the distant future might bring. Because buildings are large, long-term investments, the building sector has been slower to change. It can take more than 100 years for our cities to be renovated or rebuilt using updated methods and technologies. But to get there, we must start to conceptualize what the functions and capabilities of these future buildings could be, today.

  14. Vision for Future Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2016-05-19

    In the last 10 years our lives have changed so quickly, and drastically, that it's hard to imagine what the distant future might bring. Because buildings are large, long-term investments, the building sector has been slower to change. It can take more than 100 years for our cities to be renovated or rebuilt using updated methods and technologies. But to get there, we must start to conceptualize what the functions and capabilities of these future buildings could be, today.

  15. Liquid Methane Conditioning Capabilities Developed at the NASA Glenn Research Center's Small Multi- Purpose Research Facility (SMiRF) for Accelerated Lunar Surface Storage Thermal Testing

    NASA Technical Reports Server (NTRS)

    Bamberger, Helmut H.; Robinson, R. Craig; Jurns, John M.; Grasl, Steven J.

    2011-01-01

    Glenn Research Center's Creek Road Cryogenic Complex, Small Multi-Purpose Research Facility (SMiRF), recently completed validation/checkout testing of a new liquid methane delivery system and liquid methane (LCH4) conditioning system. Facility checkout validation was conducted in preparation for a series of passive thermal control technology tests planned at SMiRF in FY10 using a flight-like propellant tank at simulated thermal environments from 140 to 350 K. These tests will validate models and provide high-quality data to support consideration of the LCH4/LO2 propellant combination option for a lunar or planetary ascent stage. An infrastructure has been put in place which will support testing of large amounts of liquid methane at SMiRF. Extensive modifications were made to the test facility's existing liquid hydrogen system for compatibility with liquid methane. Also, a new liquid methane fluid conditioning system will enable liquid methane to be quickly densified (sub-cooled below the normal boiling point) and quickly reheated to saturation conditions between 92 and 140 K. Fluid temperatures can be quickly adjusted to compress the overall test duration. A detailed trade study was conducted to determine an appropriate technique for liquid conditioning with regard to the SMiRF facility's existing infrastructure. In addition, a completely new roadable dewar has been procured for transportation and temporary storage of liquid methane. A new spherical, flight-representative tank has also been fabricated for integration into the vacuum chamber at SMiRF. The addition of this system to SMiRF marks the first time a large-scale liquid methane propellant test capability has been realized at Glenn. This work supports the Cryogenic Fluid Management Project being conducted under the auspices of the Exploration Technology Development Program, providing focused cryogenic fluid management technology efforts to support NASA's future robotic or human exploration missions.

  16. Modular System to Enable Extravehicular Activity

    NASA Technical Reports Server (NTRS)

    Sargusingh, Miriam J.

    2012-01-01

    The ability to perform extravehicular activity (EVA), both human and robotic, has been identified as a key component of space missions, supporting such operations as assembly and maintenance of space systems (e.g., construction and maintenance of the International Space Station) and unscheduled activities to repair an element of the transportation and habitation systems that can only be accessed externally and via unpressurized areas. In order to make human transportation beyond low Earth orbit (LEO) practical, efficiencies must be incorporated into the integrated transportation systems to reduce system mass and operational complexity. Affordability is also a key aspect to be considered in space system development; this could be achieved through commonality, modularity, and component reuse. Another key aspect identified for the EVA system was the ability to produce flightworthy hardware quickly to support early missions and near-Earth technology demonstrations. This paper details a conceptual architecture for a modular EVA system that would meet these stated needs for EVA capability that is affordable and could be produced relatively quickly. Operational concepts were developed to elaborate on the defined needs and to define the key capabilities, operational and design constraints, and general timelines. The operational concept led to a high-level design concept for a module that interfaces with various space transportation elements and contains the hardware and systems required to support human and telerobotic EVA; the module would not be self-propelled and would rely on an interfacing element for consumable resources. The conceptual architecture was then compared to the EVA systems used on the Space Shuttle Orbiter and the International Space Station to develop high-level design concepts that incorporate opportunities for cost savings through hardware reuse and quick production through the use of existing technologies and hardware designs.
An upgrade option was included to make use of the developing suit port technologies.

  17. Increased energy efficiency of a steel foundry plant by using a cleaner production quick-E-scan methodology

    NASA Astrophysics Data System (ADS)

    Rasmeni, Zelda; Pan, Xiaowei

    2017-07-01

    The Quick-E-Scan methodology is a simple and quick method used to achieve operational energy efficiency, as opposed to detailed energy audits; it therefore offers a no-cost or low-cost solution for energy management programs with a limited budget. The Quick-E-Scan methodology was used to assess a steel foundry plant based in Benoni by dividing the foundry into production sections. The assessment entailed a review of the current processes and patterns of energy usage within the plant, and a detailed analysis of the options available for improvement and of the profitable areas in which energy-saving measures may be implemented for increased energy efficiency, which can be presented to the company's management.

  18. Remote Sensing of Arctic Environmental Conditions and Critical Infrastructure using Infra-Red (IR) Cameras and Unmanned Air Vehicles (UAVs)

    NASA Astrophysics Data System (ADS)

    Hatfield, M. C.; Webley, P.; Saiet, E., II

    2014-12-01

    Numerous scientific and logistical applications exist in Alaska and other arctic regions requiring analysis of expansive, remote areas in the near infrared (NIR) and thermal infrared (TIR) bands. These include characterization of wildland fire plumes and volcanic ejecta, detailed mapping of lava flows, and inspection of lengthy segments of critical infrastructure, such as the Alaska pipeline and railroad system. Obtaining timely, repeatable, calibrated measurements of these extensive features and infrastructure networks requires localized, taskable assets such as UAVs. The Alaska Center for Unmanned Aircraft Systems Integration (ACUASI) provides practical solutions to these problem sets by pairing various IR sensors with a combination of fixed-wing and multi-rotor air vehicles. Fixed-wing assets, such as the Insitu ScanEagle, offer long reach and extended duration capabilities to quickly access remote locations and provide enduring surveillance of the target of interest. Rotary-wing assets, such as the Aeryon Scout or the ACUASI-built Ptarmigan hexcopter, provide a precision capability for detailed horizontal mapping or vertical stratification of atmospheric phenomena. When combined with other ground capabilities, we will show how they can assist in decision support and hazard assessment, as well as give those in emergency management a new ability to increase knowledge of the event at hand while reducing the risk to all involved. Here, in this presentation, we illustrate how UAVs can provide the ideal tool to map and analyze hazardous events and critical infrastructure under extreme environmental conditions.

  19. NeQuick 2 and IRI Plas VTEC predictions for low latitude and South American sector

    NASA Astrophysics Data System (ADS)

    Ezquer, R. G.; Scidá, L. A.; Migoya Orué, Y.; Nava, B.; Cabrera, M. A.; Brunini, C.

    2018-04-01

    Using vertical total electron content (VTEC) measurements obtained from GPS satellite signals, the capability of the NeQuick 2 and IRI Plas models to predict VTEC over the low-latitude and South American sector is analyzed. In the present work both models were used to calculate VTEC up to the height of the GPS satellites, and comparisons between the performance of IRI Plas and IRI 2007 were also made. The data correspond to the June solstice and September equinox of 1999 (high solar activity) and were obtained at nine stations. The latitude range considered extends from 18.4°N to 64.7°S, and the longitude ranges from 281.3°E to 295.9°E in the South American sector. The greatest discrepancies between model predictions and measured VTEC are obtained at low-latitude stations placed in the equatorial anomaly region. Underestimations as strong as 40 TECU [1 TECU = 10^16 m^-2] can be observed at the BOGT station for the September equinox when the NeQuick 2 model is used. The results also show that: (a) for the June solstice, the performance of IRI Plas at low-latitude stations is generally better than that of NeQuick 2 and, vice versa, at the highest latitudes the performance of NeQuick 2 is better than that of IRI Plas; for the stations TUCU and SANT both models perform well; (b) for the September equinox the performances of the models do not follow a clearly defined pattern as in the other season. However, for the region between the northern peak and the valley of the equatorial anomaly, the performance of IRI Plas is generally better than that of NeQuick 2 at the hours of maximum ionization. From TUCU to the south, the best TEC predictions are given by NeQuick 2. The source of the observed deviations has been explored in terms of the CCIR foF2 determination at the available ionosonde stations in the region. Discrepancies can also be related to an unrealistic shape of the vertical electron density profile and/or an erroneous prediction of the plasmaspheric contribution to the VTEC. Moreover, the results suggest that in the case of NeQuick the underestimation trend could be due to the lack of a proper plasmaspheric model in its topside representation; in contrast, the plasmaspheric model included in IRI Plas leads to clear overestimations of GPS-derived TEC.
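A minimal sketch of the model-vs-GPS residual statistics in TECU discussed above, using hypothetical VTEC values (not the paper's data):

```python
import numpy as np

# Hypothetical hourly VTEC values in TECU (1 TECU = 1e16 electrons m^-2):
# GPS-derived measurements vs. a model prediction (illustrative numbers,
# not the paper's data).
vtec_gps   = np.array([12.0, 25.0, 48.0, 61.0, 40.0, 18.0])
vtec_model = np.array([10.5, 21.0, 30.0, 35.0, 32.0, 16.0])

residual = vtec_model - vtec_gps          # negative => model underestimates
bias = residual.mean()
rmse = np.sqrt(np.mean(residual ** 2))

print(f"bias = {bias:.1f} TECU, rmse = {rmse:.1f} TECU")
print(f"worst underestimation = {residual.min():.1f} TECU")
```

The largest negative residuals here play the role of the 40 TECU underestimations reported at BOGT.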

  20. Air Force Reusable Booster System A Quick-look, Design Focused Modeling and Cost Analysis Study

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar

    2011-01-01

    Presents work supporting the Air Force Reusable Booster System (RBS): a cost study with the following goals: support US launch-system decision makers, especially regarding the research, technology and demonstration investments required for reusable systems to succeed; encourage operable directions in reusable booster / launch vehicle technology choices, system design, and product and process developments; and perform a quick-look cost study while developing a cost model for more refined future analysis.

  1. Quick returns and night work as predictors of sleep quality, fatigue, work-family balance and satisfaction with work hours.

    PubMed

    Dahlgren, Anna; Tucker, Philip; Gustavsson, Petter; Rudman, Ann

    2016-01-01

    Quick returns (intervals of <11 h between the end of one shift and the start of the next) are associated with short sleeps and fatigue on the subsequent shift. Recent evidence suggests that shift workers regard quick returns as being more problematic than night work. The current study explored quick returns and night work in terms of their impact on sleep, unwinding, recovery, exhaustion, satisfaction with work hours and work-family interference. Data from the 2006 cohort of Swedish nursing students within the national Longitudinal Analysis of Nursing Education (LANE) study were analysed (N = 1459). Respondents completed a questionnaire prior to graduation (response rate 69.2%) and 3 years after graduation (65.9%). The analyses examined associations between the frequency of quick returns and night work and measures taken in year three, while adjusting for confounding factors (in year three and prior to graduation). Frequency of quick returns was a significant predictor of poor sleep quality, short sleeps, unwinding, exhaustion, satisfaction with work hours and work-to-family interference, with higher frequency predicting more negative outcomes. Quick returns did not predict recovery after rest days. Frequency of night work did not predict any of the outcomes. In conclusion, quick returns were an important determinant of sleep, recovery and wellbeing, whereas night work did not show such an association.
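The quick-return definition used here (a rest interval of under 11 h between consecutive shifts) is straightforward to operationalize. A sketch with a hypothetical shift log:

```python
from datetime import datetime

# Hypothetical shift log: (start, end) pairs, in chronological order.
shifts = [
    (datetime(2016, 1, 4,  7), datetime(2016, 1, 4, 16)),  # day shift
    (datetime(2016, 1, 4, 21), datetime(2016, 1, 5,  7)),  # night shift after 5 h rest
    (datetime(2016, 1, 5, 14), datetime(2016, 1, 5, 22)),  # evening shift after 7 h rest
    (datetime(2016, 1, 7,  7), datetime(2016, 1, 7, 16)),  # day shift after 33 h rest
]

def count_quick_returns(shifts, threshold_h=11.0):
    """Count rest intervals shorter than threshold_h hours."""
    rests = [(b_start - a_end).total_seconds() / 3600.0
             for (_, a_end), (b_start, _) in zip(shifts, shifts[1:])]
    return sum(r < threshold_h for r in rests), rests

n_quick, rests = count_quick_returns(shifts)
print(n_quick, [round(r) for r in rests])  # two of the three rests are quick returns
```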

  2. Language Simulations: The Blending Space for Writing and Critical Thinking

    ERIC Educational Resources Information Center

    Kovalik, Doina L.; Kovalik, Ludovic M.

    2007-01-01

    This article describes a language simulation involving six distinct phases: an in-class quick response, a card game, individual research, a classroom debate, a debriefing session, and an argumentative essay. An analysis of student artifacts--quick-response writings and final essays, respectively, both addressing the definition of liberty in a…

  3. Towards a numerical run-out model for quick-clay slides

    NASA Astrophysics Data System (ADS)

    Issler, Dieter; L'Heureux, Jean-Sébastien; Cepeda, José M.; Luna, Byron Quan; Gebreslassie, Tesfahunegn A.

    2015-04-01

    Highly sensitive glacio-marine clays occur in many relatively low-lying areas near the coasts of eastern Canada, Scandinavia and northern Russia. If the load exceeds the yield stress of these clays, they quickly liquefy, with a reduction of the yield strength and the viscosity by several orders of magnitude. Leaching, fluvial erosion, earthquakes and man-made overloads, by themselves or combined, are the most frequent triggers of quick-clay slides, which are hard to predict and can attain catastrophic dimensions. The present contribution reports on two preparatory studies conducted with a view to creating a run-out model tailored to the characteristics of quick-clay slides. One study analyzed the connections between the morphological and geotechnical properties of more than 30 well-documented Norwegian quick-clay slides and their run-out behavior. The laboratory experiments by Locat and Demers (1988) suggest that the behavior of quick clays can be reasonably described by universal relations involving the liquidity index, plasticity index, remolding energy, salinity and sensitivity. However, these tests should be repeated with Norwegian clays and analyzed in terms of a (shear-thinning) Herschel-Bulkley fluid rather than a Bingham fluid, because the shear stress appears to grow sub-linearly with the shear rate. Further study is required to understand the discrepancy between the material parameters obtained in laboratory tests of material from observed slides and in back-calculations of the same slides with the simple model by Edgers & Karlsrud (1982). The second study assessed the capability of existing numerical flow models to capture the most important aspects of quick-clay slides by back-calculating three different, well-documented events in Norway: Rissa (1978), Finneidfjord (1996) and Byneset (2012). The numerical codes were (i) BING, a quasi-two-dimensional visco-plastic model, (ii) DAN3D (2009 version), and (iii) MassMov2D; the latter two are quasi-three-dimensional codes with a choice of bed-friction laws. The findings of the simulations point strongly towards the need for a different modeling approach that incorporates the essential physical features of quick-clay slides. The major requirement is a realistic description of remolding. A two-layer model is needed to describe the non-sensitive topsoil that often is passively advected by the slide. In many cases the topography is rather complex, so that 3D or quasi-3D (depth-averaged) models are required for realistic modeling of flow heights and velocities. Finally, since many Norwegian quick-clay slides run out in a fjord (and may generate a tsunami), it is also desirable to explicitly account for buoyancy and hydrodynamic drag.
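The Bingham vs. shear-thinning Herschel-Bulkley distinction above comes down to how shear stress grows with shear rate: tau = tau_y + mu*gamma_dot (linear above yield) versus tau = tau_y + K*gamma_dot^n with n < 1 (sub-linear). A sketch with illustrative parameter values, not fitted to any quick-clay data:

```python
import numpy as np

# Illustrative rheology parameters (not fitted to quick-clay measurements).
tau_y = 50.0         # yield stress [Pa]
mu    = 2.0          # Bingham plastic viscosity [Pa s]
K, n  = 10.0, 0.4    # Herschel-Bulkley consistency [Pa s^n] and exponent

gamma = np.array([1.0, 10.0, 100.0])   # shear rates [1/s]
tau_bingham = tau_y + mu * gamma       # linear growth above yield
tau_hb      = tau_y + K * gamma ** n   # sub-linear (shear-thinning) growth

print(tau_bingham)
print(np.round(tau_hb, 1))
```

At high shear rates the Herschel-Bulkley stress falls far below the Bingham one, which is why fitting a Bingham law to sub-linear data distorts the inferred material parameters.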

  4. LittleQuickWarp: an ultrafast image warping tool.

    PubMed

    Qu, Lei; Peng, Hanchuan

    2015-02-01

    Warping images into a standard coordinate space is critical for many image-computing tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain-mapping projects, it is desirable to have high-performance image warping tools that are compatible with common image analysis pipelines. In this article we present LittleQuickWarp, a swift, memory-efficient tool that boosts 3D image warping performance dramatically while achieving warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp improves warping speed 2-5 times and reduces memory consumption 6-20 times. We have implemented LittleQuickWarp as an open-source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.
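For reference, the TPS baseline that LittleQuickWarp is compared against can be written compactly. A minimal 2-D sketch (kernel U(r) = r^2 log r, solved via the standard bordered linear system); this illustrates the TPS itself, not LittleQuickWarp's own algorithm:

```python
import numpy as np

def tps_warp(src, dst, pts):
    """Map `pts` with a 2-D thin-plate spline fitted from `src` -> `dst`
    control points. Textbook formulation, no performance tricks."""
    def U(r2):
        # r^2 * log(r) = 0.5 * r^2 * log(r^2), with U(0) = 0
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(r2 > 0, 0.5 * r2 * np.log(r2), 0.0)

    n = len(src)
    d2 = ((src[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    K = U(d2)
    P = np.hstack([np.ones((n, 1)), src])
    # Bordered system [[K P], [P^T 0]] [w; a] = [dst; 0]
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    params = np.linalg.solve(A, np.vstack([dst, np.zeros((3, 2))]))

    d2p = ((pts[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    return U(d2p) @ params[:n] + np.hstack([np.ones((len(pts), 1)), pts]) @ params[n:]

src = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
dst = src + np.array([0.5, 0.0])     # pure translation of the controls
print(tps_warp(src, dst, np.array([[0.5, 0.5]])))
```

The dense kernel solve and evaluation are what make TPS expensive at scale, which motivates faster approximations.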

  5. Fast steering and quick positioning of large field-of-regard, two-axis, four-gimbaled sight

    NASA Astrophysics Data System (ADS)

    Ansari, Zahir Ahmed; Nigam, Madhav Ji; Kumar, Avnish

    2017-07-01

    Fast steering and quick positioning are prime requirements of current electro-optical tracking systems to achieve quick target acquisition. A scheme has been proposed for realizing these features using a two-axis, four-gimbaled sight. For steering the line of sight in the stabilization mode, the outer gimbal is slaved to the gyro-stabilized inner gimbal. Typically, the inner gimbals have direct drives and the outer gimbals have geared drives, which results in a mismatch in the acceleration capability of their servo loops. This limits the allowable control bandwidth for the inner gimbal; however, to achieve high stabilization accuracy, high-bandwidth control loops are essential. This contradictory requirement has been addressed by designing a suitable command conditioning module for the inner gimbals. Also, large line-of-sight freedom in the pitch axis is required to provide a wide-area surveillance capability for airborne applications. This leads to a loss of freedom along the yaw axis as the pitch angle goes beyond about 70 deg, which is addressed by making the outer gimbal the master beyond a certain pitch angle. Moreover, a mounting scheme for the gyro has been proposed to accomplish yaw-axis stabilization over 110 deg of pitch-angle movement with a single two-axis gyro.
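The command-conditioning idea (shaping the command so the lower-acceleration geared outer gimbal can follow the direct-drive inner gimbal) can be sketched as a simple acceleration-limited profile. This is a hypothetical illustration of the concept, not the paper's actual module:

```python
def condition_command(cmd, prev, prev_rate, dt, a_max):
    """Acceleration-limited command shaping: clamp the change in
    commanded rate so |acceleration| <= a_max (a hypothetical sketch)."""
    desired_rate = (cmd - prev) / dt
    max_step = a_max * dt
    rate = prev_rate + max(-max_step, min(max_step, desired_rate - prev_rate))
    return prev + rate * dt, rate

# Track a 10-deg step with a 5 deg/s^2 acceleration limit at 100 Hz.
pos, rate = 0.0, 0.0
for _ in range(100):
    pos, rate = condition_command(10.0, pos, rate, 0.01, 5.0)
print(round(pos, 2), round(rate, 2))
```

After one second the conditioned command has ramped the rate up to the 5 deg/s that the acceleration limit allows, rather than demanding an instantaneous step.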

  6. MESAFace, a graphical interface to analyze the MESA output

    NASA Astrophysics Data System (ADS)

    Giannotti, M.; Wise, M.; Mohammed, A.

    2013-04-01

    MESA (Modules for Experiments in Stellar Astrophysics) has become very popular among astrophysicists as a powerful and reliable code to simulate stellar evolution. Analyzing the output data thoroughly may, however, present some challenges and be rather time-consuming. Here we describe MESAFace, a graphical and dynamical interface which provides an intuitive, efficient and quick way to analyze the MESA output. Catalogue identifier: AEOQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOQ_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 19165 No. of bytes in distributed program, including test data, etc.: 6300592 Distribution format: tar.gz Programming language: Mathematica. Computer: Any computer capable of running Mathematica. Operating system: Any capable of running Mathematica. Tested on Linux, Mac, Windows XP, Windows 7. RAM: Recommended 2 Gigabytes or more. Supplementary material: Additional test data files are available. Classification: 1.7, 14. Nature of problem: Find a way to quickly and thoroughly analyze the output of a MESA run, including all the profiles, and have an efficient method to produce graphical representations of the data. Solution method: We created two scripts (to be run consecutively). The first one downloads all the data from a MESA run and organizes the profiles in order of age. All the files are saved as tables or arrays of tables which can then be accessed very quickly by Mathematica. The second script uses the Manipulate function to create a graphical interface which allows the user to choose what to plot from a set of menus and buttons. The information shown is updated in real time. The user can access very quickly all the data from the run under examination and visualize it with plots and tables. 
    Unusual features: Moving the sliders in certain regions may cause an error message. This happens when Mathematica is asked to read nonexistent data. The error message, however, disappears when the sliders are moved back. This issue does not affect the proper functioning of the interface. Additional comments: The program uses the dynamical capabilities of Mathematica. When the program is opened, Mathematica prompts the user to "Enable Dynamics"; it is necessary to accept before proceeding. Running time: Depends on the size of the data downloaded, on where the data are stored (hard drive or web), and on the speed of the computer or network connection. In general, downloading the data may take from a minute to several minutes; loading directly from the web is slower. For example, downloading a 200 MB data folder (a total of 102 files) with a dual-core Intel P8700 laptop (2 GB of RAM, 2.53 GHz) took about a minute from the hard drive and about 23 min from the web (with a basic home wireless connection).

  7. Autonomy enables new science missions

    NASA Astrophysics Data System (ADS)

    Doyle, Richard J.; Gor, Victoria; Man, Guy K.; Stolorz, Paul E.; Chapman, Clark; Merline, William J.; Stern, Alan

    1997-01-01

    The challenge of space flight in NASA's future is to enable smaller, more frequent and intensive space exploration at much lower total cost without substantially decreasing mission reliability, capability, or the scientific return on investment. The most effective way to achieve this goal is to build intelligent capabilities into the spacecraft themselves. Our technological vision for meeting the challenge of returning quality science through limited communication bandwidth will actually put scientists in a more direct link with the spacecraft than they have enjoyed to date. Technologies such as pattern recognition and machine learning can place a part of the scientist's awareness onboard the spacecraft to prioritize downlink or to autonomously trigger time-critical follow-up observations (particularly important in flyby missions) without ground interaction. Onboard knowledge discovery methods can be used to include candidate discoveries in each downlink for scientists' scrutiny. Such capabilities will allow scientists to quickly reprioritize missions in a much more intimate and efficient manner than is possible today. Ultimately, new classes of exploration missions will be enabled.

  8. A demonstration of motion base design alternatives for the National Advanced Driving Simulator

    NASA Technical Reports Server (NTRS)

    Mccauley, Michael E.; Sharkey, Thomas J.; Sinacori, John B.; Laforce, Soren; Miller, James C.; Cook, Anthony

    1992-01-01

    A demonstration of the capability of NASA's Vertical Motion Simulator (VMS) to simulate two alternative motion base designs for the National Advanced Driving Simulator (NADS) is reported. The VMS is located at ARC. The motion base conditions used in this demonstration were as follows: (1) a large translational motion base; and (2) a motion base design with limited translational capability. The latter had translational capability representative of a typical synergistic motion platform. These alternatives were selected to test the prediction that large-amplitude translational motion would result in a lower incidence or severity of simulator-induced sickness (SIS) than would a limited translational motion base. A total of 10 drivers performed two tasks, slaloms and quick stops, using each of the motion bases. Physiological, objective, and subjective measures were collected. No reliable differences in SIS between the motion base conditions were found in this demonstration. However, in light of the cost considerations and engineering challenges associated with implementing a large translational motion base, performance of a formal study is recommended.

  9. TADPLOT program, version 2.0: User's guide

    NASA Technical Reports Server (NTRS)

    Hammond, Dana P.

    1991-01-01

    The TADPLOT Program, Version 2.0 is described. The TADPLOT program is a software package coordinated by a single, easy-to-use interface, enabling the researcher to access several standard file formats, selectively collect specific subsets of data, and create full-featured publication- and viewgraph-quality plots. The user interface was designed to be independent of any file format, yet provide capabilities to accommodate highly specialized data queries. Integrated with an applications software network, data can be accessed, collected, and viewed quickly and easily. Since the commands are data independent, subsequent modifications to the file format will be transparent, while additional file formats can be integrated with minimal impact on the user interface. The graphical capabilities are independent of the method of data collection; thus, the data specification and subsequent plotting can be modified and upgraded as separate functional components. The graphics kernel selected adheres to the full functional specifications of the CORE standard. Both interface and postprocessing capabilities are fully integrated into TADPLOT.

  10. Full Mesh Audio Conferencing Using the Point-to-Multipoint On-Board Switching Capability of ACTS

    NASA Technical Reports Server (NTRS)

    Rivett, Mary L.; Sethna, Zubin H.

    1996-01-01

    The purpose of this paper is to describe an implementation of audio conferencing using the ACTS T1-VSAT network. In particular, this implementation evaluates the use of the on-board switching capability of the satellite as a viable alternative for providing the multipoint connectivity normally provided by terrestrial audio bridge equipment. The system that was implemented provides full-mesh, full-duplex audio conferencing, with end-to-end voice paths between all participants requiring only a single hop (i.e., 250 ms delay). Moreover, it addresses the lack of spontaneity in current systems by allowing a user to easily start a conference from any standard telephone handset connected to an ACTS earth station and quickly add new members to the conference at any time using the 'hook flash' capability. No prior scheduling of resources is required and there is no central point of control, thereby providing the user with the spontaneity desired in audio conference control.

  11. The EBI search engine: EBI search as a service-making biological data accessible for all.

    PubMed

    Park, Young M; Squizzato, Silvano; Buso, Nicola; Gur, Tamer; Lopez, Rodrigo

    2017-07-03

    We present an update of the EBI Search engine, an easy-to-use fast text search and indexing system with powerful data navigation and retrieval capabilities. The interconnectivity that exists between data resources at EMBL-EBI provides easy, quick and precise navigation and a better understanding of the relationship between different data types that include nucleotide and protein sequences, genes, gene products, proteins, protein domains, protein families, enzymes and macromolecular structures, as well as the life science literature. EBI Search provides a powerful RESTful API that enables its integration into third-party portals, thus providing 'Search as a Service' capabilities, which are the main topic of this article. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
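As a sketch of the 'Search as a Service' usage described above, a REST query URL can be assembled as below. The base path and parameter names are assumptions drawn from published examples and should be checked against the current EBI Search API documentation:

```python
from urllib.parse import urlencode

# Assumed base endpoint for the EBI Search RESTful API (verify against
# the official documentation before use).
BASE = "https://www.ebi.ac.uk/ebisearch/ws/rest"

def search_url(domain, query, fields=("id", "name"), size=10):
    """Build a hypothetical EBI Search query URL for a data domain."""
    params = urlencode({"query": query, "fields": ",".join(fields),
                        "size": size, "format": "json"})
    return f"{BASE}/{domain}?{params}"

print(search_url("uniprot", "kinase"))
```

A third-party portal would issue such requests over HTTP and render the JSON hit list, which is the integration pattern the article refers to.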

  12. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    NASA Technical Reports Server (NTRS)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  13. X-40A releasing from the strongback during Free Flight #2A. Both are attached by tether line to the CH-47

    NASA Image and Video Library

    2001-04-12

    Second free-flight of the X-40A at the NASA Dryden Flight Research Center, on Edwards AFB, Calif., was made on Apr. 12, 2001. The unpowered X-40A, an 85 percent scale risk reduction version of the proposed X-37, is proving the capability of an autonomous flight control and landing system in a series of glide flights at Edwards. The April 12 flight introduced complex vehicle maneuvers during the landing sequence. The X-40A was released from an Army Chinook helicopter flying 15,050 feet overhead. Ultimately, the unpiloted X-37 is intended as an orbital testbed and technology demonstrator, capable of landing like an airplane and being quickly serviced for a follow-up mission.

  14. [Identification of varieties of cashmere by Vis/NIR spectroscopy technology based on PCA-SVM].

    PubMed

    Wu, Gui-Fang; He, Yong

    2009-06-01

    A mixed algorithm combining principal component analysis (PCA) and a support vector machine (SVM) is presented to discriminate cashmere varieties. Cashmere fiber is threadlike, soft, glossy and of high tensile strength, and the quality characteristics and economic value of each breed of cashmere are very different. In order to safeguard consumers' rights and guarantee the quality of cashmere products, identifying cashmere quickly, efficiently and correctly is of great significance for the production and trading of cashmere material. The present research adopts Vis/NIR diffuse reflectance spectroscopy to collect the spectral data of cashmere. The near-infrared fingerprint of cashmere was extracted by PCA, and SVM methods were used to further identify the cashmere material. For the PCA, a score map was made from the scores of PC1, PC2 and PC3, and 10 principal components (PCs) were selected as the input of the SVM, these PCs together accounting for 99.99% of the variance. One hundred cashmere samples were used for calibration and the remaining 75 cashmere samples were used for validation. A one-against-all multi-class SVM model was built and the capabilities of SVMs with different kernel functions were comparatively analyzed; the SVM with a Gaussian kernel function showed the best identification capability, with an accuracy of 100%. This research indicates that the PCA-SVM data mining method has a good identification effect and can serve as a new method for rapid identification of cashmere material varieties.
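The PC-selection step (keep components until a target fraction of the variance is explained, here 99.99%, which gave 10 PCs) can be sketched with a plain SVD; the SVM classification stage is omitted from this sketch:

```python
import numpy as np

def pca_scores(X, var_keep=0.9999):
    """Project rows of X onto the leading principal components that
    together explain `var_keep` of the variance (mirroring the paper's
    99.99% criterion). Returns (scores, n_components)."""
    Xc = X - X.mean(axis=0)                     # mean-center each band
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    share = (s ** 2) / (s ** 2).sum()           # explained-variance ratios
    k = int(np.searchsorted(np.cumsum(share), var_keep) + 1)
    return U[:, :k] * s[:k], k

# Synthetic low-rank "spectra": 20 samples x 50 wavelengths, rank <= 5.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5)) @ rng.normal(size=(5, 50))
scores, k = pca_scores(X)
print(k, scores.shape)
```

The `scores` matrix is what would then be fed to a one-against-all SVM, as in the paper.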

  15. New technologies for advanced three-dimensional optimum shape design in aeronautics

    NASA Astrophysics Data System (ADS)

    Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno

    1999-05-01

    The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes into a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are gradient-based, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are making such an ambitious project, of including a state-of-the-art flow analysis code in an optimization loop, feasible. Among those technologies, there are three important issues that this paper addresses: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.
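The automated-differentiation idea (deriving a sensitivity code mechanically from the analysis code) can be illustrated in miniature with forward-mode dual numbers. The project itself used source-transformation tools, so this is only a conceptual sketch of the underlying principle:

```python
class Dual:
    """Minimal forward-mode automatic differentiation via dual numbers:
    each value carries its derivative, and arithmetic propagates both."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def d_dx(f, x):
    """Evaluate df/dx at x by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).dot

# d/dx (x^2 + 3x) at x = 2 is 2x + 3 = 7
print(d_dx(lambda x: x * x + 3 * x, 2.0))  # -> 7.0
```

Source transformation applies the same propagation rules to the program text itself, yielding a standalone differentiated code.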

  16. [Discriminant Analysis of Lavender Essential Oil by Attenuated Total Reflectance Infrared Spectroscopy].

    PubMed

    Tang, Jun; Wang, Qing; Tong, Hong; Liao, Xiang; Zhang, Zheng-fang

    2016-03-01

    This work aimed to use attenuated total reflectance Fourier transform infrared spectroscopy to identify lavender essential oil by establishing a variety and quality analysis model; 96 samples were tested. For all samples, the raw spectra were pretreated with a second derivative, and the 1750-900 cm(-1) region was selected for pattern recognition analysis on the basis of a variance calculation. The results showed that principal component analysis (PCA) can basically discriminate lavender oil cultivars, and the first three principal components mainly represent the ester, alcohol and terpenoid substances. When the orthogonal partial least-squares discriminant analysis (OPLS-DA) model was established, 68 samples were used for the calibration set. The determination coefficients of the OPLS-DA regression curves were 0.9592, 0.9764 and 0.9588, respectively, for the three varieties of lavender essential oil. The root mean square errors of prediction (RMSEP) in the validation set were 0.1429, 0.1273 and 0.1249, respectively. The discriminant rate of the calibration set and the prediction rate of the validation set both reached 100%. The model has very good recognition capability for the variety and quality of lavender essential oil, providing a quick, intuitive and feasible method to discriminate lavender oils.
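The figures of merit quoted above are standard. A sketch computing RMSEP and the determination coefficient from hypothetical predicted vs. reference values (not the paper's data):

```python
import numpy as np

# Hypothetical predicted vs. reference class values for one variety.
y_ref  = np.array([1.0, 1.0, 0.0, 0.0, 1.0, 0.0])
y_pred = np.array([0.9, 1.1, 0.1, -0.1, 0.8, 0.2])

# Root mean square error of prediction
rmsep = np.sqrt(np.mean((y_pred - y_ref) ** 2))
# Determination coefficient: 1 - SS_residual / SS_total
r2 = 1 - np.sum((y_pred - y_ref) ** 2) / np.sum((y_ref - y_ref.mean()) ** 2)
print(round(float(rmsep), 3), round(float(r2), 3))
```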

  17. Applications of Laser-Induced Breakdown Spectroscopy (LIBS) in Molten Metal Processing

    NASA Astrophysics Data System (ADS)

    Hudson, Shaymus W.; Craparo, Joseph; De Saro, Robert; Apelian, Diran

    2017-10-01

    In order for metals to meet the demand for critical applications in the automotive, aerospace, and defense industries, tight control over the composition and cleanliness of the metal must be achieved. The use of laser-induced breakdown spectroscopy (LIBS) for applications in metal processing has generated significant interest for its ability to perform quick analyses in situ. The fundamentals of LIBS, current techniques for deployment on molten metal, demonstrated capabilities, and possible avenues for development are reviewed and discussed.

  18. Instrumentation to Support the Research Program of Pulsed Laser Deposition of Polymer Nanocomposite Films

    DTIC Science & Technology

    2015-05-19

    ablated the targets in the same spots during the PLD process. The beams quickly created craters in these spots. That led to cracks and rapid (in... nanoparticles in the near-IR range taken with the newly acquired (with the support from the DoD Grant) UV-VIS-NIR Spectrophotometer Cary from Varian. As... reagent film has the capability of recovering from the exposure to very high ammonia concentrations without experiencing any irreversible damage. Based on

  19. A Survey of Data-Base Information Systems Relevant to Navy Requirements Planning

    DTIC Science & Technology

    1983-02-01

    [table residue: ship type designators, e.g. AK/T-AK, AKL/T-AKL, AKM multipurpose cargo ships, AKR vehicle cargo ships, AO oiler] ...the most demanding condition of operation for which a ship must be manned. (a) At sea in wartime. (b) Capable of performing all offensive... ship, and aircraft) researchers and others could quickly obtain basic information. 3. The Navy currently maintains a number of related

  20. Multi-Robot Systems in Military Domains (Les Systemes Multi-Robots Dans les Domaines Militaires)

    DTIC Science & Technology

    2008-12-01

    to allow him to react quickly to improve his personal safety, it is mandatory to shorten the current very long delay needed for the human operator to... [table residue: requirements list including hard real-time tasks, process monitoring, flexible communication medium, networking capabilities, safety] ...also be considered between high-level services and legacy systems. 4) This is one of the basic requirements for CoRoDe. 5) Safety: CRC, timeouts

  1. New design and new challenge for space large ultralightweight and stable Zerodur© mirror for future high resolution observation instruments

    NASA Astrophysics Data System (ADS)

    Devilliers, C.; Du Jeu, C.; Costes, V.; Suau, A.; Girault, N.; Cornillon, L.

    2017-11-01

    The pupil diameter of space telescopes increases continuously to reach higher resolutions, and the associated optical schemes become more sensitive. As a consequence, both the size of these telescopes and their stability requirements increase. The mass of space telescopes therefore becomes a strong design driver in order to remain compatible with the capabilities of price-competitive launchers. Moreover, satellite agility requirements are more and more severe, and instruments must be compatible with quick evolution of the thermal environment.

  2. User definition and mission requirements for unmanned airborne platforms, revised

    NASA Technical Reports Server (NTRS)

    Kuhner, M. B.; Mcdowell, J. R.

    1979-01-01

    The airborne measurement requirements of the scientific and applications experiment user community were assessed with respect to the suitability of proposed strawman airborne platforms. These platforms provide a spectrum of measurement capabilities supporting associated mission tradeoffs such as payload weight, operating altitude, range, duration, flight profile control, deployment flexibility, quick response, and recoverability. The results of the survey are used to examine whether the development of platforms is warranted and to determine platform system requirements as well as research and technology needs.

  3. Identification of two-phase flow regime based on electrical capacitance tomography and soft-sensing technique

    NASA Astrophysics Data System (ADS)

    Zhao, Ming-fu; Hu, Xin-Yu; Shao, Yun; Luo, Bin-bin; Wang, Xin

    2008-10-01

    This article analyses the football robots currently in common use in China, with the aim of improving the capability of the football robot's hardware platform. A football robot based on a DSP core controller, combined with a fuzzy-PID control algorithm, was designed. Experiments showed that, owing to the advantages of the DSP, such as fast operation, a variety of interfaces, and low power dissipation, the robot's movement performance, control precision, and real-time performance are greatly improved.
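
    A fuzzy-PID controller tunes the gains of an underlying PID loop with fuzzy rules. The fuzzy layer is omitted here; the following is a minimal sketch of the discrete PID core only, with illustrative gains and a toy first-order plant that are not taken from the paper.

```python
class PID:
    """Minimal discrete PID controller (the core that a fuzzy-PID
    scheme would re-tune on the fly). Gains here are illustrative."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order plant (y' = u - y) toward a setpoint of 1.0.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
y = 0.0
for _ in range(3000):          # 30 s of simulated time
    u = pid.step(1.0, y)
    y += (u - y) * pid.dt
print(round(y, 3))
```

    A fuzzy supervisor would adjust `kp`, `ki`, and `kd` online as a function of the error and its rate of change; the PID arithmetic itself is unchanged.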

  4. New procedure speeds cold start, protects turbine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mallard, R.E.; Jordan, C.A.

    1995-09-01

    System dispatch from today's power plants must consider availability of purchase power (buy and sell), fuel prices, and unit availability and efficiency. To gain the best combination of these factors, steam units must be capable of quick removal from and return to service. However, unit startups are expensive, time consuming, and operationally demanding. For example, excessive thermal stresses can be catastrophic to a unit. With those factors in mind, Jacksonville Electric Authority (JEA) developed the "valve open start" procedure described here.

  5. Data-Base Software For Tracking Technological Developments

    NASA Technical Reports Server (NTRS)

    Aliberti, James A.; Wright, Simon; Monteith, Steve K.

    1996-01-01

    Technology Tracking System (TechTracS) computer program developed for use in storing and retrieving information on technology and related patent information developed under auspices of NASA Headquarters and NASA's field centers. Contents of data base include multiple scanned still images and QuickTime movies as well as text. TechTracS includes word-processing, report-editing, chart-and-graph-editing, and search-editing subprograms. Extensive keyword searching capabilities enable rapid location of technologies, innovators, and companies. System performs routine functions automatically and serves multiple users.

  6. Manufacturing Technology Support (MATES II) Task Order 0005: Manufacturing Integration and Technology Evaluation to Enable Technology Transition. Subtask Phase 0 Study Task: Manufacturing Technology (ManTech) and Systems Engineering For Quick Reaction Systems

    DTIC Science & Technology

    2014-10-01

    Porosity from gas entrapment & shrinkage 4 Continuous Fiber Ti Metal Matrix Composites (Aircraft panels and rotor components) [14...process models for casting, forging, and welding, and software capability to integrate various independent models with design, thermal, and structural...Applications, Ph.D. Thesis, Queen's College, University of Oxford, (2007). 14. S.A. Singerman and J.J. Jackson, Titanium Metal Matrix Composites for

  7. Development of a weight/sizing design synthesis computer program. Volume 1: Program formulation

    NASA Technical Reports Server (NTRS)

    Garrison, J. M.

    1973-01-01

    The development of a weight/sizing design synthesis methodology for use in support of the main line space shuttle program is discussed. The methodology has a minimum number of data inputs and quick turn around capabilities. The methodology makes it possible to: (1) make weight comparisons between current shuttle configurations and proposed changes, (2) determine the effects of various subsystems trades on total systems weight, and (3) determine the effects of weight on performance and performance on weight.

  8. An Approach to Providing a User Interface for Military Computer-Aided- Instruction in 1980

    DTIC Science & Technology

    1975-11-01

    commercial terminals is the use of a microprocessor unit (MPU) LSI chip controller. This technology is flexible and economical and can be expected to...various segments. By using an MPU and developing a software capability, the vendor can quickly and economically satisfy a large spectrum of user...the basis for an effective and economical user interface to military CAI systems.

  9. Care 3 model overview and user's guide, first revision

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Petersen, P. L.

    1985-01-01

    A manual was written to introduce the CARE III (Computer-Aided Reliability Estimation) capability to reliability and design engineers who are interested in predicting the reliability of highly reliable fault-tolerant systems. It was also structured to serve as a quick-look reference manual for more experienced users. The guide covers CARE III modeling and reliability predictions for execution on the CDC Cyber 170 series computers, the DEC VAX-11/700 series computers, and most machines that compile ANSI Standard FORTRAN 77.

  10. Rebalance to the Pacific: A Case for Greater Amphibious Capabilities in the US Army

    DTIC Science & Technology

    2015-05-21

    gas and 200 billion barrels of oil potentially exist.73 Currently, the United Nations Convention on the Law of the Sea (UNCLOS) establishes an...are highly destructive, thus enabling a quick political decision. The doctrine of “Local War under Conditions of Informatization” codifies the PLA’s...of Informatization,” the PLA is improving its ability to operate as a joint force. In 2013, the PLA conducted a series of joint exercises

  11. Operation Market Garden: Case Study for Analyzing Senior Leader Responsibilities

    DTIC Science & Technology

    2009-05-04

    late-July 1944 Brest Undetermined Seize ports TRANSFIGURE 17 August 1944 Paris - Orleans gap 101st (US), 1st (UK), Polish BDE Trap 7th Army (German...committed to more than one full lift per day. Had troop carrier forces been committed as was originally intended, i.e., to make a quick turn around to...mission assigned to us in the original plan.”28 While his airborne divisions fought as hard and held out as long as they were capable of doing, their

  12. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. 
    With a growing user base and a maturing scientific user community come new requests for algorithms and processing capabilities. To address this demand, OT is developing an extensible service-based architecture for integrating community-developed software. This "pluggable" approach to Web service deployment will enable new processing and analysis tools to run collocated with OT-hosted data.
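
    The hot/cold tiering decision described above amounts to a frequency count over an access log: frequently hit regions of the archive go to fast SSD storage, the rest to cheaper disk. A minimal sketch, with hypothetical tile IDs and threshold (not OT's actual metrics):

```python
from collections import Counter

# Hypothetical access log: each entry is the ID of the data tile a request hit.
access_log = ["tileA", "tileA", "tileB", "tileA", "tileC", "tileB", "tileA"]

counts = Counter(access_log)
threshold = 2  # tiles accessed more than this many times go to fast SSD storage

tiers = {tile: ("hot" if n > threshold else "cold") for tile, n in counts.items()}
print(tiers)
```

    In practice the log would be aggregated over spatiotemporal bins and re-evaluated periodically, so tiles can migrate between tiers as usage patterns shift.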

  13. Federal Disability Terms: A Review of State Use. Quick Turn Around (QTA).

    ERIC Educational Resources Information Center

    Muller, Eve; Linehan, Patrice

    This Quick Turn Around issue analysis summarizes information gathered by Project FORUM on the disability terms used by state education agencies (SEAs). All 50 states and 6 non-state jurisdictions returned completed surveys between February and April 2001. Of the 56 respondents, 18 SEAs report having aligned their terminology completely with the 12…

  14. Direct optical mapping of transcription factor binding sites on field-stretched λ-DNA in nanofluidic devices

    PubMed Central

    Sriram, K. K.; Yeh, Jia-Wei; Lin, Yii-Lih; Chang, Yi-Ren; Chou, Chia-Fu

    2014-01-01

    Mapping transcription factor (TF) binding sites along a DNA backbone is crucial in understanding the regulatory circuits that control cellular processes. Here, we deployed a method adopting bioconjugation, nanofluidic confinement and fluorescence single-molecule imaging for direct mapping of TF (RNA polymerase) binding sites on field-stretched single DNA molecules. Using this method, we have mapped out five of the TF binding sites of E. coli RNA polymerase on bacteriophage λ-DNA, where two promoter sites and three pseudo-promoter sites are identified with corresponding binding frequencies of 45% and 30%, respectively. Our method is quick, robust and capable of resolving protein-binding locations with high accuracy (∼300 bp), making our system a complementary platform to the methods currently practiced. It is advantageous for parallel analysis and less prone to false-positive results than other single-molecule mapping techniques such as optical tweezers, atomic force microscopy and molecular combing, and could potentially be extended to general mapping of protein–DNA interaction sites. PMID:24753422

  15. New Python-based methods for data processing

    PubMed Central

    Sauter, Nicholas K.; Hattne, Johan; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel

    2013-01-01

    Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units. PMID:23793153
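
    The parallel per-image dispatch pattern behind a high-throughput analyzer can be sketched as below. The toy `count_spots` threshold test is a hypothetical stand-in for real Bragg-spot finding (cctbx.spotfinder's actual algorithm is far more involved); only the map-over-images structure is the point.

```python
from concurrent.futures import ThreadPoolExecutor

def count_spots(image, threshold=100):
    """Toy stand-in for a spotfinder: count pixels above a threshold."""
    return sum(1 for row in image for px in row if px > threshold)

# Three tiny synthetic "diffraction images" (real frames are megapixel-scale).
images = [
    [[0, 250], [0, 0]],
    [[120, 130], [0, 300]],
    [[0, 0], [0, 0]],
]

# Each image is analyzed independently, so the work parallelizes trivially.
with ThreadPoolExecutor(max_workers=4) as pool:
    spot_counts = list(pool.map(count_spots, images))
print(spot_counts)  # one spot count per image
```

    For CPU-bound NumPy-based analysis one would use a process pool or a cluster scheduler instead of threads; the per-image independence is what makes either scale.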

  16. Input-output model for MACCS nuclear accident impacts estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N

    Since the original economic model for MACCS was developed, better-quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
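
    The core of an Input-Output impact estimate is the Leontief inverse, which propagates a final-demand shock through inter-industry linkages. A two-sector sketch, with illustrative coefficients that are not REAcct data:

```python
import numpy as np

# Technical-coefficients matrix A: A[i, j] is the input required from
# sector i per dollar of sector j output (illustrative numbers).
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])
leontief_inverse = np.linalg.inv(np.eye(2) - A)

# Suppose an accident removes $10M of final demand from sector 0.
delta_demand = np.array([10.0, 0.0])

# Total output loss includes indirect losses in the other sector.
output_loss = leontief_inverse @ delta_demand
print(np.round(output_loss, 2))
```

    Note that sector 1 loses output too, even though the direct shock hit only sector 0; capturing these indirect linkages is the reason for using Input-Output methodology rather than the direct loss alone.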

  17. Mission Analysis Program for Solar Electric Propulsion (MAPSEP). Volume 1: Analytical manual for earth orbital MAPSEP

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An introduction to the MAPSEP organization and a detailed analytical description of all models and algorithms are given. These include trajectory and error covariance propagation methods, orbit determination processes, thrust modeling, and trajectory correction (guidance) schemes. Earth orbital MAPSEP contains the capability of analyzing almost any currently projected low thrust mission from low earth orbit to super synchronous altitudes. Furthermore, MAPSEP is sufficiently flexible to incorporate extended dynamic models, alternate mission strategies, and almost any other system requirement imposed by the user. As in the interplanetary version, earth orbital MAPSEP represents a trade-off between precision modeling and computational speed consistent with defining necessary system requirements. It can be used in feasibility studies as well as in flight operational support. Pertinent operational constraints are available both implicitly and explicitly. However, the reader should be warned that because of program complexity, MAPSEP is only as good as the user and will quickly succumb to faulty user inputs.
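
    For a linear(ized) system, error-covariance propagation of the kind MAPSEP performs reduces to repeatedly applying P ← Φ P Φᵀ + Q. A one-step sketch with illustrative matrices (not MAPSEP's actual models):

```python
import numpy as np

# State transition for a unit time step: position += velocity * dt (dt = 1).
Phi = np.array([[1.0, 1.0],
                [0.0, 1.0]])
Q = np.diag([0.0, 0.01])   # process noise, e.g. thrust-modeling uncertainty
P = np.diag([1.0, 0.25])   # current position/velocity error covariance

# Propagate the covariance one step forward.
P_next = Phi @ P @ Phi.T + Q
print(np.round(P_next, 2))
```

    Iterating this map (with measurement updates interleaved) is what lets a tool like MAPSEP predict how orbit-determination uncertainty grows and shrinks along a trajectory.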

  18. An interactive environment for the analysis of large Earth observation and model data sets

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.; Walsh, John E.; Wilhelmson, Robert B.

    1994-01-01

    Envision is an interactive environment that provides researchers in the earth sciences convenient ways to manage, browse, and visualize large observed or model data sets. Its main features are support for the netCDF and HDF file formats, an easy to use X/Motif user interface, a client-server configuration, and portability to many UNIX workstations. The Envision package also provides new ways to view and change metadata in a set of data files. It permits a scientist to conveniently and efficiently manage large data sets consisting of many data files. It also provides links to popular visualization tools so that data can be quickly browsed. Envision is a public domain package, freely available to the scientific community. Envision software (binaries and source code) and documentation can be obtained from either of these servers: ftp://vista.atmos.uiuc.edu/pub/envision/ and ftp://csrp.tamu.edu/pub/envision/. Detailed descriptions of Envision capabilities and operations can be found in the User's Guide and Reference Manuals distributed with Envision software.

  19. Scalable multi-sample single-cell data analysis by Partition-Assisted Clustering and Multiple Alignments of Networks

    PubMed Central

    Samusik, Nikolay; Wang, Xiaowei; Guan, Leying; Nolan, Garry P.

    2017-01-01

    Mass cytometry (CyTOF) has greatly expanded the capability of cytometry. It is now easy to generate multiple CyTOF samples in a single study, with each sample containing single-cell measurements of 50 markers for hundreds of thousands of cells. Current methods do not adequately address the issues concerning combining multiple samples for subpopulation discovery, and these issues can be quickly and dramatically amplified with an increasing number of samples. To overcome this limitation, we developed Partition-Assisted Clustering and Multiple Alignments of Networks (PAC-MAN) for the fast automatic identification of cell populations in CyTOF data, closely matching that of expert manual discovery, and for alignments between subpopulations across samples to define dataset-level cellular states. PAC-MAN is computationally efficient, allowing the management of very large CyTOF datasets, which are increasingly common in clinical studies and cancer studies that monitor various tissue samples for each subject. PMID:29281633
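
    The cross-sample alignment idea can be illustrated by matching subpopulation centroids between two samples by proximity in marker space. PAC-MAN's actual network-alignment procedure is more elaborate; the centroids below are made up for illustration.

```python
import numpy as np

# Hypothetical subpopulation centroids from two samples in a 2-marker space.
centroids_a = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])
centroids_b = np.array([[9.8, 0.3], [0.2, -0.1], [5.1, 4.9]])

# Pairwise distances between every A-cluster and every B-cluster.
dists = np.linalg.norm(centroids_a[:, None, :] - centroids_b[None, :, :], axis=2)

# For each subpopulation in sample A, the closest subpopulation in sample B.
alignment = dists.argmin(axis=1)
print(alignment.tolist())
```

    Aligned subpopulations across many samples can then be treated as the same dataset-level cellular state, which is the point of the MAN step.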

  20. Mobile cosmetics advisor: an imaging based mobile service

    NASA Astrophysics Data System (ADS)

    Bhatti, Nina; Baker, Harlyn; Chao, Hui; Clearwater, Scott; Harville, Mike; Jain, Jhilmil; Lyons, Nic; Marguier, Joanna; Schettino, John; Süsstrunk, Sabine

    2010-01-01

    Selecting cosmetics requires visual information and often benefits from the assessments of a cosmetics expert. In this paper we present a unique mobile imaging application that enables women to use their cell phones to get immediate expert advice when selecting personal cosmetic products. We derive the visual information from analysis of camera phone images, and provide the judgment of the cosmetics specialist through use of an expert system. The result is a new paradigm for mobile interactions: image-based information services exploiting the ubiquity of camera phones. The application is designed to work with any handset over any cellular carrier using commonly available MMS and SMS features. Targeted at the unsophisticated consumer, it must be quick and easy to use, not requiring download capabilities or preplanning. Thus, all application processing occurs in the back-end system and not on the handset itself. We present the imaging pipeline technology and a comparison of the service's accuracy with respect to human experts.

  1. The LBT real-time based control software to mitigate and compensate vibrations

    NASA Astrophysics Data System (ADS)

    Borelli, J.; Trowitzsch, J.; Brix, M.; Kürster, M.; Gässler, W.; Bertram, T.; Briegel, F.

    2010-07-01

    The Large Binocular Telescope (LBT) uses two 8.4-meter active primary mirrors and two adaptive secondary mirrors on the same mounting to take advantage of its interferometric capabilities. Both applications, interferometry and AO, are sensitive to vibrations. Several measurement campaigns have been carried out at the LBT, and their results strongly indicate that a vibration monitoring system is required to improve the performance of LINC-NIRVANA, LBTI, and ARGOS, the laser-guided ground-layer adaptive optics system. Control software for mitigation and compensation of the vibrations is currently being designed. A complex set of algorithms collects real-time vibration data, archives it for further analysis, and, in parallel, generates the tip-tilt and optical path difference (OPD) data for the control loops of the instruments. A real-time data acquisition device equipped with embedded real-time Linux is used in our systems. A set of quick-look tools is currently under development in order to verify whether the conditions at the telescope are suitable for interferometric/adaptive observations.

  2. A random distribution reacting mixing layer model

    NASA Technical Reports Server (NTRS)

    Jones, Richard A.; Marek, C. John; Myrabo, Leik N.; Nagamatsu, Henry T.

    1994-01-01

    A methodology for simulation of molecular mixing, and the resulting velocity and temperature fields, has been developed. The ideas are applied to the flow conditions present in the NASA Lewis Research Center Planar Reacting Shear Layer (PRSL) facility, and results are compared to experimental data. A Gaussian transverse turbulent velocity distribution is used in conjunction with a linearly increasing time scale to describe the mixing of different regions of the flow. Equilibrium reaction calculations are then performed on the mix to arrive at a new species composition and temperature. Velocities are determined through summation of momentum contributions. The analysis indicates a combustion efficiency of the order of 80 percent for the reacting mixing layer, and a turbulent Schmidt number of 2/3. The success of the model is attributed to the simulation of large-scale transport of fluid. The favorable comparison shows that a relatively quick and simple PC calculation is capable of simulating the basic flow structure in the reacting and nonreacting shear layer present in the facility, given basic assumptions about turbulence properties.

  3. HIV-1 vaccines

    PubMed Central

    Excler, Jean-Louis; Robb, Merlin L; Kim, Jerome H

    2014-01-01

    The development of a safe and effective preventive HIV-1 vaccine remains a public health priority. Despite scientific difficulties and disappointing results, HIV-1 vaccine clinical development has, for the first time, established proof-of-concept efficacy against HIV-1 acquisition and identified vaccine-associated immune correlates of risk. The correlate of risk analysis showed that IgG antibodies against the gp120 V2 loop correlated with decreased risk of HIV infection, while Env-specific IgA directly correlated with increased risk. The development of vaccine strategies such as improved envelope proteins formulated with potent adjuvants and DNA and vectors expressing mosaics, or conserved sequences, capable of eliciting greater breadth and depth of potentially relevant immune responses including neutralizing and non-neutralizing antibodies, CD4+ and CD8+ cell-mediated immune responses, mucosal immune responses, and immunological memory, is now proceeding quickly. Additional human efficacy trials combined with other prevention modalities along with sustained funding and international collaboration remain key to bring an HIV-1 vaccine to licensure. PMID:24637946

  4. Conceptual design study of a Harrier V/STOL research aircraft

    NASA Technical Reports Server (NTRS)

    Bode, W. E.; Berger, R. L.; Elmore, G. A.; Lacey, T. R.

    1978-01-01

    MCAIR recently completed a conceptual design study to define modification approaches to, and derive planning prices for the conversion of a two place Harrier to a V/STOL control, display and guidance research aircraft. Control concepts such as rate damping, attitude stabilization, velocity command, and cockpit controllers are to be demonstrated. Display formats will also be investigated, and landing, navigation and guidance systems flight tested. The rear cockpit is modified such that it can be quickly adapted to faithfully simulate the controls, displays and handling qualities of a Type A or Type B V/STOL. The safety pilot always has take command capability. The modifications studied fall into two categories: basic modifications and optional modifications. Technical descriptions of the basic modifications and of the optional modifications are presented. The modification plan and schedule as well as the test plan and schedule are presented. The failure mode and effects analysis, aircraft performance, aircraft weight, and aircraft support are discussed.

  5. QuickNGS elevates Next-Generation Sequencing data analysis to a new level of automation.

    PubMed

    Wagle, Prerana; Nikolić, Miloš; Frommolt, Peter

    2015-07-01

    Next-Generation Sequencing (NGS) has emerged as a widely used tool in molecular biology. While time and cost for the sequencing itself are decreasing, the analysis of the massive amounts of data remains challenging. Since multiple algorithmic approaches for the basic data analysis have been developed, there is now an increasing need to efficiently use these tools to obtain results in reasonable time. We have developed QuickNGS, a new workflow system for laboratories with the need to analyze data from multiple NGS projects at a time. QuickNGS takes advantage of parallel computing resources, a comprehensive back-end database, and a careful selection of previously published algorithmic approaches to build fully automated data analysis workflows. We demonstrate the efficiency of our new software by a comprehensive analysis of 10 RNA-Seq samples which we can finish in only a few minutes of hands-on time. The approach we have taken is suitable to process even much larger numbers of samples and multiple projects at a time. Our approach considerably reduces the barriers that still limit the usability of the powerful NGS technology and finally decreases the time to be spent before proceeding to further downstream analysis and interpretation of the data.

  6. Condensing Massive Satellite Datasets For Rapid Interactive Analysis

    NASA Astrophysics Data System (ADS)

    Grant, G.; Gallaher, D. W.; Lv, Q.; Campbell, G. G.; Fowler, C.; LIU, Q.; Chen, C.; Klucik, R.; McAllister, R. A.

    2015-12-01

    Our goal is to enable users to interactively analyze massive satellite datasets, identifying anomalous data or values that fall outside of thresholds. To achieve this, the project seeks to create a derived database containing only the most relevant information, accelerating the analysis process. The database is designed to be an ancillary tool for the researcher, not an archival database to replace the original data. This approach is aimed at improving performance by reducing the overall size by way of condensing the data. The primary challenges of the project include:

    - The nature of the research question(s) may not be known ahead of time.
    - The thresholds for determining anomalies may be uncertain.
    - Problems associated with processing cloudy, missing, or noisy satellite imagery.
    - The contents and method of creation of the condensed dataset must be easily explainable to users.

    The architecture of the database will reorganize spatially-oriented satellite imagery into temporally-oriented columns of data (a.k.a. "data rods") to facilitate time-series analysis. The database itself is an open-source parallel database, designed to make full use of clustered server technologies. A demonstration of the system capabilities will be shown. Applications for this technology include quick-look views of the data, as well as the potential for on-board satellite processing of essential information, with the goal of reducing data latency.
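
    The "data rod" reorganization amounts to transposing a (time, y, x) image stack into one time series per pixel, so time-series queries touch contiguous data. A toy sketch (array sizes are illustrative):

```python
import numpy as np

# A tiny (time, y, x) stack of satellite "images": 2 time steps, 2x2 pixels.
cube = np.arange(2 * 2 * 2).reshape(2, 2, 2)

# Reorganize into temporally-oriented "data rods": one time series per pixel.
rods = {(y, x): cube[:, y, x] for y in range(2) for x in range(2)}
print(rods[(0, 1)].tolist())  # the time series for pixel (0, 1)
```

    In a real system the rods would be columns in a parallel database rather than an in-memory dict, but the spatial-to-temporal transpose is the same.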

  7. Local Observability Analysis of Star Sensor Installation Errors in a SINS/CNS Integration System for Near-Earth Flight Vehicles

    PubMed Central

    Yang, Yanqiang; Zhang, Chunxi; Lu, Jiazhen

    2017-01-01

    Strapdown inertial navigation system/celestial navigation system (SINS/CNS) integrated navigation is a fully autonomous and high-precision method, which has been widely used to improve the hitting accuracy and quick-reaction capability of near-Earth flight vehicles. The installation errors between SINS and star sensors have been one of the main factors that restrict the actual accuracy of SINS/CNS. In this paper, an integration algorithm based on the star vector observations is derived considering the star sensor installation error. Then, the star sensor installation error is accurately estimated based on Kalman Filtering (KF). Meanwhile, a local observability analysis is performed on the rank of the observability matrix obtained via the linearized observation equation, and the observable conditions are presented and validated. The number of star vectors should be greater than or equal to 2, and the number of attitude adjustments should also be greater than or equal to 2. Simulations indicate that the star sensor installation error is readily observable under the maneuvering condition; moreover, the attitude errors of SINS are less than 7 arc-seconds. This analysis method and conclusion are useful in the ballistic trajectory design of near-Earth flight vehicles. PMID:28275211
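
    A rank-based observability check of the kind described above stacks H, HΦ, HΦ², … and compares the rank against the state dimension. A sketch on a toy two-state (position/velocity) system, not the SINS/CNS model of the paper:

```python
import numpy as np

def observability_rank(Phi, H, n_steps=None):
    """Rank of the observability matrix [H; H Phi; H Phi^2; ...]."""
    n = Phi.shape[0]
    steps = n_steps or n
    blocks, M = [], H.copy()
    for _ in range(steps):
        blocks.append(M)
        M = M @ Phi
    return np.linalg.matrix_rank(np.vstack(blocks))

# Illustrative discrete-time double integrator.
Phi = np.array([[1.0, 1.0],
                [0.0, 1.0]])
H_pos = np.array([[1.0, 0.0]])  # observe position only -> fully observable
H_vel = np.array([[0.0, 1.0]])  # observe velocity only -> position unobservable
print(observability_rank(Phi, H_pos), observability_rank(Phi, H_vel))
```

    Full rank (here 2) means every state component, including a constant bias such as an installation error, can be recovered from the measurement history; a rank deficit identifies the unobservable directions.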

  8. The Omics Dashboard for interactive exploration of gene-expression data.

    PubMed

    Paley, Suzanne; Parker, Karen; Spaulding, Aaron; Tomb, Jean-Francois; O'Maille, Paul; Karp, Peter D

    2017-12-01

    The Omics Dashboard is a software tool for interactive exploration and analysis of gene-expression datasets. The Omics Dashboard is organized as a hierarchy of cellular systems. At the highest level of the hierarchy the Dashboard contains graphical panels depicting systems such as biosynthesis, energy metabolism, regulation and central dogma. Each of those panels contains a series of X-Y plots depicting expression levels of subsystems of that panel, e.g. subsystems within the central dogma panel include transcription, translation and protein maturation and folding. The Dashboard presents a visual read-out of the expression status of cellular systems to facilitate a rapid top-down user survey of how all cellular systems are responding to a given stimulus, and to enable the user to quickly view the responses of genes within specific systems of interest. Although the Dashboard is complementary to traditional statistical methods for analysis of gene-expression data, we show how it can detect changes in gene expression that statistical techniques may overlook. We present the capabilities of the Dashboard using two case studies: the analysis of lipid production for the marine alga Thalassiosira pseudonana, and an investigation of a shift from anaerobic to aerobic growth for the bacterium Escherichia coli. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. The Omics Dashboard for interactive exploration of gene-expression data

    PubMed Central

    Paley, Suzanne; Parker, Karen; Spaulding, Aaron; Tomb, Jean-Francois; O’Maille, Paul

    2017-01-01

    The Omics Dashboard is a software tool for interactive exploration and analysis of gene-expression datasets. The Omics Dashboard is organized as a hierarchy of cellular systems. At the highest level of the hierarchy the Dashboard contains graphical panels depicting systems such as biosynthesis, energy metabolism, regulation and central dogma. Each of those panels contains a series of X–Y plots depicting expression levels of subsystems of that panel, e.g. subsystems within the central dogma panel include transcription, translation and protein maturation and folding. The Dashboard presents a visual read-out of the expression status of cellular systems to facilitate a rapid top-down user survey of how all cellular systems are responding to a given stimulus, and to enable the user to quickly view the responses of genes within specific systems of interest. Although the Dashboard is complementary to traditional statistical methods for analysis of gene-expression data, we show how it can detect changes in gene expression that statistical techniques may overlook. We present the capabilities of the Dashboard using two case studies: the analysis of lipid production for the marine alga Thalassiosira pseudonana, and an investigation of a shift from anaerobic to aerobic growth for the bacterium Escherichia coli. PMID:29040755

  10. Innovative Near Real-Time Data Dissemination Tools Developed by the Space Weather Research Center

    NASA Astrophysics Data System (ADS)

    Maddox, Marlo M.; Mullinix, Richard; Mays, M. Leila; Kuznetsova, Maria; Zheng, Yihua; Pulkkinen, Antti; Rastaetter, Lutz

    2013-03-01

    Access to near real-time and real-time space weather data is essential to accurately specifying and forecasting the space environment. The Space Weather Research Center at NASA Goddard Space Flight Center's Space Weather Laboratory provides vital space weather forecasting services primarily to NASA robotic mission operators, as well as external space weather stakeholders including the Air Force Weather Agency. A key component in this activity is the iNtegrated Space Weather Analysis (iSWA) system, a joint development project at NASA GSFC between the Space Weather Laboratory, Community Coordinated Modeling Center, Applied Engineering & Technology Directorate, and the NASA HQ Office of the Chief Engineer. The iSWA system was developed to address technical challenges in acquiring and disseminating space weather environment information. A key design driver for the iSWA system was to generate and present vast amounts of space weather resources in an intuitive, user-configurable, and adaptable format, thus enabling users to respond to current and future space weather impacts as well as enabling post-impact analysis. Having access to near real-time and real-time data is essential not only to ensuring that relevant observational data is available for analysis, but also to ensuring that models can be driven with the requisite input parameters at proper and efficient temporal and spatial resolutions. The iSWA system currently manages over 300 unique near real-time and real-time data feeds from various sources, consisting of both observational and simulation data. A comprehensive suite of actionable space weather analysis tools and products is generated and provided utilizing a mixture of the ingested data, enabling new capabilities in quickly assessing past, present, and expected space weather effects. This paper will highlight current and future iSWA system capabilities, including the utilization of data from the Solar Dynamics Observatory mission. http://iswa.gsfc.nasa.gov/

  11. D0 Silicon Upgrade: Lower Cleanroom Roof Quick Load Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rucinski, Russ; /Fermilab

    1995-11-17

    This engineering note documents calculations done to determine the margin of safety for the lower clean room roof. The analysis was done to give me a feeling for the loads, stresses, and capacity of the roof prior to the installation work to be done for the helium refrigerator upgrade. The result of this quick look showed that the calculated loads produce stress values and loads at about half the allowables. Based on this result, I do not think that special precautions beyond personal judgment are required for the installation work.

  12. Battlefield innovation: a case-study of remote sensor development

    NASA Astrophysics Data System (ADS)

    Orson, Jay A.; Hague, Tyler N.

    2007-10-01

    Evolving threats encountered by coalition forces in Operation Iraqi Freedom drive the need for innovations in airborne intelligence, surveillance, and reconnaissance capabilities. In many cases, disruptive capabilities are created by linking existing technologies and radical new technologies in a novel way. Some of the radical technologies used in achieving these disruptive capabilities are existing prototypes or one-of-a-kind systems that are thrust into the field to react quickly to emerging threats. Horned Owl is one such rapidly developed, innovative technical solution designed to meet immediate battlefield needs. This paper focuses on two key areas of this initiative. The first is the innovation champion's establishment of a collaborative environment that fosters creativity and allows the project to mature the disruptive capability. The second is the practical implications, or challenges, of deploying experimental systems in a battlefield environment. Discussions of these two areas provide valuable lessons to guide future innovation champions when presented with the dual task of balancing system maturation with meeting operational demand. Contents of this paper are not necessarily the official views of, or endorsed by, the U.S. Government, the Department of Defense, or the Department of the Air Force.

  13. Detection of the enzymatically-active polyhydroxyalkanoate synthase subunit gene, phaC, in cyanobacteria via colony PCR.

    PubMed

    Lane, Courtney E; Benton, Michael G

    2015-12-01

    A colony PCR-based assay was developed to rapidly determine whether a cyanobacterium of interest contains the requisite genetic material, the PHA synthase PhaC subunit, to produce polyhydroxyalkanoates (PHAs). The test is both high-throughput and robust, owing to an extensive sequence analysis of cyanobacterial PHA synthases. The assay uses a single detection primer set and a single reaction condition across multiple cyanobacteria strains to produce an easily detectable positive result: amplification via PCR, evidenced by a band in electrophoresis. To demonstrate the potential of the presence of phaC as an indicator of a cyanobacterium's PHA accumulation capabilities, the ability to produce PHA was assessed for five cyanobacteria with traditional in vivo PHA granule staining using an oxazine dye. The confirmed in vivo staining results were then compared with the PCR-based assay results and found to be in agreement. The colony PCR assay successfully detected the phaC gene in all six of the diverse cyanobacteria tested that possessed the gene, while exhibiting no undesired product formation across the nine total cyanobacteria strains tested. The colony PCR quick prep provides sufficient usable DNA template that this assay could be readily expanded to assess multiple genes of interest simultaneously. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Activating β-catenin signaling in CD133-positive dermal papilla cells increases hair inductivity

    PubMed Central

    Zhou, Linli; Yang, Kun; Xu, Mingang; Andl, Thomas; Millar, Sarah; Boyce, Steven; Zhang, Yuhang

    2016-01-01

    Bioengineering hair follicles using cells isolated from human tissue remains a difficult task. Dermal papilla (DP) cells are known to guide the growth and cycling activities of hair follicles by interacting with keratinocytes. However, DP cells quickly lose their inductivity during in vitro passaging. Rodent DP cell cultures require the external addition of chemical factors, including WNT and BMP molecules, to maintain the hair-inductive property. CD133 is expressed by a small subpopulation of DP cells that are capable of inducing hair follicle formation in vivo. We report here that expression of a stabilized form of β-catenin promoted clonal growth of CD133-positive (CD133+) DP cells in in vitro three-dimensional hydrogel culture while maintaining expression of DP markers, including alkaline phosphatase (AP), CD133, and Integrin α8. After a two-week in vitro culture, cultured CD133+ DP cells with up-regulated β-catenin activity led to accelerated in vivo hair growth in reconstituted skin compared with control cells. Further analysis showed that matrix cell proliferation and differentiation were significantly promoted in hair follicles when β-catenin signaling was upregulated in CD133+ DP cells. Our data highlight an important role for β-catenin signaling in promoting the inductive capability of CD133+ DP cells for in vitro expansion and in vivo hair follicle regeneration, which could potentially be applied to cultured human DP cells. PMID:27312243

  15. TFTR neutral beam control and monitoring for DT operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Connor, T.; Kamperschroer, J.; Chu, J.

    1995-12-31

    Record fusion power output has recently been obtained in TFTR with the injection of deuterium and tritium neutral beams. This significant achievement was due in part to the controls, software, and data processing capabilities added to the neutral beam system for DT operations. Chief among these improvements was the addition of SUN workstations and large dynamic data storage to the existing Central Instrumentation Control and Data Acquisition (CICADA) system. Essentially instantaneous look-back over the recent shot history has been provided for most beam waveforms and analysis results. Gas regulation controls allowing remote switchover between deuterium and tritium were also added. With these tools, comparison of the waveforms and data of deuterium and tritium for four test conditioning pulses quickly produced reliable tritium setpoints. Thereafter, all beam conditioning was performed with deuterium, thus saving the tritium supply for the important DT injection shots. The look-back capability also led to modifications of the gas system to improve reliability and to control ceramic valve leakage by backbiasing. Other features added to improve the reliability and availability of DT neutral beam operations included master beamline controls and displays, a beamline thermocouple interlock system, a peak thermocouple display, automatic gas inventory and cryopanel gas loading monitoring, beam notching controls, a display of beam/plasma interlocks, and a feedback system to control beam power based on plasma conditions.

  16. Communication as a Strategic Activity (Invited)

    NASA Astrophysics Data System (ADS)

    Fischhoff, B.

    2010-12-01

    Effective communication requires preparation. The first step is explicit analysis of the decisions faced by audience members, in order to identify the facts essential to their choices. The second step is assessing their current beliefs, in order to identify the gaps in their understanding, as well as their natural ways of thinking. The third step is drafting communications potentially capable of closing those gaps, taking advantage of the relevant behavioral science. The fourth step is empirically evaluating those communications, refining them as necessary. The final step is communicating through trusted channels, capable of getting the message out and receiving needed feedback. Executing these steps requires a team involving subject matter experts (for ensuring that the science is right), decision analysts (for identifying the decision-critical facts), behavioral scientists (for designing and evaluating messages), and communication specialists (for creating credible channels). Larger organizations should be able to assemble those teams and anticipate their communication needs. However, even small organizations, individuals, or large organizations that have been caught flat-footed can benefit from quickly assembling informal teams, before communicating in ways that might undermine their credibility. The work is not expensive, but does require viewing communication as a strategic activity, rather than an afterthought. The talk will illustrate the science base, with a few core research results; note the risks of miscommunication, with a few bad examples; and suggest the opportunities for communication leadership, focusing on the US Food and Drug Administration.

  17. Subsystems component definitions summary program

    NASA Technical Reports Server (NTRS)

    Scott, A. Don; Thomas, Carolyn C.; Simonsen, Lisa C.; Hall, John B., Jr.

    1991-01-01

    A computer program, the Subsystems Component Definitions Summary (SUBCOMDEF), was developed to provide a quick and efficient means of summarizing large quantities of subsystems component data in terms of weight, volume, resupply, and power. The program was validated using Space Station Freedom Program Definition Requirements Document data for the internal and external thermal control subsystem. Once all component descriptions, unit weights and volumes, resupply, and power data are input, the user may obtain a summary report of user-specified portions of the subsystem or of the entire subsystem as a whole. Any combination or all of the parameters of wet and dry weight, wet and dry volume, resupply weight and volume, and power may be displayed. The user may vary the resupply period according to individual mission requirements, as well as the number of hours per day that power-consuming components operate. Uses of this program are not limited to subsystem component summaries. Any application that requires quick, efficient, and accurate weight, volume, resupply, or power summaries would be well suited to take advantage of SUBCOMDEF's capabilities.
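
    A summary of this kind, totaling weight, volume-independent quantities such as resupply mass and power against a user-specified resupply period, can be sketched in a few lines. The record fields and component values below are hypothetical illustrations, not SUBCOMDEF's actual data format:

    ```python
    # Hypothetical component records: (name, dry_weight_kg, wet_weight_kg,
    # resupply_kg_per_90_days, power_w, hours_per_day).
    COMPONENTS = [
        ("pump package", 45.0, 52.0, 2.0, 120.0, 24.0),
        ("heat exchanger", 80.0, 95.0, 0.5, 0.0, 0.0),
        ("controller", 12.0, 12.0, 0.0, 35.0, 24.0),
    ]

    def summarize(components, resupply_days=90):
        """Total the subsystem, scaling resupply mass to the chosen period."""
        scale = resupply_days / 90.0
        return {
            "dry_weight_kg": sum(c[1] for c in components),
            "wet_weight_kg": sum(c[2] for c in components),
            "resupply_kg": scale * sum(c[3] for c in components),
            "energy_wh_per_day": sum(c[4] * c[5] for c in components),
        }

    # Varying the resupply period, as the program allows, just rescales
    # the resupply total while the weight and power figures are unchanged.
    print(summarize(COMPONENTS, resupply_days=180))
    ```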

  18. Photoactive Self-Shaping Hydrogels as Noncontact 3D Macro/Microscopic Photoprinting Platforms.

    PubMed

    Liao, Yue; An, Ning; Wang, Ning; Zhang, Yinyu; Song, Junfei; Zhou, Jinxiong; Liu, Wenguang

    2015-12-01

    A photocleavable terpolymer hydrogel cross-linked with an o-nitrobenzyl-derivative cross-linker is shown to be capable of self-shaping without losing its physical integrity and robustness, owing to spontaneous asymmetric swelling of the network caused by UV-light-induced gradient cleavage of chemical cross-linkages. A continuum model and the finite element method are used to elucidate the underlying curling mechanism. Remarkably, based on this self-changing principle, the photosensitive hydrogels can be developed as soft, wet photoprinting platforms onto which specific 3D characters and images are faithfully duplicated at macro/microscale, without contact, by UV-light irradiation under customized photomasks. Importantly, a quick response (QR) code is accurately printed on the photoactive hydrogel for the first time; scanning the QR code with a smartphone quickly connects to a web page. This photoactive hydrogel shows promise as a new printing or recording material. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Quick actuating closure

    NASA Technical Reports Server (NTRS)

    White, III, Dorsey E. (Inventor); Updike, deceased, Benjamin T. (Inventor); Allred, Johnny W. (Inventor)

    1989-01-01

    A quick actuating closure for a pressure vessel 80 in which a wedge ring 30 with a conical outer surface 31 is moved forward to force shear blocks 40, with conical inner surfaces 41, radially outward to lock an end closure plug 70 within an opening 81 in the pressure vessel 80. A seal ring 60 and a preload ramp 50 sit between the shear blocks 40 and the end closure plug 70 to provide a backup sealing capability. Conical surfaces 44 and 55 of the preload ramp 50 and the shear blocks 40 interact to force the seal ring 60 into shoulders 73 and 85 in the end closure plug 70 and opening 81 to form a tight seal. The end closure plug 70 is unlocked by moving the wedge ring 30 rearward, which causes T-bars 32 of the wedge ring 30, riding within T-slots 42 of the shear blocks 40, to force them radially inward. The end closure plug 70 is then removed, allowing access to the interior of the pressure vessel 80.

  20. Optical contrast for identifying the thickness of two-dimensional materials

    NASA Astrophysics Data System (ADS)

    Bing, Dan; Wang, Yingying; Bai, Jing; Du, Ruxia; Wu, Guoqing; Liu, Liyan

    2018-01-01

    One of the most intriguing properties of two-dimensional (2D) materials is their thickness-dependent behavior. A quick and precise technique to identify the layer number of 2D materials is therefore highly desirable. In this review, we introduce the basic principle of using optical contrast to determine the thickness of a 2D material and its advantages compared with other modern techniques. Different 2D materials, including graphene, graphene oxide, transition metal dichalcogenides, black phosphorus, and boron nitride, are used as examples to demonstrate the capability of optical contrast methods. A simple and more efficient optical contrast imaging technique is also emphasized, which is suitable for quick and large-scale thickness identification. We also discuss the factors that could affect the experimental results of optical contrast, including the incident light angle, the anisotropic nature of materials, and the twist angle between 2D layers. Finally, we give perspectives on the future development of optical contrast methods for the study and application of 2D materials.
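
    The identification principle the review describes can be sketched numerically: contrast is commonly defined as C = (R_substrate - R_flake) / R_substrate, and the layer number is assigned by matching the measured contrast against reference values. A minimal illustration follows; the per-layer reference contrasts are hypothetical placeholders, not values from this review:

    ```python
    def optical_contrast(i_substrate, i_flake):
        """Contrast of a flake region relative to the bare substrate,
        from reflected intensities measured in an optical image."""
        return (i_substrate - i_flake) / i_substrate

    def identify_layers(contrast, references):
        """Assign the layer number whose reference contrast is closest."""
        return min(references, key=lambda n: abs(references[n] - contrast))

    # Hypothetical reference table (layer number -> expected contrast);
    # in practice these are calibrated for one material/substrate system.
    REFERENCE = {1: 0.09, 2: 0.17, 3: 0.25, 4: 0.32}

    c = optical_contrast(i_substrate=200.0, i_flake=166.0)  # C = 0.17
    print(identify_layers(c, REFERENCE))  # -> 2
    ```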

  1. photPARTY: Python Automated Square-Aperture Photometry

    NASA Astrophysics Data System (ADS)

    Symons, Teresa A.

    As CCDs have drastically increased the amount of information recorded per frame, so too have they increased the time and effort needed to sift through the data. For observations of a single star, information from millions of pixels needs to be distilled into one number: the magnitude. Various computer systems have been used to streamline this process over the years. The CCDPhot photometer, in use at the Kitt Peak 0.9-m telescope in the 1990s, allowed for user settings and provided real-time magnitudes during observation of single stars. It is this level of speed and convenience that inspired the development of the Python-based software analysis system photPARTY, which can quickly and efficiently produce magnitudes for a set of single-star or uncrowded-field CCD frames. Seeking to remove the need for manual interaction after initial settings for a group of images, photPARTY automatically locates stars, subtracts the background, and performs square-aperture photometry. Rather than being a package of available functions, it is essentially a self-contained, one-click analysis system, with the capability to process several hundred frames in just a couple of minutes. Results of comparisons with present systems such as IRAF are presented.
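
    The pipeline steps named above — locate the star, subtract the background, sum the flux in a square aperture — can be sketched as below. This is an illustrative reimplementation, not photPARTY's actual code; the aperture half-width, the median background estimate, and the zero point are assumptions:

    ```python
    import numpy as np

    def square_aperture_magnitude(frame, half_width=5, zero_point=25.0):
        """Instrumental magnitude of the brightest source in a CCD frame.

        Background is estimated as the frame median (robust for an
        uncrowded field), then flux is summed in a (2*half_width+1)^2
        square aperture centred on the brightest pixel.
        """
        background = np.median(frame)
        y, x = np.unravel_index(np.argmax(frame), frame.shape)
        y0, y1 = max(0, y - half_width), y + half_width + 1
        x0, x1 = max(0, x - half_width), x + half_width + 1
        flux = np.sum(frame[y0:y1, x0:x1] - background)
        return zero_point - 2.5 * np.log10(flux)

    # Synthetic frame: flat 100-count background plus a 1000-count "star".
    frame = np.full((64, 64), 100.0)
    frame[32, 32] += 1000.0
    print(round(square_aperture_magnitude(frame), 2))  # -> 17.5
    ```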

  2. Practical applications of remote sensing technology

    NASA Technical Reports Server (NTRS)

    Whitmore, Roy A., Jr.

    1990-01-01

    Land managers increasingly are becoming dependent upon remote sensing and automated analysis techniques for information gathering and synthesis. Remote sensing and geographic information system (GIS) techniques provide quick and economical information gathering for large areas. The outputs of remote sensing classification and analysis are most effective when combined with a total natural resources database within the capabilities of a computerized GIS. Some examples are presented of the successes, as well as the problems, in integrating remote sensing and geographic information systems. The need to exploit remotely sensed data and the potential that geographic information systems offer for managing and analyzing such data continue to grow. New microcomputers with vastly enlarged memory, multi-fold increases in operating speed, and storage capacity previously available only on mainframe computers are now a reality. Improved raster GIS software systems have been developed for these high-performance microcomputers, and vector GIS systems previously reserved for minicomputer and mainframe systems can now operate on them as well. One of the more exciting areas beginning to emerge is the integration of both raster and vector formats on a single computer screen. This technology allows satellite imagery or digital aerial photography to be presented as a background to a vector display.

  3. Using Reconstructed POD Modes as Turbulent Inflow for LES Wind Turbine Simulations

    NASA Astrophysics Data System (ADS)

    Nielson, Jordan; Bhaganagar, Kiran; Juttijudata, Vejapong; Sirisup, Sirod

    2016-11-01

    Currently, in order to get realistic atmospheric effects of turbulence, wind turbine LES simulations require computationally expensive precursor simulations. At times, the precursor simulation is more computationally expensive than the wind turbine simulation itself. The precursor simulations are important because they capture turbulence in the atmosphere, and turbulence impacts the power production estimate. On the other hand, POD analysis has been shown to be capable of capturing turbulent structures. The current study was performed to determine the plausibility of using lower-dimension models from POD analysis of LES simulations as turbulent inflow to wind turbine LES simulations. The study will aid the wind energy community by lowering the computational cost of full-scale wind turbine LES simulations while maintaining a high level of turbulent information, and by making it possible to quickly apply the turbulent inflow to multi-turbine wind farms. This is done by comparing a wind turbine simulation using a pure LES precursor with simulations that use reduced POD-mode inflow conditions. The study shows the feasibility of using lower-dimension models as turbulent inflow for LES wind turbine simulations. Overall, the power production estimate and the velocity field of the wind turbine wake are well captured, with small errors.
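
    Snapshot POD of this kind is typically computed with a singular value decomposition of the mean-subtracted snapshot matrix: the left singular vectors are the POD modes, the singular values give each mode's energy, and a reduced inflow keeps only the first few modes. A minimal sketch on synthetic data (not the authors' LES pipeline):

    ```python
    import numpy as np

    # Synthetic snapshot matrix: each column is a flattened velocity field
    # at one time instant (n_points spatial samples, n_snap snapshots).
    n_points, n_snap = 200, 40
    t = np.linspace(0.0, 2.0 * np.pi, n_snap)
    x = np.linspace(0.0, 1.0, n_points)
    snapshots = (np.outer(np.sin(2 * np.pi * x), np.cos(t))
                 + 0.3 * np.outer(np.cos(4 * np.pi * x), np.sin(2 * t)))

    mean_flow = snapshots.mean(axis=1, keepdims=True)
    fluctuations = snapshots - mean_flow

    # POD modes are the left singular vectors; squared singular values
    # give the turbulent energy captured by each mode.
    U, s, Vt = np.linalg.svd(fluctuations, full_matrices=False)
    energy = s**2 / np.sum(s**2)

    # Reduced-order inflow: reconstruct with the first r modes only.
    r = 2
    reduced = mean_flow + U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
    print(f"energy in first {r} modes: {energy[:r].sum():.3f}")
    ```

    For this two-mode synthetic field, the first two POD modes capture essentially all of the fluctuation energy; a real LES inflow would need more modes, traded against reconstruction error.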

  4. An in vitro and in vivo investigation of the biological behavior of a ferrimagnetic cement for highly focalized thermotherapy.

    PubMed

    Portela, Ana; Vasconcelos, Mário; Branco, Rogério; Gartner, Fátima; Faria, Miguel; Cavalheiro, José

    2010-08-01

    Cancer treatment by local hyperthermia using a high-frequency electromagnetic field is an extensively studied subject. For this purpose, a ferrimagnetic cement (FC) was developed to be injected directly into the tumor. In this study, the injectability of the FC, its capability to generate heat when placed within a magnetic field, and its interaction with a modified simulated body fluid were determined using SEM/EDS and XRD. The biological response of the FC was assessed by intramuscular implantation in rats and histological analysis of the surrounding tissues. The results suggest that the FC can be injected directly into the tumor, that its temperature can be increased when exposed to a magnetic field, and that the surface of immersed samples quickly becomes coated with a precipitate, denoting ionic exchange with the surrounding medium. The histological analysis revealed a transient local inflammatory reaction similar to that of the control material, only slightly more abundant during the first weeks, with a gradual decrease over the implantation time. Based on these results, we conclude that the FC may be useful for highly focalized thermotherapy, with good potential for clinical use.

  5. Epidemiological monitoring for emerging infectious diseases

    NASA Astrophysics Data System (ADS)

    Greene, Marjorie

    2010-04-01

    The Homeland Security News Wire has been reporting on new ways to fight epidemics using digital tools such as iPhone, social networks, Wikipedia, and other Internet sites. Instant two-way communication now gives consumers the ability to complement official reports on emerging infectious diseases from health authorities. However, there is increasing concern that these communications networks could open the door to mass panic from unreliable or false reports. There is thus an urgent need to ensure that epidemiological monitoring for emerging infectious diseases gives health authorities the capability to identify, analyze, and report disease outbreaks in as timely and efficient a manner as possible. One of the dilemmas in the global dissemination of information on infectious diseases is the possibility that information overload will create inefficiencies as the volume of Internet-based surveillance information increases. What is needed is a filtering mechanism that will retrieve relevant information for further analysis by epidemiologists, laboratories, and other health organizations so they are not overwhelmed with irrelevant information and will be able to respond quickly. This paper introduces a self-organizing ontology that could be used as a filtering mechanism to increase relevance and allow rapid analysis of disease outbreaks as they evolve in real time.

  6. Determination of ribavirin in chicken muscle by quick, easy, cheap, effective, rugged and safe method and liquid chromatography-tandem mass spectrometry.

    PubMed

    Wu, Yin-Liang; Chen, Ruo-Xia; Zhu, Lie; Lv, Yan; Zhu, Yong; Zhao, Jian

    2016-02-15

    A new analytical method for the determination of ribavirin in chicken muscle using a QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) method and liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed and validated. Samples were extracted with acidified methanol (methanol:acetic acid, 99:1, v/v). The extract was further purified by the QuEChERS method using primary-secondary amine (PSA) and C18 sorbents. Finally, the extract was dried under nitrogen at 45°C and reconstituted in water. Separation was performed on a Hypercarb analytical column under gradient elution, with a mobile phase composed of water buffered with ammonium acetate (2.0 mM) and acetonitrile. The proposed method was validated according to European Commission Decision 2002/657/EC. The values of the decision limit (CCα) and the detection capability (CCβ) were 1.1 and 1.5 μg/kg, respectively. The mean recoveries of ribavirin ranged from 94.2% to 99.2%. The repeatability (expressed as coefficient of variation, CVr) of the method ranged from 4.5% to 4.9%, and the reproducibility (CVR) ranged from 4.8% to 5.4%. Validation demonstrated the method to be suitable for the determination of ribavirin in chicken muscle in conformity with current EU performance requirements. The total time required for the analysis of one sample, including sample preparation, was about 45 min. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Quick detection of Leifsonia xyli subsp. xyli by PCR and nucleotide sequence analysis of PCR amplicons from Chinese Leifsonia xyli subsp. xyli isolates

    USDA-ARS?s Scientific Manuscript database

    A quick polymerase chain reaction (PCR) assay was developed for the detection of Leifsonia xyli subsp. xyli (Lxx), the bacterial causal agent of ratoon stunting disease (RSD) of sugarcane, in crude juice samples from stalks. After removal of abiotic impurities and large molecular weight microorgani...

  8. The influence of education and income on responses to the QuickDASH questionnaire.

    PubMed

    Finsen, V

    2015-05-01

    We studied the influence of levels of income and education on QuickDASH scores. The scores were collected in a random sample of 1376 residents of Norway. The level of income was divided into four bands and the level of education into five bands. The mean QuickDASH score for both men and women fell with every increase in education and income level. For women the mean score was 30 for those with the shortest education and 9 for those with the longest (p < 0.001). The corresponding figures for men were 19 and 7 (p < 0.01). The women with the lowest level of income had a mean score of 23, compared with 8 for women with the highest income level (p < 0.001). For men the corresponding mean scores were 20 and 5 (p < 0.001). Analysis of variance showed that age alone accounted for 16% of the variability of the scores among women and 7% among men. When levels of education and income were added to the analysis, these three factors accounted for 21% of the variability among women and 13% among men. We conclude that socioeconomic factors significantly influence QuickDASH scores. © The Author(s) 2014.
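
    The variance-explained figures quoted above are incremental R² values: the fit is run with age alone, then with education and income added, and the gain in explained variance is attributed to the added factors. A sketch of that computation with ordinary least squares on synthetic data (illustrative values, not the study's dataset):

    ```python
    import numpy as np

    def r_squared(X, y):
        """Fraction of variance in y explained by a least-squares fit on X."""
        X = np.column_stack([np.ones(len(y)), X])  # add intercept column
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta
        return 1.0 - residuals.var() / y.var()

    rng = np.random.default_rng(1)
    n = 1000
    age = rng.uniform(20, 80, n)
    education = rng.integers(1, 6, n)   # five education bands
    income = rng.integers(1, 5, n)      # four income bands
    # Hypothetical score model: older, less educated, lower-income
    # respondents score higher, plus unexplained noise.
    score = 0.3 * age - 3.0 * education - 2.0 * income + rng.normal(0, 10, n)

    r2_age = r_squared(age, score)
    r2_all = r_squared(np.column_stack([age, education, income]), score)
    print(f"age alone: {r2_age:.2f}, age+education+income: {r2_all:.2f}")
    ```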

  9. WorldView-2 and the evolution of the DigitalGlobe remote sensing satellite constellation: introductory paper for the special session on WorldView-2

    NASA Astrophysics Data System (ADS)

    Anderson, Neal T.; Marchisio, Giovanni B.

    2012-06-01

    Over the last decade DigitalGlobe (DG) has built and launched a series of remote sensing satellites with steadily increasing capabilities: QuickBird, WorldView-1 (WV-1), and WorldView-2 (WV-2). Today, this constellation acquires over 2.5 million km2 of imagery on a daily basis. This paper presents the configuration and performance capabilities of each of these satellites, with emphasis on the unique spatial and spectral capabilities of WV-2. WV-2 employs high-precision star tracker and inertial measurement units to achieve a geolocation accuracy of 5 m Circular Error, 90% confidence (CE90). The native resolution of WV-2 is 0.5 m GSD in the panchromatic band and 2 m GSD in 8 multispectral bands. Four of the multispectral bands match those of the Landsat series of satellites; four new bands enable novel and expanded applications. We are rapidly establishing and refreshing a global database of very high resolution (VHR) 8-band multispectral imagery. Control moment gyroscopes (CMGs) on both WV-1 and WV-2 improve collection capacity and provide the agility to capture multi-angle sequences in rapid succession. These capabilities result in a rich combination of image features that can be exploited to develop enhanced monitoring solutions. Algorithms for interpretation and analysis can leverage: 1) broader and more continuous spectral coverage at 2 m resolution; 2) textural and morphological information from the 0.5 m panchromatic band; 3) ancillary information from stereo and multi-angle collects, including high precision digital elevation models; 4) frequent revisits and time-series collects; and 5) the global reference image archives. We introduce the topic of creative fusion of image attributes, as this provides a unifying theme for many of the papers in this WV-2 Special Session.

  10. Comparison of Quick-Set and mineral trioxide aggregate root-end fillings for the regeneration of apical tissues in dogs.

    PubMed

    Kohout, George D; He, Jianing; Primus, Carolyn M; Opperman, Lynne A; Woodmansey, Karl F

    2015-02-01

    Quick-Set (Avalon Biomed Inc, Bradenton, FL) is a calcium aluminosilicate cement that is a potential alternative to mineral trioxide aggregate (MTA) with greater acid resistance and faster setting. The purpose of this study was to compare the regeneration of apical tissues after root-end surgery when the apical tissues were exposed to Quick-Set or White ProRoot MTA (Dentsply Tulsa Dental Specialties, Tulsa, OK) by root-end resection. The root canals of 42 mandibular premolars in 7 beagle dogs were accessed, cleaned and shaped, and obturated with Quick-Set or white MTA. Osteotomies and root-end resections were performed immediately. The dogs were sacrificed at 90 days, and the teeth and surrounding tissues were removed and prepared for histologic analysis. The sections of the apical areas were scored for inflammation, new cementum formation, periodontal ligament formation, and bone quality. At 90 days, both materials supported some degree of cementum formation on the surface of the material, periodontal ligament regeneration, and excellent bone quality. The only significant difference was greater inflammation found in the Quick-Set group. Quick-Set and White ProRoot MTA had a similar effect on bone quality, cementum formation, and periodontal ligament formation after root-end surgery in dogs. Quick-Set was associated with greater inflammation. Copyright © 2015 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  11. Land cover mapping and change detection in urban watersheds using QuickBird high spatial resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Hester, David Barry

    The objective of this research was to develop methods for urban land cover analysis using QuickBird high spatial resolution satellite imagery. Such imagery has emerged as a rich commercially available remote sensing data source and has enjoyed high-profile broadcast news media and Internet applications, but methods of quantitative analysis have not been thoroughly explored. The research described here consists of three studies focused on the use of pan-sharpened 61-cm spatial resolution QuickBird imagery, the spatial resolution of which is the highest of any commercial satellite. In the first study, a per-pixel land cover classification method is developed for use with this imagery. This method utilizes a per-pixel classification approach to generate an accurate six-category high spatial resolution land cover map of a developing suburban area. The primary objective of the second study was to develop an accurate land cover change detection method for use with QuickBird land cover products. This work presents an efficient fuzzy framework for transforming map uncertainty into accurate and meaningful high spatial resolution land cover change analysis. The third study described here is an urban planning application of the high spatial resolution QuickBird-based land cover product developed in the first study. This work both meaningfully connects this exciting new data source to urban watershed management and makes an important empirical contribution to the study of suburban watersheds. Its analysis of residential roads and driveways as well as retail parking lots sheds valuable light on the impact of transportation-related land use on the suburban landscape. Broadly, these studies provide new methods for using state-of-the-art remote sensing data to inform land cover analysis and urban planning. These methods are widely adaptable and produce land cover products that are both meaningful and accurate. As additional high spatial resolution satellites are launched and the cost of high resolution imagery continues to decline, this research makes an important contribution to this exciting era in the science of remote sensing.

  12. Ballistic Puncture Self-Healing Polymeric Materials

    NASA Technical Reports Server (NTRS)

    Gordon, Keith L.; Siochi, Emilie J.; Yost, William T.; Bogert, Phil B.; Howell, Patricia A.; Cramer, K. Elliott; Burke, Eric R.

    2017-01-01

    Space exploration launch costs on the order of $10,000 per pound provide an incentive to seek ways to reduce structural mass while maintaining structural function to assure safety and reliability. Damage-tolerant structural systems provide a route to avoiding weight penalty while enhancing vehicle safety and reliability. Self-healing polymers capable of spontaneous puncture repair show promise to mitigate potentially catastrophic damage from events such as micrometeoroid penetration. Effective self-repair requires these materials to quickly heal following projectile penetration while retaining some structural function during the healing processes. Although there are materials known to possess this capability, they are typically not considered for structural applications. Current efforts use inexpensive experimental methods to inflict damage, after which analytical procedures are identified to verify that function is restored. Two candidate self-healing polymer materials for structural engineering systems are used to test these experimental methods.

  13. The ALBA spectroscopic LEEM-PEEM experimental station: layout and performance

    PubMed Central

    Aballe, Lucia; Foerster, Michael; Pellegrin, Eric; Nicolas, Josep; Ferrer, Salvador

    2015-01-01

    The spectroscopic LEEM-PEEM experimental station at the CIRCE helical undulator beamline, which started user operation at the ALBA Synchrotron Light Facility in 2012, is presented. This station, based on an Elmitec LEEM III microscope with electron imaging energy analyzer, permits surfaces to be imaged with chemical, structural and magnetic sensitivity down to a lateral spatial resolution better than 20 nm with X-ray excited photoelectrons and 10 nm in LEEM and UV-PEEM modes. Rotation around the surface normal and application of electric and (weak) magnetic fields are possible in the microscope chamber. In situ surface preparation capabilities include ion sputtering, high-temperature flashing, exposure to gases, and metal evaporation with quick evaporator exchange. Results from experiments in a variety of fields and imaging modes will be presented in order to illustrate the ALBA XPEEM capabilities. PMID:25931092

  14. Optical systems integrated modeling

    NASA Technical Reports Server (NTRS)

    Shannon, Robert R.; Laskin, Robert A.; Brewer, SI; Burrows, Chris; Epps, Harlan; Illingworth, Garth; Korsch, Dietrich; Levine, B. Martin; Mahajan, Vini; Rimmer, Chuck

    1992-01-01

    An integrated modeling capability that provides the tools by which entire optical systems and instruments can be simulated and optimized is a key technology development, applicable to all mission classes, especially astrophysics. Many of the future missions require optical systems that are physically much larger than anything flown before and yet must retain the characteristic sub-micron diffraction-limited wavefront accuracy of their smaller precursors. It is no longer feasible to follow the path of 'cut and test' development; the sheer scale of these systems precludes many of the older techniques that rely upon ground evaluation of full-size engineering units. The ability to accurately model (by computer) and optimize the entire flight system's integrated structural, thermal, and dynamic characteristics is essential. Two distinct integrated modeling capabilities are required: an initial design capability and a detailed design and optimization system. The content of an initial design package is shown. It would be a modular, workstation-based code which allows preliminary integrated system analysis and trade studies to be carried out quickly by a single engineer or a small design team. A simple concept for a detailed design and optimization system is shown. This is an interface architecture that allows efficient interchange of information among existing large, specialized optical, control, thermal, and structural design codes. The computing environment would be a network of large mainframe machines, and its users would be project-level design teams. More advanced concepts for detailed design systems would support interaction between modules and automated optimization of the entire system. Technology assessment and development plans for an integrated package for initial design, interface development for detailed optimization, validation, and modeling research are presented.

  15. Ultrasensitive Ambient Mass Spectrometric Analysis with a Pin-to-Capillary Flowing Atmospheric-Pressure Afterglow Source

    PubMed Central

    Shelley, Jacob T.; Wiley, Joshua S.; Hieftje, Gary M.

    2011-01-01

    The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the Flowing Atmospheric-Pressure Afterglow (FAPA). FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn, and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown. PMID:21627097

  16. Ultrasensitive ambient mass spectrometric analysis with a pin-to-capillary flowing atmospheric-pressure afterglow source.

    PubMed

    Shelley, Jacob T; Wiley, Joshua S; Hieftje, Gary M

    2011-07-15

    The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the flowing atmospheric-pressure afterglow (FAPA). The FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown.

  17. Optical Imaging and Radiometric Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.

    2010-01-01

    OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes, and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and friendly. OPTOOL is implemented in MATLAB with a Fourier optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digitally sampled versions of functions ranging from Zernike polynomials, to combinations of sinusoidal wave functions in 2D, to functions generated from a spatial-frequency power spectral distribution (PSD). It also can generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes target spectral energy distribution as a function of wavelength for polychromatic sources, detector pixel size, and the FPA's charge-diffusion modulation transfer function (MTF).
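    The core Fourier-optics PSF step described above can be sketched in a few lines (an illustrative sketch, not OPTOOL code: OPTOOL uses a direct-integration calculation for high spatial sampling, whereas this example uses a plain FFT, and the defocus-like OPD term is invented for the demonstration):

```python
import numpy as np

def psf_from_pupil(pupil, opd, wavelength):
    """PSF of an aberrated pupil: |FFT(pupil * exp(i*2*pi*OPD/wavelength))|^2,
    normalized to unit total energy."""
    field = pupil * np.exp(2j * np.pi * opd / wavelength)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    return psf / psf.sum()

n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
pupil = (x**2 + y**2 <= 1.0).astype(float)       # circular aperture
opd = 50e-9 * (2 * (x**2 + y**2) - 1) * pupil    # toy defocus-like wavefront error (m)
psf = psf_from_pupil(pupil, opd, wavelength=1e-6)
print(psf.shape)  # (128, 128)
```

Zero-padding the pupil array before the FFT is the usual way to refine the PSF sampling in this FFT formulation; direct integration avoids that trade-off at higher computational cost.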

  18. Evaluation of CFD Methods for Simulation of Two-Phase Boiling Flow Phenomena in a Helical Coil Steam Generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David; Shaver, Dillon; Liu, Yang

    The U.S. Department of Energy, Office of Nuclear Energy charges participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with the development of advanced modeling and simulation capabilities that can be used to address design, performance and safety challenges in the development and deployment of advanced reactor technology. NEAMS has established a high impact problem (HIP) team to demonstrate the applicability of these tools to identification and mitigation of sources of steam generator flow induced vibration (SGFIV). The SGFIV HIP team is working to evaluate vibration sources in an advanced helical coil steam generator using computational fluid dynamics (CFD) simulations of the turbulent primary coolant flow over the outside of the tubes and CFD simulations of the turbulent multiphase boiling secondary coolant flow inside the tubes, integrated with high resolution finite element method assessments of the tubes and their associated structural supports. This report summarizes the demonstration of a methodology for the multiphase boiling flow analysis inside the helical coil steam generator tube. A helical coil steam generator configuration has been defined based on the experiments completed by Politecnico di Milano in the SIET helical coil steam generator tube facility. Simulations of the defined problem have been completed using the Eulerian-Eulerian multi-fluid modeling capabilities of the commercial CFD code STAR-CCM+. Simulations suggest that the two phases will quickly stratify in the slightly inclined pipe of the helical coil steam generator. These results have been successfully benchmarked against both empirical correlations for pressure drop and simulations using an alternate CFD methodology, the dispersed-phase mixture modeling capabilities of the open source CFD code Nek5000.

  19. Syndromic surveillance: hospital emergency department participation during the Kentucky Derby Festival.

    PubMed

    Carrico, Ruth; Goss, Linda

    2005-01-01

    Electronic syndromic surveillance may have value in detecting emerging pathogens or a biological weapons release. Hospitals that have an agile process to evaluate chief complaints of patients seeking emergency care may be able to more quickly identify subtle changes in the community's health. An easily adaptable prototype system was developed to monitor emergency department patient visits during the Kentucky Derby Festival in Louisville, Kentucky, from April 16-May 14, 2002. Use of the system was continued during the same festival periods in 2003 and 2004. Twelve area hospitals in Louisville, Kentucky, participated in a prospective analysis of the chief symptoms of patients who sought care in the emergency department during the Kentucky Derby Festival during 2002. Six hospitals were classified as computer record groups (CRG) and used their existing computerized record capabilities. The other 6 hospitals used a personal digital assistant (PDA) with customized software (PDA group). Data were evaluated by the health department epidemiologist using SaTScan, a modified version of a cancer cluster detection program, to look for clusters of cases above baseline over time and by Zip code. All 12 hospitals were able to collect and provide data elements during the study period. The 6 CRG hospitals were able to perform daily data transmission; however, 3 CRG hospitals were unable to interpret their data because it was transmitted in pure text format. In contrast, data from all 6 PDA group hospitals were interpretable. Real-time data analysis was compared with post-event data, and it was found that the real-time evaluation correctly identified no unusual disease activity during the study period. The 12 hospitals participating in this study demonstrated that community-wide surveillance using computerized data was possible and that the 6 study hospitals using a PDA could quickly interpret emergency department patients' chief complaints. 
The emergency department chief complaints group could serve as a disease sentinel for the community.
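    The kind of exceedance screening such a system performs can be sketched as a Poisson tail test on daily counts by Zip code (illustrative only: SaTScan's space-time scan statistic is considerably more sophisticated, and the Zip codes, counts, and baselines below are hypothetical):

```python
import math
from collections import Counter

def poisson_tail(k, lam):
    """P(X >= k) for X ~ Poisson(lam), by summing the CDF terms directly."""
    p = math.exp(-lam)
    cdf, term = p, p
    for i in range(1, k):
        term *= lam / i
        cdf += term
    return 1.0 - cdf if k > 0 else 1.0

def flag_zip_codes(daily_counts, baseline, alpha=0.01):
    """Flag Zip codes whose count is improbably high under a Poisson baseline."""
    return [z for z, k in daily_counts.items()
            if poisson_tail(k, baseline.get(z, 1.0)) < alpha]

# Hypothetical day of chief-complaint counts ("respiratory") by Zip code.
counts = Counter({"40202": 3, "40204": 15, "40208": 2})
baseline = {"40202": 2.5, "40204": 3.0, "40208": 2.0}
print(flag_zip_codes(counts, baseline))  # ['40204']
```

A production system would also scan over time windows and spatial clusters of Zip codes, as SaTScan does, rather than testing each Zip code independently.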

  20. Adding tools to the open source toolbox: The Internet

    NASA Technical Reports Server (NTRS)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial databases. Internet tools such as e-mail and file transfer protocol (FTP) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as FTP, Gopher, the Wide Area Information Server (WAIS), and the World Wide Web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  1. An "artificial retina" processor for track reconstruction at the full LHC crossing rate

    NASA Astrophysics Data System (ADS)

    Abba, A.; Bedeschi, F.; Caponio, F.; Cenci, R.; Citterio, M.; Cusimano, A.; Fu, J.; Geraci, A.; Grizzuti, M.; Lusardi, N.; Marino, P.; Morello, M. J.; Neri, N.; Ninci, D.; Petruzzo, M.; Piucci, A.; Punzi, G.; Ristori, L.; Spinella, F.; Stracka, S.; Tonelli, D.; Walsh, J.

    2016-07-01

    We present the latest results of an R&D study for a specialized processor capable of reconstructing, in a silicon pixel detector, high-quality tracks from high-energy collision events at 40 MHz. The processor applies a highly parallel pattern-recognition algorithm inspired by the quick detection of edges in the mammalian visual cortex. After a detailed study of a real-detector application, demonstrating that online reconstruction of offline-quality tracks is feasible at 40 MHz with sub-microsecond latency, we are implementing a prototype using common high-bandwidth FPGA devices.

  2. A parallel algorithm for finding the shortest exit paths in mines

    NASA Astrophysics Data System (ADS)

    Jastrzab, Tomasz; Buchcik, Agata

    2017-11-01

    In the paper we study the problem of finding the shortest exit path in an underground mine in case of emergency. Since emergency situations, such as underground fires, can put the miners' lives at risk, the ability to quickly determine the safest exit path is crucial. We propose a parallel algorithm capable of finding the shortest path between the safe exit point and any other point in the mine. The algorithm is also able to take into account the characteristics of individual miners, to make the path determination more reliable.
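    The underlying single-source shortest-path computation can be sketched with Dijkstra's algorithm (a sequential sketch; the paper's contribution is a parallel algorithm, and the toy corridor network and weights below are hypothetical; per-miner characteristics could enter as adjusted edge weights):

```python
import heapq

def shortest_exit_paths(graph, exit_node):
    """Dijkstra from the safe exit point; returns shortest distance to every node.
    graph: {node: [(neighbor, traversal_cost), ...]}, undirected edges listed both ways."""
    dist = {exit_node: 0.0}
    heap = [(0.0, exit_node)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, ()):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy mine corridor network; weights could encode length scaled by a miner's speed.
mine = {
    "exit": [("A", 2.0), ("B", 5.0)],
    "A": [("exit", 2.0), ("B", 1.0), ("C", 4.0)],
    "B": [("exit", 5.0), ("A", 1.0), ("C", 1.0)],
    "C": [("A", 4.0), ("B", 1.0)],
}
print(shortest_exit_paths(mine, "exit"))  # {'exit': 0.0, 'A': 2.0, 'B': 3.0, 'C': 4.0}
```

Running the search from the exit rather than from each miner yields all miners' escape distances in one pass, which is why the single-source formulation fits this problem.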

  3. An "artificial retina" processor for track reconstruction at the full LHC crossing rate

    DOE PAGES

    Abba, A.; Bedeschi, F.; Caponio, F.; ...

    2015-10-23

    Here, we present the latest results of an R&D study for a specialized processor capable of reconstructing, in a silicon pixel detector, high-quality tracks from high-energy collision events at 40 MHz. The processor applies a highly parallel pattern-recognition algorithm inspired by the quick detection of edges in the mammalian visual cortex. After a detailed study of a real-detector application, demonstrating that online reconstruction of offline-quality tracks is feasible at 40 MHz with sub-microsecond latency, we are implementing a prototype using common high-bandwidth FPGA devices.

  4. Audit Report. Quick-Reaction Report on the Review of Defense Base Closure and Realignment Budget Data for Carswell, Barksdale, Dyess, Minot, and Tinker Air Force Bases

    DTIC Science & Technology

    1992-11-27

    for 10 construction projects for realigning Carswell AFB was not adequately documented as required by Air Force Regulation (AFR) 86-1, "Programming ... Engineering Programming, Standard Facility Requirements," paragraph 24-70, allows for a total of 25,200 square feet of space for the warehouse and ... cantonment area, and replaced an existing capability. The fact that a replacement wash rack was previously programmed does not alter this requirement

  5. Spatial Query for Planetary Data

    NASA Technical Reports Server (NTRS)

    Shams, Khawaja S.; Crockett, Thomas M.; Powell, Mark W.; Joswig, Joseph C.; Fox, Jason M.

    2011-01-01

    Science investigators need to quickly and effectively assess past observations of specific locations on a planetary surface. This innovation involves a location-based search technology that was adapted and applied to planetary science data to support a spatial query capability for mission operations software. High-performance location-based searching requires the use of spatial data structures for database organization. Spatial data structures are designed to organize datasets based on their coordinates in a way that is optimized for location-based retrieval. The particular spatial data structure that was adapted for planetary data search is the R+ tree.
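    The bounding-box pruning that makes such spatial structures fast can be sketched as follows (an illustrative sketch, not the mission software; a true R+ tree additionally keeps sibling bounding boxes non-overlapping and may split an object across leaves, which this toy tree does not implement):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    xmin: float; ymin: float; xmax: float; ymax: float
    def intersects(self, o):
        return (self.xmin <= o.xmax and o.xmin <= self.xmax and
                self.ymin <= o.ymax and o.ymin <= self.ymax)

@dataclass
class Node:
    bbox: Rect
    children: list  # inner nodes: [Node, ...]; leaves: [(Rect, payload), ...]
    leaf: bool

def query(node, window, hits):
    """Collect payloads whose rectangles intersect the search window,
    pruning any subtree whose bounding box misses the window entirely."""
    if not node.bbox.intersects(window):
        return hits
    if node.leaf:
        hits.extend(p for r, p in node.children if r.intersects(window))
    else:
        for child in node.children:
            query(child, window, hits)
    return hits

# Toy tree: two leaves holding observation footprints in lon/lat-like coordinates.
leaf1 = Node(Rect(0, 0, 5, 5),
             [(Rect(1, 1, 2, 2), "obs-1"), (Rect(4, 4, 5, 5), "obs-2")], True)
leaf2 = Node(Rect(6, 6, 9, 9), [(Rect(7, 7, 8, 8), "obs-3")], True)
root = Node(Rect(0, 0, 9, 9), [leaf1, leaf2], False)
print(query(root, Rect(0, 0, 3, 3), []))  # ['obs-1']
```

The payoff is that whole subtrees (here, leaf2) are skipped without inspecting their contents, which is what makes location-based retrieval scale to large observation archives.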

  6. Real-time ISEE data system

    NASA Technical Reports Server (NTRS)

    Tsurutani, B. T.; Baker, D. N.

    1979-01-01

    A real-time ISEE data system directed toward predicting geomagnetic substorms and storms is discussed. Such a system may allow up to 60+ minutes advance warning of magnetospheric substorms and up to 30 minute warnings of geomagnetic storms (and other disturbances) induced by high-speed streams and solar flares. The proposed system utilizes existing capabilities of several agencies (NASA, NOAA, USAF), and thereby minimizes costs. This same concept may be applicable to data from other spacecraft, and other NASA centers; thus, each individual experimenter can receive quick-look data in real time at his or her base institution.

  7. Combined Hydrologic (AGWA-KINEROS2) and Hydraulic (HEC2) Modeling for Post-Fire Runoff and Inundation Risk Assessment through a Set of Python Tools

    NASA Astrophysics Data System (ADS)

    Barlow, J. E.; Goodrich, D. C.; Guertin, D. P.; Burns, I. S.

    2016-12-01

    Wildfires in the Western United States can alter landscapes by removing vegetation and changing soil properties. These altered landscapes produce more runoff than pre-fire landscapes, which can lead to post-fire flooding that can damage infrastructure and impair natural resources. Resources, structures, historical artifacts, and other assets that could be impacted by increased runoff are considered values at risk. The Automated Geospatial Watershed Assessment tool (AGWA) allows users to quickly set up and execute the Kinematic Runoff and Erosion model (KINEROS2 or K2) in the ESRI ArcMap environment. The AGWA-K2 workflow leverages the visualization capabilities of GIS to facilitate rapid watershed assessments for post-fire planning efforts. High relative change in peak discharge, as simulated by K2, provides a visual and numeric indicator of those channels in the watershed that should receive more detailed analysis, especially if values at risk are within or near the channel. Modeling inundation extent along a channel would provide more specific guidance about risk. HEC-2 and HEC-RAS can be used for hydraulic modeling efforts at the reach and river-system scale. These models have been used to address flood boundaries and, accordingly, flood risk. However, data collection and organization for hydraulic models can be time consuming, and therefore a combined hydrologic-hydraulic modeling approach is not often employed for rapid assessments. A simplified approach could streamline this process and provide managers with a simple workflow and tool to perform a quick risk assessment for a single reach. By focusing on a single reach highlighted by large relative change in peak discharge, data collection efforts can be minimized and the hydraulic computations can be performed to supplement risk analysis.
The incorporation of hydraulic analysis through a suite of Python tools (as outlined by HEC-2) with AGWA-K2 will allow more rapid applications of combined hydrologic-hydraulic modeling. This combined modeling approach is built in the ESRI ArcGIS application to enable rapid model preparation, execution and result visualization for risk assessment in post-fire environments.
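    The relative-change screening step can be sketched as a simple threshold test over simulated peak discharges (illustrative only: the reach names, discharge values, and 100% threshold below are hypothetical, not outputs of AGWA or K2):

```python
def flag_reaches(pre_peaks, post_peaks, threshold=1.0):
    """Flag channel reaches whose simulated peak discharge rose sharply post-fire.
    Relative change = (post - pre) / pre; threshold=1.0 flags a >100% increase."""
    flagged = []
    for reach, pre in pre_peaks.items():
        post = post_peaks.get(reach, pre)
        if pre > 0 and (post - pre) / pre > threshold:
            flagged.append(reach)
    return flagged

# Hypothetical KINEROS2-style peak discharges (m^3/s) per channel reach.
pre  = {"reach-1": 2.0, "reach-2": 0.8, "reach-3": 5.0}
post = {"reach-1": 2.4, "reach-2": 3.0, "reach-3": 14.0}
print(flag_reaches(pre, post))  # ['reach-2', 'reach-3']
```

Flagged reaches would then be handed to the hydraulic (HEC-2/HEC-RAS-style) step for inundation-extent modeling, keeping data collection limited to the reaches that matter.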

  8. RL10 Engine Ability to Transition from Atlas to Shuttle/Centaur Program

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    2015-01-01

    A key launch vehicle design feature is the ability to take advantage of new technologies while minimizing expensive and time-consuming development and test programs. With successful space launch experiences and the unique features of both the National Aeronautics and Space Administration (NASA) Space Transportation System (Space Shuttle) and Atlas/Centaur programs, it became attractive to leverage these capabilities. The Shuttle/Centaur Program was created to transition the existing Centaur vehicle to be launched from the Space Shuttle cargo bay. This provided the ability to launch heavier and larger payloads, and take advantage of new, unique launch operational capabilities. A successful Shuttle/Centaur Program required the Centaur main propulsion system to quickly accommodate the new operating conditions for two new Shuttle/Centaur configurations and evolve to function in the crewed Space Shuttle environment. This paper describes the transition of the Atlas/Centaur RL10 engine to the Shuttle/Centaur configurations; shows the unique versatility and capability of the engine; and highlights the importance of ground testing. Propulsion testing outcomes emphasize the value-added benefits of testing heritage hardware and the significant impact to existing and future programs.

  9. RL10 Engine Ability to Transition from Atlas to Shuttle/Centaur Program

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    2014-01-01

    A key launch vehicle design feature is the ability to take advantage of new technologies while minimizing expensive and time-consuming development and test programs. With successful space launch experiences and the unique features of both the National Aeronautics and Space Administration (NASA) Space Transportation System (Space Shuttle) and Atlas/Centaur programs, it became attractive to leverage these capabilities. The Shuttle/Centaur Program was created to transition the existing Centaur vehicle to be launched from the Space Shuttle cargo bay. This provided the ability to launch heavier and larger payloads, and take advantage of new, unique launch operational capabilities. A successful Shuttle/Centaur Program required the Centaur main propulsion system to quickly accommodate the new operating conditions for two new Shuttle/Centaur configurations and evolve to function in the crewed Space Shuttle environment. This paper describes the transition of the Atlas/Centaur RL10 engine to the Shuttle/Centaur configurations; shows the unique versatility and capability of the engine; and highlights the importance of ground testing. Propulsion testing outcomes emphasize the value-added benefits of testing heritage hardware and the significant impact to existing and future programs.

  10. The IRIS-GUS Shuttle Borne Upper Stage System

    NASA Technical Reports Server (NTRS)

    Tooley, Craig; Houghton, Martin; Bussolino, Luigi; Connors, Paul; Broudeur, Steve (Technical Monitor)

    2002-01-01

    This paper describes the Italian Research Interim Stage - Gyroscopic Upper Stage (IRIS-GUS) upper stage system that will be used to launch NASA's Triana Observatory from the Space Shuttle. Triana is a pathfinder Earth science mission being executed on a rapid schedule and a small budget; therefore, the mission's upper stage solution had to be a system that could be fielded quickly at relatively low cost and risk. Building the IRIS-GUS system was necessary because NASA lost the capability to launch moderately sized upper stage missions from the Space Shuttle when the PAM-D system was retired. The IRIS-GUS system restores this capability. The resulting system is a hybrid which mates the existing, flight-proven IRIS (Italian Research Interim Stage) airborne support equipment to a new upper stage, the Gyroscopic Upper Stage (GUS), built by the GSFC for Triana. Although a new system, the GUS exploits flight-proven hardware and design approaches in most subsystems, in some cases implementing proven design approaches with state-of-the-art electronics. This paper describes the IRIS-GUS upper stage system elements, performance capabilities, and payload interfaces.

  11. Comparison of Quick Lactose Intolerance Test in duodenal biopsies of dyspeptic patients with single nucleotide polymorphism LCT-13910C>T associated with primary hypolactasia/lactase-persistence.

    PubMed

    Mattar, Rejane; Basile-Filho, Anibal; Kemp, Rafael; Santos, José Sebastião dos

    2013-01-01

    To analyze the usefulness of the Quick Lactose Intolerance Test relative to the genetic test based on LCT-13910C>T genotypes, previously validated for clinical practice, for primary hypolactasia/lactase-persistence diagnosis. Thirty-two dyspeptic patients who underwent upper gastrointestinal endoscopy entered the study. Two postbulbar duodenal biopsies were taken for the Quick Test, and a gastric antral biopsy for DNA extraction and LCT-13910C>T polymorphism analysis. DNA was also extracted from biopsies that had been used in the Quick Test and kept frozen until extraction. Nine patients with a lactase-persistence genotype (LCT-13910CT or LCT-13910TT) had normolactasia; eleven patients with the hypolactasia genotype (LCT-13910CC) had severe hypolactasia; and among twelve with mild hypolactasia, all but one (who had the LCT-13910CT genotype) had the hypolactasia genotype. The agreement between the genetic test and the Quick Test was high (p<0.0001; Kappa Index 0.92). Most of the patients who reported symptoms with ingestion of lactose-containing food had severe hypolactasia (p<0.05). Amplification with a good-quality PCR product was also obtained with DNA extracted from biopsies previously used in the Quick Test; thus, in future studies, separate antral gastric biopsies for the genetic test would be unnecessary. The Quick Test is highly sensitive and specific for hypolactasia diagnosis and identified those patients with symptoms of lactose intolerance.
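    The reported agreement statistic is Cohen's kappa, which can be computed from paired test calls as follows (a sketch using hypothetical pairs, not the study's raw data, so the resulting kappa differs from the reported 0.92):

```python
def cohens_kappa(pairs):
    """Cohen's kappa for agreement between two tests, from (label_a, label_b) pairs:
    kappa = (p_observed - p_chance) / (1 - p_chance)."""
    n = len(pairs)
    labels = sorted({l for p in pairs for l in p})
    po = sum(a == b for a, b in pairs) / n                         # observed agreement
    pe = sum((sum(a == l for a, _ in pairs) / n) *
             (sum(b == l for _, b in pairs) / n) for l in labels)  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical genotype-vs-Quick-Test calls (invented counts for illustration).
pairs = ([("persist", "persist")] * 9 +
         [("hypo", "hypo")] * 20 +
         [("hypo", "persist")] * 3)
print(round(cohens_kappa(pairs), 2))  # 0.79
```

Kappa corrects the raw agreement rate for the agreement expected by chance from the two tests' marginal frequencies, which is why it is preferred over simple percent agreement in diagnostic comparisons like this one.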

  12. Understanding climatological, instantaneous and reference VTEC maps, its variability, its relation to STEC and its assimilation by VTEC models

    NASA Astrophysics Data System (ADS)

    Orus, R.; Prieto-Cerdeira, R.

    2012-12-01

    As the next solar maximum, forecast for late 2013, approaches, it is a good opportunity to study ionospheric behaviour under such conditions and how it can be estimated and corrected by existing climatological models (e.g. NeQuick, the International Reference Ionosphere (IRI)) as well as by GNSS-driven models such as Klobuchar, NeQuick Galileo, SBAS MOPS (EGNOS and WAAS corrections), and near-real-time Global Ionospheric Maps (GIMs) or regional maps computed by different institutions. In this framework, technology advances make it possible to increase the computational and radio-frequency capabilities of low-cost receivers embedded in handheld devices (such as mobile phones, tablets, trekking watches, cameras, etc.). This may enable the active use of received ionospheric data or correction parameters from different data sources. The study is centred on understanding the ionosphere, focusing on its impact on the position error of low-cost single-frequency receivers. This study tests optimal ways to take advantage of a large amount of real-time or near-real-time ionospheric information, and ways to combine various corrections in order to reach a better navigation solution. In this context, real-time vTEC estimates coming from EGNOS or WAAS, or near-real-time GIMs, are used to feed the standard GPS single-frequency ionospheric correction model (Klobuchar) and obtain enhanced ionospheric corrections with minor changes to the navigation software. This is done by using a Taylor expansion over the 8 coefficients sent by GPS. Moreover, the same datasets are assimilated into NeQuick, both for broadcast coefficients and for grid assimilation. As a side product, electron density profiles could be estimated in near real time with data assimilated from different ionospheric sources. Finally, the ionospheric delay estimation for multi-constellation receivers could benefit from a common, more accurate ionospheric model able to reduce the position error due to the ionosphere. Therefore, a performance study of the different models for GNSS navigation will be presented under different ionospheric conditions and using different sources for the model adjustment, keeping the real-time capability of the receivers.
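    The vTEC-STEC relation discussed above is commonly handled with a thin-shell obliquity (mapping) factor together with the first-order group-delay formula 40.3 * TEC / f^2; a minimal sketch follows (the shell height, TEC value, and elevation angle are assumptions for illustration, not values from this study):

```python
import math

RE = 6371e3      # mean Earth radius (m)
H_SHELL = 350e3  # assumed thin-shell ionosphere height (m)

def slant_tec(vtec, elevation_deg):
    """Map vertical TEC to slant TEC with the single-layer obliquity factor
    1 / sqrt(1 - (Re/(Re+h) * sin(z))^2), z = zenith angle at the receiver."""
    z = math.radians(90.0 - elevation_deg)
    sin_zp = RE / (RE + H_SHELL) * math.sin(z)  # sine of pierce-point zenith angle
    return vtec / math.sqrt(1.0 - sin_zp**2)

def iono_delay_m(stec_tecu, freq_hz):
    """First-order ionospheric group delay in meters: 40.3 * TEC / f^2,
    with TEC converted from TECU (1 TECU = 1e16 el/m^2)."""
    return 40.3 * (stec_tecu * 1e16) / freq_hz**2

stec = slant_tec(vtec=30.0, elevation_deg=30.0)  # 30 TECU vertical, low elevation
print(round(iono_delay_m(stec, 1.57542e9), 2))   # delay at GPS L1, in meters
```

The 1/f^2 dependence is also why a single vTEC product can serve multi-constellation receivers: the same slant TEC rescales to each system's carrier frequency.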

  13. Reduction and analysis of data collected during the electromagnetic tornado experiment

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.; Bradbury, J.

    1975-01-01

    Progress is reviewed on the reduction and analysis of tornado data collected on analog tape. The strip-chart recording of seven tracks from all available analog data for quick-look analysis is emphasized.

  14. Choosing a New Telephone System for Your Medical Practice.

    PubMed

    Metherell, Brian

    2016-01-01

    E-mail may rule the world in other types of businesses, but for medical practices, the telephone remains the primary mode of communication with patients, specialists, and pharmacies. From making appointments to calling in prescriptions, telephones are essential to patient care. With technology changing very quickly and new capabilities coming into the medical practice, such as telemedicine and Skype, you need to know your options when choosing a new telephone system. The possibilities include on-site, cloud, and hybrid networked solutions. A wide variety of features and capabilities are available, from dozens of vendors. Of course, no matter what telephone solution you choose, you must meet regulatory compliance, particularly HIPAA, and Payment Card Industry Data Security Standard if you take credit cards. And it has to be affordable, reliable, and long lasting. This article explores what medical practices need to know when choosing a new business telephone system in order to find the right solutions for their businesses.

  15. Application of information technology to the National Launch System

    NASA Technical Reports Server (NTRS)

    Mauldin, W. T.; Smith, Carolyn L.; Monk, Jan C.; Davis, Steve; Smith, Marty E.

    1992-01-01

    The approach to the development of the Unified Information System (UNIS), which is to provide in a timely manner all the information required to manage, design, manufacture, integrate, test, launch, operate, and support the National Launch System (NLS), is described, along with current and planned capabilities. STESYM, a model of the Space Transportation Main Engine (STME) development program, comprises a collection of data models that can be grouped into two primary models: the Engine Infrastructure Model (ENGIM) and the Engine Integrated Cost Model (ENGICOM). ENGIM is an end-to-end model of the infrastructure needed to perform the fabrication, assembly, and testing of the STME and its components. Together, UNIS and STESYM are to provide NLS managers and engineers with the ability to access various types and files of data quickly and to use that data to assess the capabilities of the STME program.

  16. Concerns over modeling and warning capabilities in wake of Tohoku Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2011-04-01

    Improved earthquake models, better tsunami modeling and warning capabilities, and a review of nuclear power plant safety are all greatly needed following the 11 March Tohoku earthquake and tsunami, according to scientists at the European Geosciences Union's (EGU) General Assembly, held 3-8 April in Vienna, Austria. EGU quickly organized a morning session of oral presentations and an afternoon panel discussion less than 1 month after the earthquake and the tsunami and the resulting crisis at Japan's Fukushima nuclear power plant, which has now been identified as having reached the same level of severity as the 1986 Chernobyl disaster. Many of the scientists at the EGU sessions expressed concern about the inability to have anticipated the size of the earthquake and the resulting tsunami, which appears likely to have caused most of the fatalities and damage, including damage to the nuclear plant.

  17. Development of a Three-Dimensional, Unstructured Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high-enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g., woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.

  18. Quick, easy, cheap, effective, rugged, and safe sample preparation approach for pesticide residue analysis using traditional detectors in chromatography: A review.

    PubMed

    Rahman, Md Musfiqur; Abd El-Aty, A M; Kim, Sung-Woo; Shin, Sung Chul; Shin, Ho-Chul; Shim, Jae-Han

    2017-01-01

    In pesticide residue analysis, relatively low-sensitivity traditional detectors, such as UV, diode array, electron-capture, flame photometric, and nitrogen-phosphorus detectors, have been used following classical sample preparation (liquid-liquid extraction and open glass column cleanup); however, this extraction method is laborious, time-consuming, and requires large volumes of toxic organic solvents. A quick, easy, cheap, effective, rugged, and safe method was introduced in 2003 and coupled with selective and sensitive mass detectors to overcome the aforementioned drawbacks. Compared to traditional detectors, mass spectrometers are still far more expensive and not available in most modestly equipped laboratories, owing to maintenance and cost-related issues. Where they are unavailable, traditional detectors are still being used for analysis of residues in agricultural commodities. It is widely known that the quick, easy, cheap, effective, rugged, and safe method is incompatible with conventional detectors owing to matrix complexity and low sensitivity. Therefore, modifications using column/cartridge-based solid-phase extraction instead of dispersive solid-phase extraction for cleanup have been applied in most cases to compensate and enable the adaptation of the extraction method to conventional detectors. In gas chromatography, a matrix enhancement effect for some analytes has been observed, which lowers the limit of detection and therefore enables gas chromatography to be compatible with the quick, easy, cheap, effective, rugged, and safe extraction method. For liquid chromatography with a UV detector, a combination of column/cartridge-based solid-phase extraction and dispersive solid-phase extraction was found to reduce matrix interference and increase sensitivity.
A suitable double-layer column/cartridge-based solid-phase extraction might be the perfect solution, instead of a time-consuming combination of column/cartridge-based solid-phase extraction and dispersive solid-phase extraction. Therefore, replacing dispersive solid-phase extraction with column/cartridge-based solid-phase extraction in the cleanup step can make the quick, easy, cheap, effective, rugged, and safe extraction method compatible with traditional detectors for more sensitive, effective, and green analysis. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Cell phones as imaging sensors

    NASA Astrophysics Data System (ADS)

    Bhatti, Nina; Baker, Harlyn; Marguier, Joanna; Berclaz, Jérôme; Süsstrunk, Sabine

    2010-04-01

    Camera phones are ubiquitous, and consumers have been adopting them faster than any other technology in modern history. When connected to a network, though, they are capable of more than just picture taking: Suddenly, they gain access to the power of the cloud. We exploit this capability by providing a series of image-based personal advisory services. These are designed to work with any handset over any cellular carrier using commonly available Multimedia Messaging Service (MMS) and Short Message Service (SMS) features. Targeted at the unsophisticated consumer, these applications must be quick and easy to use, not requiring download capabilities or preplanning. Thus, all application processing occurs in the back-end system (i.e., as a cloud service) and not on the handset itself. Presenting an image to an advisory service in the cloud, a user receives information that can be acted upon immediately. Two of our examples involve color assessment - selecting cosmetics and home décor paint palettes; the third provides the ability to extract text from a scene. In the case of the color imaging applications, we have shown that our service rivals the advice quality of experts. The result of this capability is a new paradigm for mobile interactions - image-based information services exploiting the ubiquity of camera phones.

  20. Short-term depression and transient memory in sensory cortex.

    PubMed

    Gillary, Grant; Heydt, Rüdiger von der; Niebur, Ernst

    2017-12-01

    Persistent neuronal activity is usually studied in the context of short-term memory localized in central cortical areas. Recent studies show that early sensory areas also can have persistent representations of stimuli which emerge quickly (over tens of milliseconds) and decay slowly (over seconds). Traditional positive feedback models cannot explain sensory persistence for at least two reasons: (i) They show attractor dynamics, with transient perturbations resulting in a quasi-permanent change of system state, whereas sensory systems return to the original state after a transient. (ii) As we show, those positive feedback models which decay to baseline lose their persistence when their recurrent connections are subject to short-term depression, a common property of excitatory connections in early sensory areas. Dual time constant network behavior has also been implemented by nonlinear afferents producing a large transient input followed by much smaller steady state input. We show that such networks require unphysiologically large onset transients to produce the rise and decay observed in sensory areas. Our study explores how memory and persistence can be implemented in another model class, derivative feedback networks. We show that these networks can operate with two vastly different time courses, changing their state quickly when new information is coming in but retaining it for a long time, and that these capabilities are robust to short-term depression. Specifically, derivative feedback networks with short-term depression that acts differentially on positive and negative feedback projections are capable of dynamically changing their time constant, thus allowing fast onset and slow decay of responses without requiring unrealistically large input transients.

  1. TESS Data Processing and Quick-look Pipeline

    NASA Astrophysics Data System (ADS)

    Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office

    2018-01-01

    We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.

  2. Exploring the Role of Ad Hoc Grassroots Organizations Providing Humanitarian Aid on Lesvos, Greece.

    PubMed

    Kitching, George Tjensvoll; J Haavik, Hanne; Tandstad, Birgit J; Zaman, Muhammad; Darj, Elisabeth

    2016-11-17

    Syrian refugees displaced into Turkey have attempted high-risk sea migrations to reach safer destinations in Europe, most often initially arriving on the Greek island of Lesvos. These refugees were often in need of basic humanitarian assistance that has been provided in part by a new category of ad hoc grassroots organizations (AHGOs). The aim of this study was to understand the internal and external operations of these AHGOs and their role on Lesvos. The experiences of AHGOs were investigated through a qualitative research design utilizing semi-structured interviews with organization leaders and spokespersons. AHGOs identified through media and social media sources as new Lesvos-specific organizations were purposively invited to complete an interview over phone, Skype or email. Data analysis of the transcribed interviews was performed by Systematic Text Condensation. Forty-one organizations were contacted and 13 interviews were conducted. Most organizations were formed in autumn 2015, responding to the greater influx of refugees and migrants at that time, and reported an absence of professional humanitarian agencies providing aid on Lesvos. Three categories emerged from the material: features of organizations, features of volunteers, and evolution of AHGOs. The organizations perceived themselves capable of evaluating needs, mobilizing resources and funding, and providing a quick response. The volunteers came with limited humanitarian experience and from a wide variety of nationalities and professional backgrounds, and the organizations developed while on Lesvos. Knowledge from our findings of AHGOs' response to this complex disaster on Lesvos could be utilized in future catastrophes. We conclude that AHGOs may prove effective at providing humanitarian aid in a surge response when international non-governmental organizations are unable to respond quickly.
In future complex disasters, AHGOs should be recognized as new humanitarian actors, and conditions should be made favourable for their operations.

  3. Estimation of Channel-Forming Discharge and Large-Event Geomorphic Response Using HEC-RAS

    NASA Astrophysics Data System (ADS)

    Hamilton, P.; Strom, K.; Hosseiny, S. M. H.

    2015-12-01

    The goal of the present work was to consider the functionality and applicability of HEC-RAS sediment transport simulations in two situations. The first was as a mode for obtaining quick estimates of the effective discharge, one measure of channel-forming discharge, and the second was as a mode to quickly estimate sediment transport and the commensurate potential erosion and deposition during large flood events. Though there are many other sediment transport and morphodynamic models available, e.g., CCHE1D and Nays2DH, we were interested in using HEC-RAS since this is the model of choice for many regulatory bodies, e.g., FEMA, cities, and counties. This makes using the sediment transport capability of HEC-RAS a natural extension of models that already exist and are well calibrated. In first examining the utility of these models, we wanted to estimate the effective discharge of streams. Effective discharge is one way of defining the channel-forming discharge for a stream and is therefore an important parameter in natural channel design and restoration efforts. By running a range of floods, one can easily obtain an estimate of the recurrence interval most responsible for moving the majority of sediment over a long time period. Results were compared to data collected within our research group on the Brazos River (TX). Effective discharge is an important estimate, particularly for understanding the equilibrium channel condition. Nevertheless, individual large floods can be catastrophic, and understanding their potential effects is desirable. Finally, we performed a sensitivity analysis to better understand the underlying assumptions of the various sediment transport model options and how they might affect the outcome of the aforementioned computations.
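The effective-discharge estimate described above reduces to a standard recipe: bin the flow record, weight each bin's frequency by the sediment transport rate at that discharge, and take the bin that moves the most sediment over the long term. A minimal numpy sketch, with a synthetic flow record and an assumed power-law sediment rating Qs = a*Q^b (both the record and the rating coefficients are illustrative, not HEC-RAS output):

```python
import numpy as np

def effective_discharge(q, n_bins=25, a=0.01, b=1.5):
    """Effective discharge: the flow-bin midpoint transporting the most sediment.

    q    : array of daily discharges (m^3/s)
    a, b : coefficients of an assumed sediment rating Qs = a * Q**b
    """
    counts, edges = np.histogram(q, bins=n_bins)   # flow-frequency histogram
    mids = 0.5 * (edges[:-1] + edges[1:])          # bin mid-point discharges
    load = counts * a * mids**b                    # frequency x transport rate
    return mids[np.argmax(load)]

rng = np.random.default_rng(0)
q = rng.lognormal(mean=3.0, sigma=0.8, size=3650)  # synthetic 10-yr daily record
q_eff = effective_discharge(q)
```

Because the transport rating is nonlinear (b > 1), the maximizing bin typically sits above the median flow but well below the largest flood, which is exactly why effective discharge and large-event response are treated as separate questions in the abstract.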

  4. An efficient near infrared spectroscopy based on aquaphotomics technique for rapid determining the level of Cadmium in aqueous solution

    NASA Astrophysics Data System (ADS)

    Putra, Alfian; Vassileva, Maria; Santo, Ryoko; Tsenkova, Roumina

    2017-06-01

    Cadmium (Cd) is a common industrial pollutant with a long biological half-life, which makes it a cumulative toxicant. Near-infrared spectroscopy has been successfully used for quick and accurate assessment of Cd content in agricultural materials, but the development of a quick detection method for ground and drinking water samples is of equal importance for pollution monitoring. Metals have no absorbance in the NIR spectral range, thus the methods developed so far have focused on detection of metal-organic complexes. This study focuses on the use of the aquaphotomics technique to measure Cd in aqueous solutions by analyzing the changes in water spectra that occur due to water-metal interaction. Measurements were performed with Cd (II) in 0.1 M HNO3, in the 680-1090 nm (water second and third overtones) and 1110-1800 nm (water first overtone) spectral regions, and were subjected to partial least-squares regression analysis. It was found that Cd concentrations from 1 mg L-1 to 10 mg L-1 could be predicted by this model with an average prediction correlation coefficient of 0.897. The model was tested under perturbations with temperature and with other metals present in the solution. The regression coefficient showed consistent peaks at 728, 752, 770, 780, 1362, 1430, 1444, 1472/1474 and 1484 nm under various perturbations, indicating that the metal influences the water spectra. The residual predictive deviation (RPD) values were greater than 2, indicating that the model is appropriate for practical use. The results suggest that this newly proposed approach is capable of detecting metal ions in a much simpler, rapid and reliable way.
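The partial least-squares step used here can be illustrated with a toy one-component PLS1 fit on synthetic spectra; the Gaussian "water band" shape, noise level, and function name below are illustrative assumptions, not the authors' data or method details:

```python
import numpy as np

def pls1_fit(X, y):
    """Single-component PLS1 (NIPALS): regression vector b with y ~ Xc @ b."""
    Xc, yc = X - X.mean(0), y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # weight vector (direction of max covariance)
    t = Xc @ w                      # scores
    q = (yc @ t) / (t @ t)          # y-loading (least-squares on scores)
    return w * q, X.mean(0), y.mean()

rng = np.random.default_rng(1)
conc = rng.uniform(1, 10, 40)                       # Cd mg/L, synthetic samples
wl = np.linspace(680, 1090, 50)                     # wavelength axis (nm)
band = np.exp(-0.5 * ((wl - 770) / 15) ** 2)        # assumed water-band shape
X = np.outer(conc, 0.01 * band) \
    + 0.0005 * rng.standard_normal((40, 50))        # spectra = signal + noise
b, xm, ym = pls1_fit(X, conc)
pred = (X - xm) @ b + ym
r = np.corrcoef(pred, conc)[0, 1]                   # prediction correlation
```

In the paper the model is multi-component and validated under temperature and co-ion perturbations; the sketch only shows the mechanics of extracting a concentration axis from small water-band changes.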

  5. Quantification of rapid environmental redox processes with quick-scanning x-ray absorption spectroscopy (Q-XAS)

    PubMed Central

    Ginder-Vogel, Matthew; Landrot, Gautier; Fischel, Jason S.; Sparks, Donald L.

    2009-01-01

    Quantification of the initial rates of environmental reactions at the mineral/water interface is a fundamental prerequisite to determining reaction mechanisms and contaminant transport modeling and predicting environmental risk. Until recently, experimental techniques with adequate time resolution and elemental sensitivity to measure initial rates of the wide variety of environmental reactions were quite limited. Techniques such as electron paramagnetic resonance and Fourier transform infrared spectroscopies suffer from limited elemental specificity and poor sensitivity to inorganic elements, respectively. Ex situ analysis of batch and stirred-flow systems provides high elemental sensitivity; however, their time resolution is inadequate to characterize rapid environmental reactions. Here we apply quick-scanning x-ray absorption spectroscopy (Q-XAS), at sub-second time-scales, to measure the initial oxidation rate of As(III) to As(V) by hydrous manganese(IV) oxide. Using Q-XAS, As(III) and As(V) concentrations were determined every 0.98 s in batch reactions. The initial apparent As(III) depletion rate constants (t < 30 s) measured with Q-XAS are nearly twice as large as rate constants measured with traditional analytical techniques. Our results demonstrate the importance of developing analytical techniques capable of analyzing environmental reactions on the same time scale as they occur. Given the high sensitivity, elemental specificity, and time resolution of Q-XAS, it has many potential applications. They could include measuring not only redox reactions but also dissolution/precipitation reactions, such as the formation and/or reductive dissolution of Fe(III) (hydr)oxides, solid-phase transformations (i.e., formation of layered-double hydroxide minerals), or almost any other reaction occurring in aqueous media that can be measured using x-ray absorption spectroscopy. PMID:19805269
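An apparent first-order depletion rate constant of the kind reported here is the negated slope of ln[As(III)] versus time over the initial-rate window. A minimal sketch on noise-free synthetic data sampled at the 0.98 s interval quoted in the abstract (the concentrations and the rate constant are made up for illustration):

```python
import numpy as np

# Synthetic As(III) depletion, C(t) = C0 * exp(-k t), sampled every 0.98 s
k_true = 0.05                          # apparent rate constant (1/s), illustrative
t = np.arange(0, 30, 0.98)             # the initial-rate window (t < 30 s)
c = 100.0 * np.exp(-k_true * t)        # As(III) concentration (arbitrary units)

# First-order fit: the slope of ln C versus t is -k
slope, intercept = np.polyfit(t, np.log(c), 1)
k_fit = -slope
```

The point the abstract makes is that with ~1 s sampling this window contains ~30 points, whereas ex situ batch sampling might capture only a handful, biasing the fitted initial rate low.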

  6. The Eclipsing Binary On-Line Atlas (EBOLA)

    NASA Astrophysics Data System (ADS)

    Bradstreet, D. H.; Steelman, D. P.; Sanders, S. J.; Hargis, J. R.

    2004-05-01

    In conjunction with the upcoming release of Binary Maker 3.0, an extensive on-line database of eclipsing binaries is being made available. The purposes of the atlas are to: (1) allow quick and easy access to information on published eclipsing binaries; (2) amass a consistent database of light and radial velocity curve solutions to aid in solving new systems; (3) provide invaluable querying capabilities on all of the parameters of the systems so that informative research can be quickly accomplished on a multitude of published results; (4) aid observers in establishing new observing programs based upon stars needing new light and/or radial velocity curves; (5) encourage workers to submit their published results so that others may have easy access to their work; and (6) provide a vast but easily accessible storehouse of information on eclipsing binaries to accelerate the process of understanding analysis techniques and current work in the field. The database will eventually consist of all published eclipsing binaries with light curve solutions. The following information and data will be supplied whenever available for each binary: original light curves in all bandpasses, original radial velocity observations, light curve parameters, RA and Dec, V-magnitudes, spectral types, color indices, periods, binary type, 3D representation of the system near quadrature, plots of the original light curves and synthetic models, plots of the radial velocity observations with theoretical models, and Binary Maker 3.0 data files (parameter, light curve, radial velocity). The pertinent references for each star are also given with hyperlinks directly to the papers via the NASA Abstract website for downloading, if available. In addition, the Atlas has extensive searching options so that workers can specifically search for binaries with specific characteristics. The website has more than 150 systems already uploaded. The URL for the site is http://ebola.eastern.edu/.

  7. A Quick and Easy Simplification of Benzocaine's NMR Spectrum

    NASA Astrophysics Data System (ADS)

    Carpenter, Suzanne R.; Wallace, Richard H.

    2006-04-01

    The preparation of benzocaine is a common experiment used in sophomore-level organic chemistry. Its straightforward procedure and predictable good yields make it ideal for the beginning organic student. Analysis of the product via NMR spectroscopy, however, can be confusing to the novice interpreter. An inexpensive, quick, and effective method for simplifying the NMR spectrum is reported. The method results in a spectrum that is cleanly integrated and more easily interpreted.

  8. General MACOS Interface for Modeling and Analysis for Controlled Optical Systems

    NASA Technical Reports Server (NTRS)

    Sigrist, Norbert; Basinger, Scott A.; Redding, David C.

    2012-01-01

    The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.

  9. Leveraging Modeling Approaches: Reaction Networks and Rules

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interactions kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349

  10. Leveraging modeling approaches: reaction networks and rules.

    PubMed

    Blinov, Michael L; Moraru, Ion I

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high-resolution and/or high-throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatiotemporal distribution of molecules and complexes, their interactions kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks - the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks.
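The combinatorial explosion that motivates the rule-based tools compared in this paper is easy to demonstrate: a molecule with n independent binary sites generates 2^n distinct species when every global state must be written out explicitly, but needs only n rules when each site is described once. A toy enumeration (the receptor itself is hypothetical):

```python
from itertools import product

# A toy receptor with n independent sites, each free (0) or bound (1).
# Explicit model specification must enumerate every global state, while a
# rule-based specification needs only one binding rule per site.
n_sites = 12
explicit_species = list(product((0, 1), repeat=n_sites))  # all site combinations
n_rules = n_sites

print(len(explicit_species), "species vs", n_rules, "rules")
```

Twelve sites already yield 4096 species; realistic signaling proteins with more sites and multivalent binding partners push explicit enumeration far beyond what a modeler can write or a simulator can store, which is the gap reaction rules close.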

  11. A pressure core ultrasonic test system for on-board analysis of gas hydrate-bearing sediments under in situ pressures.

    PubMed

    Yang, Lei; Zhou, Weihua; Xue, Kaihua; Wei, Rupeng; Ling, Zheng

    2018-05-01

    The enormous potential as an alternative energy resource has made natural gas hydrates a material of intense research interest. Their exploration and sample characterization require a quick and effective analysis of the hydrate-bearing cores recovered under in situ pressures. Here a novel Pressure Core Ultrasonic Test System (PCUTS) for on-board analysis of sediment cores containing gas hydrates at in situ pressures is presented. The PCUTS is designed to be compatible with an on-board pressure core transfer device and a long gravity-piston pressure-retained corer. It provides several advantages over laboratory core analysis, including quick and non-destructive detection, in situ and successive acoustic property acquisition, and avoidance of sample storage and transportation. The design of the unique assembly units that ensure in situ detection is demonstrated, involving the U-type protecting jackets, transducer precession device, and pressure stabilization system. The in situ P-wave velocity measurements make the detection of gas hydrate existence in the sediments possible on-board. Performance tests have verified the feasibility and sensitivity of the ultrasonic test unit, showing the dependence of P-wave velocity on gas hydrate saturation. The PCUTS has been successfully applied for analysis of natural samples containing gas hydrates recovered from the South China Sea. It is indicated that on-board P-wave measurements could provide a quick and effective understanding of the hydrate occurrence in natural samples, which can assist further resource exploration, assessment, and subsequent detailed core analysis.

  12. A pressure core ultrasonic test system for on-board analysis of gas hydrate-bearing sediments under in situ pressures

    NASA Astrophysics Data System (ADS)

    Yang, Lei; Zhou, Weihua; Xue, Kaihua; Wei, Rupeng; Ling, Zheng

    2018-05-01

    The enormous potential as an alternative energy resource has made natural gas hydrates a material of intense research interest. Their exploration and sample characterization require a quick and effective analysis of the hydrate-bearing cores recovered under in situ pressures. Here a novel Pressure Core Ultrasonic Test System (PCUTS) for on-board analysis of sediment cores containing gas hydrates at in situ pressures is presented. The PCUTS is designed to be compatible with an on-board pressure core transfer device and a long gravity-piston pressure-retained corer. It provides several advantages over laboratory core analysis, including quick and non-destructive detection, in situ and successive acoustic property acquisition, and avoidance of sample storage and transportation. The design of the unique assembly units that ensure in situ detection is demonstrated, involving the U-type protecting jackets, transducer precession device, and pressure stabilization system. The in situ P-wave velocity measurements make the detection of gas hydrate existence in the sediments possible on-board. Performance tests have verified the feasibility and sensitivity of the ultrasonic test unit, showing the dependence of P-wave velocity on gas hydrate saturation. The PCUTS has been successfully applied for analysis of natural samples containing gas hydrates recovered from the South China Sea. It is indicated that on-board P-wave measurements could provide a quick and effective understanding of the hydrate occurrence in natural samples, which can assist further resource exploration, assessment, and subsequent detailed core analysis.

  13. Management of a patient's gait abnormality using smartphone technology in-clinic for improved qualitative analysis: A case report.

    PubMed

    VanWye, William R; Hoover, Donald L

    2018-05-01

    Qualitative analysis has its limitations, as the speed of human movement often occurs more quickly than can be comprehended. Digital video allows for frame-by-frame analysis, and therefore likely more effective interventions for gait dysfunction. Although the use of digital video outside laboratory settings was, just a decade ago, challenging due to cost and time constraints, the rapid adoption of smartphones and software applications has made this technology much more practical for clinical usage. A 35-year-old man presented for evaluation with the chief complaint of knee pain 24 months status-post triple arthrodesis following a work-related crush injury. In-clinic qualitative gait analysis revealed gait dysfunction, which was augmented by using a standard iPhone® 3GS camera. After video capture, an iPhone® application (Speed Up TV®, https://itunes.apple.com/us/app/speeduptv/id386986953?mt=8) allowed for frame-by-frame analysis. Corrective techniques were employed using in-clinic equipment to develop and apply a temporary heel-to-toe rocker sole (HTRS) to the patient's shoe. Post-intervention video revealed significantly improved gait efficiency with a decrease in pain. The patient was promptly fitted with a permanent HTRS orthosis. This intervention enabled the patient to successfully complete a work conditioning program and progress to job retraining. Video allows for multiple views, which can be further enhanced by using applications for frame-by-frame analysis and zoom capabilities. This is especially useful for less experienced observers of human motion, as well as for establishing comparative signs prior to implementation of training and/or permanent devices.

  14. Rapid architecture alternative modeling (RAAM): A framework for capability-based analysis of system of systems architectures

    NASA Astrophysics Data System (ADS)

    Iacobucci, Joseph V.

    The research objective of this manuscript is to develop a Rapid Architecture Alternative Modeling (RAAM) methodology to enable traceable Pre-Milestone A decision making during the conceptual design phase of a system of systems. Rather than following current trends that place an emphasis on adding more analysis, which tends to increase the complexity of the decision-making problem, RAAM improves on current methods by reducing both runtime and model creation complexity. RAAM draws upon principles from computer science, system architecting, and domain specific languages to enable the automatic generation and evaluation of architecture alternatives. For example, both mission dependent and mission independent metrics are considered. Mission dependent metrics are determined by the performance of systems accomplishing a task, such as Probability of Success. In contrast, mission independent metrics, such as acquisition cost, are determined solely by the systems in the portfolio, independent of any particular mission. RAAM also leverages advances in parallel computing to significantly reduce runtime by defining executable models that are readily amenable to parallelization. This allows the use of cloud computing infrastructures such as Amazon's Elastic Compute Cloud and the PASTEC cluster operated by the Georgia Institute of Technology Research Institute (GTRI). Also, the amount of data that can be generated when fully exploring the design space can quickly exceed the typical capacity of computational resources at the analyst's disposal. To counter this, specific algorithms and techniques are employed: streaming algorithms and recursive architecture alternative evaluation algorithms that reduce computer memory requirements. Lastly, a domain specific language is created to reduce the computational time of executing the system of systems models. 
A domain specific language is a small, usually declarative language that offers expressive power focused on a particular problem domain by establishing an effective means to communicate the semantics from the RAAM framework. These techniques make it possible to include diverse multi-metric models within the RAAM framework in addition to system and operational level trades. A canonical example was used to explore the uses of the methodology. The canonical example contains all of the features of a full system of systems architecture analysis study but uses fewer tasks and systems. Using RAAM with the canonical example it was possible to consider both system and operational level trades in the same analysis. Once the methodology had been tested with the canonical example, a Suppression of Enemy Air Defenses (SEAD) capability model was developed. Due to the sensitive nature of analyses on that subject, notional data was developed. The notional data has similar trends and properties to realistic Suppression of Enemy Air Defenses data. RAAM was shown to be traceable and provided a mechanism for a unified treatment of a variety of metrics. The SEAD capability model demonstrated lower computer runtimes and reduced model creation complexity as compared to methods currently in use. To determine the usefulness of the implementation of the methodology on current computing hardware, RAAM was tested with system of system architecture studies of different sizes. This was necessary since system of systems may be called upon to accomplish thousands of tasks. It has been clearly demonstrated that RAAM is able to enumerate and evaluate the types of large, complex design spaces usually encountered in capability based design, oftentimes providing the ability to efficiently search the entire decision space. The core algorithms for generation and evaluation of alternatives scale linearly with expected problem sizes. 
The SEAD capability model outputs prompted the discovery of a new issue: the data storage and manipulation requirements for an analysis. Two strategies were developed to counter large data sizes: the use of portfolio views and top 'n' analysis. This proved the usefulness of the RAAM framework and methodology during Pre-Milestone A capability-based analysis. (Abstract shortened by UMI.)
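The "top 'n' analysis" strategy mentioned above can be sketched with a bounded min-heap that keeps memory constant while streaming through a large design space. This is an illustrative sketch only; the alternative names and scoring function below are made up, not taken from the RAAM implementation.

```python
import heapq

def top_n_portfolios(scored_alternatives, n):
    """Keep only the n best-scoring architecture alternatives while
    streaming through an arbitrarily large design space, so memory
    stays O(n) rather than O(total alternatives evaluated)."""
    heap = []  # min-heap of (score, alternative); worst survivor sits on top
    for alt, score in scored_alternatives:
        if len(heap) < n:
            heapq.heappush(heap, (score, alt))
        elif score > heap[0][0]:
            heapq.heapreplace(heap, (score, alt))
    return sorted(heap, reverse=True)  # best first

# A toy stream of (alternative, Probability-of-Success) pairs.
stream = ((f"arch-{i}", (i * 37) % 100 / 100) for i in range(10_000))
best = top_n_portfolios(stream, 3)
```

Because the stream is consumed lazily, the full enumeration never has to be materialized, which is the point of the streaming strategy.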

  15. An analytical method to determine ground water supply well network designs.

    PubMed

    MacMillan, Gordon James

    2009-01-01

    An analytical method is provided where the ground water practitioner can quickly determine the size (number of wells) and spacing of a well network capable of meeting a known ground water demand. In order to apply the method, two new parameters are derived that relate theoretical drawdown to the maximum drawdown that is achievable without mining the aquifer. The size of a well network is shown to be proportional to the ground water demand and inversely proportional to the transmissivity and available head. The spacing between wells in a supply well network is shown to be most sensitive to a derived parameter r(HA/3) , which is related to the available head and the propagation of drawdown away from a theoretical well if the total ground water demand was applied to that well. The method can be used to quickly determine the required spacing between wells in well networks of various sizes that are completed in confined aquifers with no leakance. Copyright © 2009 The Author(s). Journal Compilation © 2009 National Ground Water Association.
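The stated proportionalities (network size proportional to demand, inversely proportional to transmissivity and available head) can be illustrated with a toy sizing function. The lumped coefficient below is a hypothetical stand-in for the paper's derived parameters, not the actual analytical method.

```python
import math

def wells_needed(demand, transmissivity, available_head, unit_capacity_coeff=1.0):
    """Illustrative sizing rule only: network size scales with demand and
    inversely with transmissivity and available head. unit_capacity_coeff
    is a hypothetical lumped constant, not the paper's derived parameter."""
    single_well_yield = unit_capacity_coeff * transmissivity * available_head
    return math.ceil(demand / single_well_yield)

# Doubling demand doubles the network; doubling transmissivity halves it.
n1 = wells_needed(demand=4000.0, transmissivity=50.0, available_head=10.0)
n2 = wells_needed(demand=8000.0, transmissivity=50.0, available_head=10.0)
n3 = wells_needed(demand=4000.0, transmissivity=100.0, available_head=10.0)
```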

  16. Quick reproduction of blast-wave flow-field properties of nuclear, TNT, and ANFO explosions

    NASA Astrophysics Data System (ADS)

    Groth, C. P. T.

    1986-04-01

    In many instances, extensive blast-wave flow-field properties are required in gasdynamics research studies of blast-wave loading and structure response, and in evaluating the effects of explosions on their environment. This report provides a very useful computer code, which can be used in conjunction with the DNA Nuclear Blast Standard subroutines and code, to quickly reconstruct complete and fairly accurate blast-wave data for almost any free-air (spherical) and surface-burst (hemispherical) nuclear, trinitrotoluene (TNT), or ammonium nitrate-fuel oil (ANFO) explosion. This code is capable of computing all of the main flow properties as functions of radius and time, as well as providing additional information regarding air viscosity, reflected shock-wave properties, and the initial decay of the flow properties just behind the shock front. Both spatial and temporal distributions of the major blast-wave flow properties are also made readily available. Finally, provisions are also included in the code to provide additional information regarding the peak or shock-front flow properties over a range of radii, for a specific explosion of interest.

  17. An intelligent planning and scheduling system for the HST servicing missions

    NASA Technical Reports Server (NTRS)

    Johnson, Jay; Bogovich, Lynn; Tuchman, Alan; Kispert, Andrew; Page, Brenda; Burkhardt, Christian; Littlefield, Ronald; Mclean, David; Potter, William; Ochs, William

    1993-01-01

    A new, intelligent planning and scheduling system has been delivered to NASA-Goddard Space Flight Center (GSFC) to provide support for the upcoming Hubble Space Telescope (HST) Servicing Missions. This new system is the Servicing Mission Planning and Replanning Tool (SM/PART). SM/PART is written in C and runs on a UNIX-based workstation (IBM RS/6000) under Motif. SM/PART effectively automates the complex task of building or rebuilding integrated timelines and command plans which are required by HST Servicing Mission personnel at their consoles during the missions. SM/PART is able to quickly build or rebuild timelines based on information stored in a Knowledge Base (KB) by using an Artificial Intelligence (AI) tool called the Planning And Resource Reasoning (PARR) shell. After a timeline has been built in batch mode, it can be displayed and edited in an interactive mode with help from the PARR shell. Finally, a detailed command plan is generated. The capability to quickly build or rebuild timelines and command plans provides an additional safety factor for the HST, Shuttle, and crew.

  18. Fast Beam-Based BPM Calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertsche, K.; Loos, H.; Nuhn, H.-D.

    2012-10-15

    The Alignment Diagnostic System (ADS) of the LCLS undulator system indicates that the 33 undulator quadrupoles have extremely high position stability over many weeks. However, beam trajectory straightness and lasing efficiency degrade more quickly than this. A lengthy Beam Based Alignment (BBA) procedure must be executed every two to four weeks to re-optimize the X-ray beam parameters. The undulator system includes RF cavity Beam Position Monitors (RFBPMs), several of which are utilized by an automatic feedback system to align the incoming electron-beam trajectory to the undulator axis. The beam trajectory straightness degradation has been traced to electronic drifts of the gain and offset of the BPMs used in the beam feedback system. To quickly recover the trajectory straightness, we have developed a fast beam-based procedure to recalibrate the BPMs. This procedure takes advantage of the high-precision monitoring capability of the ADS, which allows highly repeatable positioning of undulator quadrupoles. This report describes the ADS, the position stability of the LCLS undulator quadrupoles, and some results of the new recovery procedure.
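The drift model implied above (a BPM reading with slowly changing gain and offset) can be illustrated with a two-point linear recalibration: move a quadrupole to two positions known from the ADS and solve the resulting line. This is a simplified sketch, not the LCLS procedure itself; all numbers are made up.

```python
def calibrate_bpm(pos1, read1, pos2, read2):
    """Solve reading = gain * position + offset from two known moves.
    In the real procedure the 'known' positions come from the high-
    precision quadrupole moves monitored by the ADS; this two-point
    fit is only an illustrative simplification."""
    gain = (read2 - read1) / (pos2 - pos1)
    offset = read1 - gain * pos1
    return gain, offset

# A drifted BPM with 1.05x gain and a 20-unit offset is recovered
# from readings at two known positions (0 and 100).
g, o = calibrate_bpm(0.0, 20.0, 100.0, 125.0)
```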

  19. bioWeb3D: an online webGL 3D data visualisation tool.

    PubMed

    Pettit, Jean-Baptiste; Marioni, John C

    2013-06-07

    Data visualization is critical for interpreting biological data. However, in practice it can prove to be a bottleneck for untrained researchers; this is especially true for three-dimensional (3D) data representation. Whilst existing software can provide all the necessary functionality to represent and manipulate biological 3D datasets, very few packages are easily accessible (browser based), cross-platform, and usable by non-expert users. An online HTML5/WebGL based 3D visualisation tool has been developed to allow biologists to quickly and easily view interactive and customizable three-dimensional representations of their data along with multiple layers of information. Using the WebGL library Three.js, written in Javascript, bioWeb3D allows the simultaneous visualisation of multiple large datasets input via a simple JSON, XML or CSV file, which can be read and analysed locally thanks to HTML5 capabilities. Using basic 3D representation techniques in a technologically innovative context, we provide a program that is not intended to compete with professional 3D representation software, but that instead enables a quick and intuitive representation of reasonably large 3D datasets.
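The simple-file input described above can be illustrated by serializing 3D point data as JSON. The field names here are hypothetical; the exact bioWeb3D schema should be taken from the tool's own documentation.

```python
import json

# Hypothetical field names -- the actual bioWeb3D schema may differ;
# the point is that a plain JSON file of 3D coordinates suffices as input.
dataset = {
    "name": "demo-cells",
    "points": [[0.0, 0.0, 0.0], [1.0, 2.0, 0.5], [3.0, 1.0, 2.0]],
}

payload = json.dumps(dataset)        # what would be saved to disk
restored = json.loads(payload)       # what the browser-side tool would parse
```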

  20. ZnO-Based Microfluidic pH Sensor: A Versatile Approach for Quick Recognition of Circulating Tumor Cells in Blood.

    PubMed

    Mani, Ganesh Kumar; Morohoshi, Madoka; Yasoda, Yutaka; Yokoyama, Sho; Kimura, Hiroshi; Tsuchiya, Kazuyoshi

    2017-02-15

    The present study concerns the development of a highly sensitive and stable microfluidic pH sensor for possible identification of circulating tumor cells (CTCs) in blood. Precise pH measurements between a silver-silver chloride (Ag/AgCl) reference electrode and a zinc oxide (ZnO) working electrode have been investigated in the microfluidic device. Since there is a direct link between pH and cancer cells, the developed device is a valuable tool for examining CTCs in blood. The ZnO-based working electrode was deposited by the radio frequency (rf) sputtering technique. The potential difference between the working and reference electrodes (Ag/AgCl) is evaluated on the microfluidic device. The ideal Nernstian response of -43.71165 mV/pH was achieved, along with high stability and quick response time. Finally, to evaluate the real-time capability of the developed microfluidic device, in vitro testing was done with A549, A7r5, and MDCK cells.
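Converting a measured electrode potential to pH under the linear response reported above is a one-line inversion. The intercept below is a hypothetical calibration constant, not a figure from the study; only the slope is taken from the abstract.

```python
SLOPE_MV_PER_PH = -43.71165   # sensor response reported in the study
E0_MV = 400.0                 # hypothetical intercept from a one-point calibration

def ph_from_potential(potential_mv):
    """Invert the linear (Nernstian-type) response E = E0 + slope * pH."""
    return (potential_mv - E0_MV) / SLOPE_MV_PER_PH

# A potential exactly seven slope-widths below the intercept reads as pH 7.
ph7 = ph_from_potential(E0_MV + 7 * SLOPE_MV_PER_PH)
```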

  1. An evaluation of the effect of JPEG, JPEG2000, and H.264/AVC on CQR codes decoding process

    NASA Astrophysics Data System (ADS)

    Vizcarra Melgar, Max E.; Farias, Mylène C. Q.; Zaghetto, Alexandre

    2015-02-01

    This paper presents a binary matrix code based on the QR Code (Quick Response Code), denoted the CQR Code (Colored Quick Response Code), and evaluates the effect of JPEG, JPEG2000 and H.264/AVC compression on the decoding process. The proposed CQR Code has three additional colors (red, green and blue), which enables twice as much storage capacity when compared to the traditional black and white QR Code. Using the Reed-Solomon error-correcting code, the CQR Code model has a theoretical correction capability of 38.41%. The goal of this paper is to evaluate the effect that degradations inserted by common image compression algorithms have on the decoding process. Results show that a successful decoding process can be achieved for compression rates up to 0.3877 bits/pixel, 0.1093 bits/pixel and 0.3808 bits/pixel for the JPEG, JPEG2000 and H.264/AVC formats, respectively. The algorithm that presents the best performance is H.264/AVC, followed by JPEG2000 and JPEG.
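The claimed doubling of capacity follows from information per module: a module that can take four distinguishable states carries two bits instead of one. A minimal sketch, assuming four usable symbol states per module (the paper's exact color-to-symbol mapping may differ):

```python
import math

def bits_per_module(num_states):
    """Information carried by one code module that can take
    num_states distinguishable states (2 for plain black/white QR)."""
    return math.log2(num_states)

bw = bits_per_module(2)   # classic QR: 1 bit per module
cqr = bits_per_module(4)  # four states give the reported 2x capacity
```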

  2. Self-Powered Forward Error-Correcting Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled Quick Response Codes.

    PubMed

    Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2016-10-01

    This paper extends our previous work on silver-enhancement based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smart-phone. The scanned information is first decoded to obtain the location of a web-server which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidics channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in presence of immobilized gold nanorods. In this paper we demonstrate the proof-of-concept detection using prototypes of QR encoded FEC biosensors.

  3. Competing on capabilities: the new rules of corporate strategy.

    PubMed

    Stalk, G; Evans, P; Shulman, L E

    1992-01-01

    In the 1980s, companies discovered time as a new source of competitive advantage. In the 1990s, they will discover that time is only one piece of a more far-reaching transformation in the logic of competition. Using examples from Wal-Mart and other highly successful companies, Stalk, Evans, and Shulman of the Boston Consulting Group provide managers with a guide to the new world of "capabilities-based competition." In today's dynamic business environment, strategy too must become dynamic. Competition is a "war of movement" in which success depends on anticipation of market trends and quick response to changing customer needs. In such an environment, the essence of strategy is not the structure of a company's products and markets but the dynamics of its behavior. To succeed, a company must weave its key business processes into hard-to-imitate strategic capabilities that distinguish it from its competitors in the eyes of customers. A capability is a set of business processes strategically understood--for example, Wal-Mart's expertise in inventory replenishment, Honda's skill at dealer management, or Banc One's ability to "out-local the national banks and out-national the local banks." Such capabilities are collective and cross-functional--a small part of many people's jobs, not a large part of a few. Finally, competing on capabilities requires strategic investments in support systems that span traditional SBUs and functions and go far beyond what traditional cost-benefit metrics can justify. A CEO's success in building and managing a company's capabilities will be the chief test of management skill in the 1990s. The prize: companies that combine scale and flexibility to outperform the competition.

  4. Delivering informatics capabilities to an AHC research community through public/private partnerships (PPP).

    PubMed

    Smith, Kevin A; Athey, Brian D; Chahal, Amar P S; Sahai, Priti

    2008-11-06

    Velos eResearch is a commercially-developed, regulatory-compliant, web-based clinical research information system from Velos Inc. Aithent Inc. is a software development services company. The University of Michigan (UM) has public/private partnerships with Velos and Aithent to collaborate on development of additional capabilities, modules, and new products to better support the needs of clinical and translational research communities. These partnerships provide UM with a mechanism for obtaining high-quality functionally comprehensive capabilities more quickly and at lower cost, while the corporate partners get a quality advisory and development partner--this benefits all parties. The UM chose to partner with Velos in part because of its commitment to interoperability. Velos is an active participant in the NCI caBIG project and is committed to caBIG compatibility. Velos already provides interoperability with other Velos sites in the CTSA context. One example of the partnership is co-development of integrated specimen management capabilities. UM spent more than a year defining business requirements and technical specifications for, and is funding development of, this capability. UM also facilitates an autonomous user community (20+ institutions, 7 CTSA awardees); the broad goal of the group is to share experiences, expertise, identify collaborative opportunities, and support one another as well as provide a source of future needs identification to Velos. Advantages and risks related to delivering informatics capabilities to an AHC research community through a public/private partnership will be presented. The UM, Velos and Aithent will discuss frameworks, agreements and other factors that have contributed to a successful partnership.

  5. MEG and EEG data analysis with MNE-Python.

    PubMed

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-12-26

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne.
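One of the basic pipeline steps MNE-Python wraps is evoked averaging (in MNE-Python itself this is roughly `mne.Epochs.average()`). A plain-Python stand-in, with made-up data, shows the computation:

```python
from statistics import fmean

def evoked_average(epochs):
    """Average event-locked epochs sample-by-sample -- the basic
    'evoked response' computation. Each epoch is a list of samples
    of equal length; noise cancels while the locked signal survives."""
    return [fmean(samples) for samples in zip(*epochs)]

# Three noisy repetitions of the same underlying deflection.
epochs = [
    [0.0, 1.0, 2.0, 1.0, 0.0],
    [0.2, 1.2, 2.2, 1.2, 0.2],
    [-0.2, 0.8, 1.8, 0.8, -0.2],
]
evoked = evoked_average(epochs)
```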

  6. Recurrence plot for parameters analysing of internal combustion engine

    NASA Astrophysics Data System (ADS)

    Alexa, O.; Ilie, C. O.; Marinescu, M.; Vilau, R.; Grosu, D.

    2015-11-01

    In many technical disciplines, modern data analysis techniques have been successfully applied to understand the complexity of systems. The growing volume of theoretical knowledge about system dynamics has offered researchers the opportunity to look for non-linear dynamics in data whose evolution linear models are unable to explain in a satisfactory manner. One approach in this respect is Recurrence Analysis (RA), a graphical method designed to locate hidden recurring patterns, nonstationarity, and structural changes. The RA approach arose in natural sciences like physics and biology but was quickly adopted in economics and engineering. Meanwhile, the fast development of computer resources has provided powerful tools to perform this new and complex kind of analysis. One free software package used to perform our analysis is Visual Recurrence Analysis (VRA), developed by Eugene Kononov. As presented in this paper, the recurrence plot investigation of an internal combustion engine shows some of the capabilities of recurrence plot analysis in this domain. We chose two specific engine parameters, measured in two different tests, for the analysis: injection impulse width and engine angular speed, from tests I11n and I51n. Graphs were computed for each of them, then analyzed and compared to draw conclusions. This work is incipient research, being one of the first attempts to use recurrence plots for analyzing automotive dynamics. It opens a wide field of action for future research programs.
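The core recurrence-plot computation is small enough to sketch directly: mark every pair of time points whose states lie within a threshold of each other. The toy signal and threshold below are illustrative, not the engine data from the paper.

```python
def recurrence_matrix(series, eps):
    """Recurrence plot of a scalar time series:
    R[i][j] = 1 when the states at times i and j are within eps."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

# A periodic signal recurs: points one full period (4 samples) apart are marked.
signal = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]
R = recurrence_matrix(signal, eps=0.1)
```

The diagonal lines parallel to the main diagonal in such a matrix are the recurring patterns that RA is designed to reveal.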

  7. MEG and EEG data analysis with MNE-Python

    PubMed Central

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A.; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-01-01

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne. PMID:24431986

  8. Data Scientists ARE coming of age: but WHERE are they coming from?

    NASA Astrophysics Data System (ADS)

    Evans, N.; Bastrakova, I.; Connor, N.; Raymond, O.; Wyborn, L. A.

    2013-12-01

    The fourth paradigm of data intensive science is upon us: a new fundamental scientific methodology has emerged which is underpinned by the capability to analyse large volumes of data using advanced computational capacities. This combination is enabling earth and space scientists to respond to decadal challenges on issues such as the sustainable development of our natural resources, impacts of climate change and protection from natural hazards. Fundamental to the data intensive paradigm is data that are readily accessible and capable of being integrated and amalgamated with other data, often from multiple sources. For many years Earth and Space science practitioners have been drowning in a data deluge. In many cases, either lacking confidence in their capability and/or not having the time or capacity to manage these data assets, they have called in the data professionals. However, such people rarely had domain knowledge of the data they were dealing with, and before long it emerged that although the 'containers' of data were now much better managed and documented, in reality the content was locked up and difficult to access, particularly for HPC environments where national to global scale problems were being addressed. Geoscience Australia (GA) is the custodian of over 4 PB of geoscientific data and is a key provider of evidence-based, scientific advice to government on national issues. Since 2011, in collaboration with the CSIRO Minerals Down Under Program and the National Computational Infrastructure, GA has begun a series of data intensive scientific research pilots focussed on applying advanced ICT tools and technologies to enhance scientific outcomes for the agency, in particular national scale analysis of data sets that can be up to 500 TB in size. 
As in any change program, a small group of innovators and early adopters took up the challenge of data intensive science and quickly showed that GA was able to use new ICT technologies to exploit an information-rich world, undertaking applied research and delivering new business outcomes in ways that current technologies do not allow. The innovators clearly had the necessary skills to rapidly adapt to data intensive techniques. However, to scale out to the rest of the organisation, we needed to quantify these skills. The Strategic People Development Section of GA agreed to: (1) conduct a capability analysis of the scientific staff that participated in the pilot projects, including a review of university and postgraduate training; and (2) conduct a capability analysis of the technical groups involved in the pilot projects. The analysis identified the need for multi-disciplinary teams across the spectrum from pure scientists to pure ICT staff, along with a key hybrid role: the Data Scientist, who has greater capacity in mathematical, numerical modelling, statistical, computational, software engineering and spatial skills, and the ability to integrate data across multiple domains. To fill the emerging gap, GA is asking: how do we find or develop this capability, can we successfully transform the Scientist or the ICT Professional, and are our educational facilities modifying their training? The process is certainly leading GA to acknowledge, formalise, and promote a continuum of skills and roles, changing our recruitment, re-assignment, and Learning and Development strategic decisions.

  9. Reduced caudate volume and enhanced striatal-DMN integration in chess experts.

    PubMed

    Duan, Xujun; He, Sheng; Liao, Wei; Liang, Dongmei; Qiu, Lihua; Wei, Luqing; Li, Yuan; Liu, Chengyi; Gong, Qiyong; Chen, Huafu

    2012-04-02

    The superior capability of chess experts largely depends on quick automatic processing skills, which are considered to be mediated by the caudate nucleus. We asked whether continued practice or rehearsal of the skill over a long period of time can lead to structural changes in this region. We found that, compared to novice controls, grandmaster and master level Chinese chess players (GM/Ms), who had a mean of over 10 years of tournament and training practice, exhibited significantly smaller gray-matter volume in the bilateral caudate nuclei. When these regions were used as seeds in functional connectivity analysis of resting-state fMRI, significantly enhanced integration was found in GM/Ms between the caudate and the default mode network (DMN), a constellation of brain areas important for goal-directed cognitive performance and theory of mind. These findings demonstrate structural changes in the caudate nucleus in response to its extensive engagement in chess problem solving, and its enhanced functional integration with widely distributed circuitry to better support high-level cognitive control of behavior. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Adaptation and Re-Use of Spacecraft Power System Models for the Constellation Program

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Kerslake, Thomas W.; Ayres, Mark; Han, Augustina H.; Adamson, Adrian M.

    2008-01-01

    NASA's Constellation Program is embarking on a new era of space exploration, returning to the Moon and beyond. The Constellation architecture will consist of a number of new spacecraft elements, including the Orion crew exploration vehicle, the Altair lunar lander, and the Ares family of launch vehicles. Each of these new spacecraft elements will need an electric power system, and those power systems will need to be designed to fulfill unique mission objectives and to survive the unique environments encountered on a lunar exploration mission. As with any new spacecraft power system development, preliminary design work will rely heavily on analysis to select the proper power technologies, size the power system components, and predict the system performance throughout the required mission profile. Constellation projects have the advantage of leveraging power system modeling developments from other recent programs such as the International Space Station (ISS) and the Mars Exploration Program. These programs have developed mature power system modeling tools, which can be quickly modified to meet the unique needs of Constellation, and thus provide a rapid capability for detailed power system modeling that otherwise would not exist.

  11. An automated data exploitation system for airborne sensors

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike

    2014-06-01

    Advanced wide area persistent surveillance (WAPS) sensor systems on manned or unmanned airborne vehicles are essential for wide-area urban security monitoring in order to protect our people and our warfighters from terrorist attacks. Currently, human (imagery) analysts process huge data collections from full motion video (FMV) for data exploitation and analysis (real-time and forensic), yielding slow and inaccurate results. An Automated Data Exploitation System (ADES) is urgently needed. In this paper, we present a recently developed ADES for airborne vehicles under heavy urban background clutter conditions. This system includes four processes: (1) fast image registration, stabilization, and mosaicking; (2) advanced non-linear morphological moving target detection; (3) robust multiple target (vehicle, dismount, and human) tracking (up to 100 target tracks); and (4) moving or static target/object recognition (super-resolution). Test results with real FMV data indicate that our ADES can reliably detect, track, and recognize multiple vehicles under heavy urban background clutter. Furthermore, our example shows that ADES as a baseline platform can provide capability for vehicle abnormal behavior detection to help imagery analysts quickly trace down potential threats and crimes.

  12. Rapid determination of enantiomeric excess: a focus on optical approaches.

    PubMed

    Leung, Diana; Kang, Sung Ok; Anslyn, Eric V

    2012-01-07

    High-throughput screening (HTS) methods are becoming increasingly essential in discovering chiral catalysts or auxiliaries for asymmetric transformations due to the advent of parallel synthesis and combinatorial chemistry. Both parallel synthesis and combinatorial chemistry can lead to the exploration of a range of structural candidates and reaction conditions as a means to obtain the highest enantiomeric excess (ee) of a desired transformation. One current bottleneck in these approaches to asymmetric reactions is the determination of ee, which has led researchers to explore a wide range of HTS techniques. To be truly high-throughput, it has been proposed that a technique that can analyse a thousand or more samples per day is needed. Many of the current approaches to this goal are based on optical methods because they allow for a rapid determination of ee due to quick data collection and their parallel analysis capabilities. In this critical review these techniques are reviewed with a discussion of their respective advantages and drawbacks, and with a contrast to chromatographic methods (180 references). This journal is © The Royal Society of Chemistry 2012
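The quantity being screened for, enantiomeric excess, is a simple function of the amounts of the two enantiomers. A minimal sketch (the function name and percent convention are illustrative, not taken from the review):

```python
def enantiomeric_excess(major: float, minor: float) -> float:
    """Percent enantiomeric excess from the amounts (or concentrations)
    of the major and minor enantiomers."""
    total = major + minor
    if total <= 0:
        raise ValueError("total amount must be positive")
    return 100.0 * abs(major - minor) / total

# A 90:10 mixture of enantiomers corresponds to 80% ee.
```

An optical HTS assay typically measures a signal proportional to each enantiomer's concentration and then applies exactly this arithmetic.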

  13. Fault detection and fault tolerance in robotics

    NASA Technical Reports Server (NTRS)

    Visinsky, Monica; Walker, Ian D.; Cavallaro, Joseph R.

    1992-01-01

    Robots are used in inaccessible or hazardous environments in order to alleviate some of the time, cost and risk involved in preparing men to endure these conditions. In order to perform their expected tasks, the robots are often quite complex, thus increasing their potential for failures. If men must be sent into these environments to repair each component failure in the robot, the advantages of using the robot are quickly lost. Fault tolerant robots are needed which can effectively cope with failures and continue their tasks until repairs can be realistically scheduled. Before fault tolerant capabilities can be created, methods of detecting and pinpointing failures must be perfected. This paper develops a basic fault tree analysis of a robot in order to obtain a better understanding of where failures can occur and how they contribute to other failures in the robot. The resulting failure flow chart can also be used to analyze the resiliency of the robot in the presence of specific faults. By simulating robot failures and fault detection schemes, the problems involved in detecting failures for robots are explored in more depth.
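A basic fault tree of the kind described can be evaluated bottom-up once failure probabilities are assigned to the basic events. A minimal sketch assuming independent events (the tuple encoding is illustrative, not from the paper):

```python
def ft_prob(node):
    """Evaluate a fault tree bottom-up, assuming independent basic events.
    node is ('basic', p), ('and', [children]) or ('or', [children])."""
    kind = node[0]
    if kind == 'basic':
        return node[1]
    probs = [ft_prob(child) for child in node[1]]
    if kind == 'and':
        result = 1.0
        for p in probs:
            result *= p          # AND gate: all inputs must fail
        return result
    result = 1.0
    for p in probs:
        result *= 1.0 - p        # OR gate: fails unless every input survives
    return 1.0 - result

# e.g. a joint fails if either its motor or its encoder fails:
joint = ('or', [('basic', 0.1), ('basic', 0.1)])
```

The same tree structure also supports the resiliency analysis mentioned above: setting one basic event's probability to 1 simulates a specific fault and shows how it propagates.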

  14. A wireless fatigue monitoring system utilizing a bio-inspired tree ring data tracking technique.

    PubMed

    Bai, Shi; Li, Xuan; Xie, Zhaohui; Zhou, Zhi; Ou, Jinping

    2014-03-05

Fatigue, a hot scientific research topic for centuries, can trigger sudden failure of critical structures such as aircraft and railway systems, resulting in enormous casualties as well as economic losses. The fatigue life of certain structures is intrinsically random, and few monitoring techniques are capable of tracking full life-cycle fatigue damage. In this paper, a novel in-situ wireless real-time fatigue monitoring system using a bio-inspired tree ring data tracking technique is proposed. The general framework, methodology, and verification of this intelligent system are discussed in detail. The rain-flow counting (RFC) method is adopted as the core algorithm that quantifies fatigue damage, and Digital Signal Processing (DSP) is introduced as the core module for data collection and analysis. Laboratory test results based on strain gauges and polyvinylidene fluoride (PVDF) sensors have shown that the developed intelligent system can provide reliable, rapid feedback and early warning of fatigue failure. With the merits of low cost, high accuracy and great reliability, the developed wireless fatigue sensing system can be further applied to mechanical engineering, civil infrastructure, transportation systems, aerospace engineering, etc.
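The rain-flow counting core such a system runs on can be sketched with the standard three-point method (in the style of ASTM E1049); this is a generic illustration, not the authors' DSP implementation:

```python
def reversals(series):
    """Reduce a load-time series to its turning points (peaks/valleys)."""
    out = [series[0]]
    for x in series[1:]:
        if len(out) < 2:
            if x != out[-1]:
                out.append(x)
        elif (out[-1] - out[-2]) * (x - out[-1]) < 0:
            out.append(x)          # slope changed sign: new extremum
        else:
            out[-1] = x            # same direction: extend current excursion
    return out

def rainflow(series):
    """Three-point rain-flow counting; returns (range, cycles) pairs."""
    stack, counts = [], []
    for r in reversals(series):
        stack.append(r)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            if len(stack) == 3:
                counts.append((y, 0.5))   # range contains the start point
                stack.pop(0)
            else:
                counts.append((y, 1.0))   # closed hysteresis loop
                last = stack.pop()
                stack.pop(); stack.pop()
                stack.append(last)
    # leftover reversals count as half cycles
    counts += [(abs(b - a), 0.5) for a, b in zip(stack, stack[1:])]
    return counts
```

Each counted (range, cycles) pair is then fed into a damage accumulation rule (e.g. Miner's rule with an S-N curve) to quantify fatigue damage.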

  15. Use of laser 3D surface digitizer in data collection and 3D modeling of anatomical structures

    NASA Astrophysics Data System (ADS)

    Tse, Kelly; Van Der Wall, Hans; Vu, Dzung H.

    2006-02-01

A laser digitizer (Konica-Minolta Vivid 910) is used to obtain 3-dimensional surface scans of anatomical structures with a maximum resolution of 0.1 mm. Placing the specimen on a turntable allows multiple scans all around, because the scanner only captures data from the portion facing its lens. A computer model is generated using 3D modeling software such as Geomagic. The 3D model can be manipulated on screen for repeated analysis of anatomical features, a useful capability when the specimens are rare or inaccessible (museum collections, fossils, imprints in rock formations). Because accurate measurements can be performed on the computer model, rather than only on the actual specimens at, for example, an archaeological excavation site, a variety of quantitative data can be obtained later on the computer model in the laboratory as new ideas come to mind. Our group previously used a mechanical contact digitizer (Microscribe) for this purpose, but with the surface digitizer we have been obtaining data sets more accurately and more quickly.

  16. Semi-quantitative MALDI-TOF for antimicrobial susceptibility testing in Staphylococcus aureus.

    PubMed

    Maxson, Tucker; Taylor-Howell, Cheryl L; Minogue, Timothy D

    2017-01-01

    Antibiotic resistant bacterial infections are a significant problem in the healthcare setting, in many cases requiring the rapid administration of appropriate and effective antibiotic therapy. Diagnostic assays capable of quickly and accurately determining the pathogen resistance profile are therefore crucial to initiate or modify care. Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS) is a standard method for species identification in many clinical microbiology laboratories and is well positioned to be applied towards antimicrobial susceptibility testing. One recently reported approach utilizes semi-quantitative MALDI-TOF MS for growth rate analysis to provide a resistance profile independent of resistance mechanism. This method was previously successfully applied to Gram-negative pathogens and mycobacteria; here, we evaluated this method with the Gram-positive pathogen Staphylococcus aureus. Specifically, we used 35 strains of S. aureus and four antibiotics to optimize and test the assay, resulting in an overall accuracy rate of 95%. Application of the optimized assay also successfully determined susceptibility from mock blood cultures, allowing both species identification and resistance determination for all four antibiotics within 3 hours of blood culture positivity.

  17. Optical aptasensors for quantitative detection of small biomolecules: a review.

    PubMed

    Feng, Chunjing; Dai, Shuang; Wang, Lei

    2014-09-15

Aptasensors are aptamer-based biosensors with excellent recognition capability towards a wide range of targets. In particular, there has been ever-growing interest in the development of aptasensors for the detection of small molecules, for two reasons. On one hand, small biomolecules play an important role in living organisms, with many kinds of biological function, such as the antiarrhythmic and vasodilatory activity of adenosine. On the other hand, the concentration of a small molecule can be an indicator for disease diagnosis; for example, the concentration of ATP is closely associated with cell injury and cell viability. As a potential analysis tool in the construction of aptasensors, optical analysis has attracted considerable interest from researchers due to its high sensitivity, quick response and simple operation, and it promises to raise aptasensor performance to a new level. Reviewing the development of optical aptasensors for small biomolecules gives readers an overall understanding of the field's progress and provides some theoretical guidelines for its future development. Hence, we give a mini-review of advances in optical aptasensors for small biomolecules. This review focuses on recent achievements in the design of various optical aptasensors for small biomolecules, including fluorescence aptasensors, colorimetric aptasensors, chemiluminescence aptasensors and other optical aptasensors. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Analysis of fast boundary-integral approximations for modeling electrostatic contributions of molecular binding

    PubMed Central

    Kreienkamp, Amelia B.; Liu, Lucy Y.; Minkara, Mona S.; Knepley, Matthew G.; Bardhan, Jaydeep P.; Radhakrishnan, Mala L.

    2013-01-01

    We analyze and suggest improvements to a recently developed approximate continuum-electrostatic model for proteins. The model, called BIBEE/I (boundary-integral based electrostatics estimation with interpolation), was able to estimate electrostatic solvation free energies to within a mean unsigned error of 4% on a test set of more than 600 proteins—a significant improvement over previous BIBEE models. In this work, we tested the BIBEE/I model for its capability to predict residue-by-residue interactions in protein–protein binding, using the widely studied model system of trypsin and bovine pancreatic trypsin inhibitor (BPTI). Finding that the BIBEE/I model performs surprisingly less well in this task than simpler BIBEE models, we seek to explain this behavior in terms of the models’ differing spectral approximations of the exact boundary-integral operator. Calculations of analytically solvable systems (spheres and tri-axial ellipsoids) suggest two possibilities for improvement. The first is a modified BIBEE/I approach that captures the asymptotic eigenvalue limit correctly, and the second involves the dipole and quadrupole modes for ellipsoidal approximations of protein geometries. Our analysis suggests that fast, rigorous approximate models derived from reduced-basis approximation of boundary-integral equations might reach unprecedented accuracy, if the dipole and quadrupole modes can be captured quickly for general shapes. PMID:24466561

  19. Rapid monitoring of grape withering using visible near-infrared spectroscopy.

    PubMed

    Beghi, Roberto; Giovenzana, Valentina; Marai, Simone; Guidetti, Riccardo

    2015-12-01

Wineries need practical, quick, non-destructive instruments able to quantitatively evaluate, during withering, the parameters that affect product quality. The aim of this work was to test a portable optical system (visible/near-infrared (NIR) spectrophotometer) over the 400-1000 nm wavelength range for the prediction of quality parameters of grape berries during withering. A total of 300 red grape samples (Vitis vinifera L., Corvina cultivar) harvested in vintage year 2012 from the Valpolicella area (Verona, Italy) were analyzed. Qualitative (principal component analysis, PCA) and quantitative (partial least squares regression, PLS) evaluations were performed on the grape spectra. PCA showed a clear grouping of samples by withering stage. PLS models gave encouraging predictive capability for soluble solids content (R²(val) = 0.62; ratio of performance to deviation, RPD = 1.87) and firmness (R²(val) = 0.56; RPD = 1.79). The work demonstrated the applicability of visible/NIR spectroscopy as a rapid technique for the analysis of grape quality directly in barns during withering. The sector could be provided with simple and inexpensive optical systems to monitor the degree of grape withering for better management of the wine production process. © 2014 Society of Chemical Industry.
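The RPD figure quoted with each PLS model is conventionally the standard deviation of the reference values divided by the standard error of prediction. A minimal sketch (function names are illustrative, and SEP is taken here as the plain RMSEP without bias correction, one of several conventions in use):

```python
import math

def rpd(y_ref, y_pred):
    """Ratio of performance to deviation: SD of the reference values
    divided by the root mean squared error of prediction (RMSEP)."""
    n = len(y_ref)
    mean = sum(y_ref) / n
    sd = math.sqrt(sum((y - mean) ** 2 for y in y_ref) / (n - 1))
    rmsep = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_ref, y_pred)) / n)
    return sd / rmsep
```

As a common chemometrics rule of thumb, RPD below about 1.5 is considered unusable for quantification, while values between 1.5 and 2 (like those reported here) are adequate for rough screening.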

  20. Effect of Clouds on Optical Imaging of the Space Shuttle During the Ascent Phase: A Statistical Analysis Based on a 3D Model

    NASA Technical Reports Server (NTRS)

    Short, David A.; Lane, Robert E., Jr.; Winters, Katherine A.; Madura, John T.

    2004-01-01

    Clouds are highly effective in obscuring optical images of the Space Shuttle taken during its ascent by ground-based and airborne tracking cameras. Because the imagery is used for quick-look and post-flight engineering analysis, the Columbia Accident Investigation Board (CAIB) recommended the return-to-flight effort include an upgrade of the imaging system to enable it to obtain at least three useful views of the Shuttle from lift-off to at least solid rocket booster (SRB) separation (NASA 2003). The lifetimes of individual cloud elements capable of obscuring optical views of the Shuttle are typically 20 minutes or less. Therefore, accurately observing and forecasting cloud obscuration over an extended network of cameras poses an unprecedented challenge for the current state of observational and modeling techniques. In addition, even the best numerical simulations based on real observations will never reach "truth." In order to quantify the risk that clouds would obscure optical imagery of the Shuttle, a 3D model to calculate probabilistic risk was developed. The model was used to estimate the ability of a network of optical imaging cameras to obtain at least N simultaneous views of the Shuttle from lift-off to SRB separation in the presence of an idealized, randomized cloud field.
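As a much-simplified baseline for the "at least N useful views" criterion, if each of K cameras independently had a clear line of sight with probability p, the probability of obtaining at least N simultaneous views would be binomial. This is only an illustrative upper-level sanity check; the model described in the paper uses an idealized 3D cloud field, under which camera views are not independent:

```python
from math import comb

def p_at_least_n_views(k: int, n: int, p: float) -> float:
    """P(at least n of k independent cameras have a clear view),
    each camera clear with probability p."""
    return sum(comb(k, m) * p ** m * (1 - p) ** (k - m)
               for m in range(n, k + 1))

# e.g. 3 cameras, each clear with probability 0.5:
# at least one view is likely, all three much less so.
```

Spatially correlated cloud fields tend to lower these probabilities relative to the independence assumption, which is precisely why a 3D model is needed.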

  1. GPR image analysis to locate water leaks from buried pipes by applying variance filters

    NASA Astrophysics Data System (ADS)

    Ocaña-Levario, Silvia J.; Carreño-Alvarado, Elizabeth P.; Ayala-Cabrera, David; Izquierdo, Joaquín

    2018-05-01

Nowadays, there is growing interest in controlling and reducing the amount of water lost through leakage in water supply systems (WSSs). Leakage is, in fact, one of the biggest problems faced by the managers of these utilities. This work addresses the problem of leakage in WSSs by using GPR (Ground Penetrating Radar) as a non-destructive method. The main objective is to identify and extract features from GPR images, such as leaks and components, under controlled laboratory conditions using a methodology based on second-order statistical parameters and, from the obtained features, to create 3D models that allow quick visualization of components and leaks in WSSs through GPR image analysis and subsequent interpretation. This methodology has been used before in other fields and provided promising results. The results obtained with the proposed methodology are presented, analyzed, interpreted and compared with the results obtained using a well-established multi-agent-based methodology. These results show that the variance filter is capable of highlighting the characteristics of components and anomalies in an intuitive manner, so that they can be identified by personnel who are not highly qualified, using the 3D models we develop. This research intends to pave the way towards future intelligent detection systems that enable the automatic detection of leaks in WSSs.
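The second-order statistic at the heart of such a method is a local variance filter, which responds strongly at texture boundaries such as buried components or leak plumes in a radargram. A minimal sketch over a grey-level image stored as a list of lists (the window size and border handling are illustrative choices, not the authors' exact filter):

```python
def variance_filter(img, k=1):
    """Local variance over a (2k+1)x(2k+1) window, clipped at the borders.
    img is a 2D list of grey levels; returns a same-sized 2D list."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[ii][jj]
                    for ii in range(max(0, i - k), min(h, i + k + 1))
                    for jj in range(max(0, j - k), min(w, j + k + 1))]
            m = sum(vals) / len(vals)
            out[i][j] = sum((v - m) ** 2 for v in vals) / len(vals)
    return out
```

Uniform regions map to zero variance while edges and anomalies light up, which is what makes the filtered image readable to non-specialist personnel.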

  2. Smartphone-Based Food Diagnostic Technologies: A Review.

    PubMed

    Rateni, Giovanni; Dario, Paolo; Cavallo, Filippo

    2017-06-20

A new generation of mobile sensing approaches offers significant advantages over traditional platforms in terms of test speed, control, low cost, ease-of-operation, and data management, and requires minimal equipment and user involvement. The marriage of novel sensing technologies with cellphones enables the development of powerful lab-on-smartphone platforms for many important applications including medical diagnosis, environmental monitoring, and food safety analysis. This paper reviews the recent advancements and developments in the field of smartphone-based food diagnostic technologies, with an emphasis on custom modules to enhance smartphone sensing capabilities. These devices typically comprise multiple components such as detectors, sample processors, disposable chips, batteries and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. To date, researchers have demonstrated several promising approaches employing various sensing techniques and device configurations. We aim to provide a systematic classification according to the detection strategy, providing a critical discussion of strengths and weaknesses. We have also extended the analysis to the food scanning devices that are increasingly populating the Internet of Things (IoT) market, demonstrating how this field is indeed promising, as the research outputs are quickly capitalized on by new start-up companies.

  3. Smartphone-Based Food Diagnostic Technologies: A Review

    PubMed Central

    Rateni, Giovanni; Dario, Paolo; Cavallo, Filippo

    2017-01-01

A new generation of mobile sensing approaches offers significant advantages over traditional platforms in terms of test speed, control, low cost, ease-of-operation, and data management, and requires minimal equipment and user involvement. The marriage of novel sensing technologies with cellphones enables the development of powerful lab-on-smartphone platforms for many important applications including medical diagnosis, environmental monitoring, and food safety analysis. This paper reviews the recent advancements and developments in the field of smartphone-based food diagnostic technologies, with an emphasis on custom modules to enhance smartphone sensing capabilities. These devices typically comprise multiple components such as detectors, sample processors, disposable chips, batteries and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. To date, researchers have demonstrated several promising approaches employing various sensing techniques and device configurations. We aim to provide a systematic classification according to the detection strategy, providing a critical discussion of strengths and weaknesses. We have also extended the analysis to the food scanning devices that are increasingly populating the Internet of Things (IoT) market, demonstrating how this field is indeed promising, as the research outputs are quickly capitalized on by new start-up companies. PMID:28632188

  4. The characteristics and fulfillment of conditional prescription drug approvals in Canada.

    PubMed

    Law, Michael R

    2014-06-01

In order to more quickly approve drugs for rare and serious conditions, many countries have developed approval pathways that require companies to fulfill conditions after marketing. This analysis assessed the use and outcomes of Canada's Notice of Compliance with Conditions (NOC/c) program. Two publicly available databases from Health Canada were used to study the characteristics of the drugs approved using a NOC/c. Further, Kaplan-Meier analysis was used to estimate the median time-to-fulfillment for approval conditions. Seventy NOC/c approvals have been made, most commonly for cancer treatments. The conditions of the approvals were publicly available for only 24 of these approvals (34%). Approval conditions were fulfilled for 29 approvals (41%), remained outstanding for 34 (49%), and had been revoked for 7 (10%). The median time to the fulfillment of conditions was about five years (1828 days; 95% CI: 1222-2325). Canadians have limited information on why conditional approvals are granted. As drugs are typically marketed for 5 years before conditions are met, better information should be provided to clinicians and patients so they can better understand treatment options. Further, steps to speed the fulfillment of conditions, such as time-limited approvals and the capability to levy financial penalties, should be added to the NOC/c regime. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
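The Kaplan-Meier median quoted above comes from a product-limit survival estimate in which approvals with conditions still outstanding are treated as censored. A minimal sketch of the estimator (variable names illustrative, not the authors' statistical software; event = 1 means conditions fulfilled, 0 means censored):

```python
def km_median(times, events):
    """Median time-to-event from a Kaplan-Meier product-limit estimate.
    times: observation times; events: 1 = event occurred, 0 = censored.
    Returns the smallest time at which S(t) drops to 0.5 or below."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]   # all observations at t
        deaths = sum(tied)
        s *= 1 - deaths / at_risk                  # product-limit step
        if deaths and s <= 0.5:
            return t
        at_risk -= len(tied)
        i += len(tied)
    return None   # survival never reached 0.5 (median not estimable)
```

Censoring is what makes this the right tool here: simply averaging the fulfilled cases would ignore the 49% of approvals whose conditions were still outstanding and bias the estimate downward.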

  5. A quality evaluation methodology of health web-pages for non-professionals.

    PubMed

    Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro

    2004-06-01

We propose an evaluation methodology for determining the quality of healthcare web sites that disseminate medical information to non-professionals. Three (macro) factors are considered for the quality evaluation: medical content, accountability of the authors, and usability of the web site. Starting from two results in the literature, the problem of whether or not to introduce a weighting function was investigated. The methodology was validated on a specialized information content, i.e., sore throats, a topic of large interest to the target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub-factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions derived from the literature allowed us to verify the suitability of the method. The proposed methodology offers a simple approach that can quickly award an overall quality score to medical web sites oriented to non-professionals.

  6. Composite Fan Blade Design for Advanced Engine Concepts

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Kuguoglu, Latife H.; Chamis, Christos C.

    2004-01-01

    The aerodynamic and structural viability of composite fan blades of the revolutionary Exo-Skeletal engine are assessed for an advanced subsonic mission using the NASA EST/BEST computational simulation system. The Exo-Skeletal Engine (ESE) calls for the elimination of the shafts and disks completely from the engine center and the attachment of the rotor blades in spanwise compression to a rotating casing. The fan rotor overall adiabatic efficiency obtained from aerodynamic analysis is estimated at 91.6 percent. The flow is supersonic near the blade leading edge but quickly transitions into a subsonic flow without any turbulent boundary layer separation on the blade. The structural evaluation of the composite fan blade indicates that the blade would buckle at a rotor speed that is 3.5 times the design speed of 2000 rpm. The progressive damage analysis of the composite fan blade shows that ply damage is initiated at a speed of 4870 rpm while blade fracture takes place at 7640 rpm. This paper describes and discusses the results for the composite blade that are obtained from aerodynamic, displacement, stress, buckling, modal, and progressive damage analyses. It will be demonstrated that a computational simulation capability is readily available to evaluate new and revolutionary technology such as the ESE.

  7. Further investigations on fixed abrasive diamond pellets used for diminishing mid-spatial frequency errors of optical mirrors.

    PubMed

    Dong, Zhichao; Cheng, Haobo; Tam, Hon-Yuen

    2014-01-20

As a further application study of fixed abrasive diamond pellets (FADPs), this work exhibits their potential for diminishing mid-spatial frequency errors (MSFEs, i.e., periodic small structures) of optical surfaces. Benefiting from its high surface rigidity, the FADP tool has a natural smoothing effect on periodic small errors. Compared with the previous design, the proposed new tool offers greater compliance to aspherical surfaces because the pellets are mutually separated and bonded to a steel plate with an elastic backing of silica rubber adhesive. Moreover, a unicursal Peano-like path is presented for improving MSFEs, enhancing the multidirectionality and uniformity of the tool's motion. Experiments were conducted to validate the effectiveness of FADPs for diminishing MSFEs. In the lapping of a Φ=420 mm Zerodur paraboloid workpiece, the grinding ripples were quickly diminished (210 min), as confirmed by visual inspection, profile metrology, and power spectral density (PSD) analysis; RMS was reduced from 4.35 to 0.55 μm. In the smoothing of a Φ=101 mm fused silica workpiece, MSFEs were clearly improved, as shown by surface form maps, interferometric fringe patterns, and PSD analysis; the mid-spatial frequency RMS was diminished from 0.017λ to 0.014λ (λ=632.8 nm).

  8. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

AKELA has developed a software tool which uses a systems analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.

  9. Multiscale methods for gore curvature calculations from FSI modeling of spacecraft parachutes

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kolesar, Ryan; Boswell, Cody; Kanai, Taro; Montel, Kenneth

    2014-12-01

There are now some sophisticated and powerful methods for computer modeling of parachutes. These methods are capable of addressing some of the most formidable computational challenges encountered in parachute modeling, including fluid-structure interaction (FSI) between the parachute and air flow, design complexities such as those seen in spacecraft parachutes, and operational complexities such as use in clusters and disreefing. One should be able to extract any data or analysis needed from a reliable full-scale parachute model. In some cases, however, the parachute engineers may want to quickly perform an extended or repetitive analysis with methods based on simplified models. Some of the data needed by a simplified model can very effectively be extracted from a full-scale computer modeling that serves as a pilot. A good example of such data is the circumferential curvature of a parachute gore, where a gore is the slice of the parachute canopy between two radial reinforcement cables running from the parachute vent to the skirt. We present the multiscale methods we devised for gore curvature calculation from FSI modeling of spacecraft parachutes. The methods include those based on the multiscale sequentially-coupled FSI technique and using NURBS meshes. We show how the methods work for the fully-open and two reefed stages of the Orion spacecraft main and drogue parachutes.
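Circumferential curvature along a sampled gore cross-section can be estimated pointwise from consecutive node triples. A minimal sketch using the Menger (three-point circumcircle) curvature, which is a generic discrete formula, not the multiscale technique of the paper:

```python
import math

def menger_curvature(p, q, r):
    """Discrete curvature at three consecutive 3D points:
    4 * triangle_area / (|pq| * |qr| * |pr|) = 1 / circumradius."""
    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))
    def norm(v):
        return math.sqrt(sum(x * x for x in v))
    a, b, c = sub(q, p), sub(r, q), sub(r, p)
    cross = (a[1] * b[2] - a[2] * b[1],
             a[2] * b[0] - a[0] * b[2],
             a[0] * b[1] - a[1] * b[0])
    twice_area = norm(cross)          # |a x b| = 2 * triangle area
    denom = norm(a) * norm(b) * norm(c)
    return 0.0 if denom == 0 else 2.0 * twice_area / denom
```

Applied to structural nodes extracted from a full-scale FSI solution, this kind of formula yields the curvature profile a simplified model would consume; the paper's methods additionally control the scale over which the curvature is measured.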

  10. Research on distributed virtual reality system in electronic commerce

    NASA Astrophysics Data System (ADS)

    Xue, Qiang; Wang, Jiening; Sun, Jizhou

    2004-03-01

In this paper, Distributed Virtual Reality (DVR) technology applied to Electronic Commerce (EC) is discussed. DVR provides new means for people to recognize, analyze, and solve large-scale, complex problems, which has driven its rapid adoption in EC. The technologies of CSCW (Computer Supported Cooperative Work) and middleware are introduced into the development of the EC-DVR system to provide the necessary cooperation and communication services on a common platform, avoiding repeated development of basic modules. Finally, the paper gives a platform structure for the EC-DVR system.

  11. Integrated green algal technology for bioremediation and biofuel.

    PubMed

    Sivakumar, Ganapathy; Xu, Jianfeng; Thompson, Robert W; Yang, Ying; Randol-Smith, Paula; Weathers, Pamela J

    2012-03-01

    Sustainable non-food energy biomass and cost-effective ways to produce renewable energy technologies from this biomass are continuously emerging. Algae are capable of producing lipids and hydrocarbons quickly and their photosynthetic abilities make them a promising candidate for an alternative energy source. In addition, their favorable carbon life cycle and a renewed focus on rural economic development are attractive factors. In this review the focus is mainly on the integrated approach of algae culture for bioremediation and oil-based biofuel production with mention of possible other value-added benefits of using algae for those purposes. Published by Elsevier Ltd.

  12. Monitoring indicators of harmful cyanobacteria in Texas

    USGS Publications Warehouse

    Kiesling, Richard L.; Gary, Robin H.; Gary, Marcus O.

    2008-01-01

    Harmful algal blooms can occur when certain types of microscopic algae grow quickly in water, forming visible patches that might harm the health of the environment, plants, or animals. In freshwater, species of Cyanobacteria (also known as bluegreen algae) are the dominant group of harmful, bloom-forming algae. When Cyanobacteria form a harmful algal bloom, potential impairments include restricted recreational activities because of algal scums or algal mats, potential loss of public water supply because of taste and odor compounds (for example, geosmin), and the production of toxins (for example, microcystin) in amounts capable of threatening human health and wildlife.

  13. 3D printed microfluidic mixer for point-of-care diagnosis of anemia.

    PubMed

    Plevniak, Kimberly; Campbell, Matthew; Mei He

    2016-08-01

3D printing has been an emerging fabrication tool in prototyping and manufacturing. We demonstrate simulation-guided computer design and 3D-printed prototyping for quick-turnaround development of microfluidic 3D mixers, which allow fast self-mixing of reagents with blood through capillary force. Combined with a smartphone, point-of-care diagnosis of anemia from finger-prick blood has been successfully implemented, showing results consistent with clinical measurements. With the flexibility of 3D fabrication and smartphone compatibility, this work presents a novel diagnostic strategy for advancing personalized medicine and mobile healthcare.

  14. Automatic building identification under bomb damage conditions

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Noll, Warren; Barker, Joseph; Wunsch, Donald C., II

    2009-05-01

Given the vast amount of image intelligence utilized in support of planning and executing military operations, a passive automated image processing capability for target identification is urgently required. Furthermore, transmitting large image streams from remote locations would quickly consume the available bandwidth (BW), precipitating the need for processing to occur at the sensor location. This paper addresses the problem of automatic target recognition for battle damage assessment (BDA). We utilize an Adaptive Resonance Theory approach to cluster templates of target buildings. The results show that the network successfully distinguishes targets from non-targets in a virtual test bed environment.

  15. VASCOMP 2. The V/STOL aircraft sizing and performance computer program. Volume 6: User's manual, revision 3

    NASA Technical Reports Server (NTRS)

    Schoen, A. H.; Rosenstein, H.; Stanzione, K.; Wisniewski, J. S.

    1980-01-01

    This report describes the use of the V/STOL Aircraft Sizing and Performance Computer Program (VASCOMP II). The program is useful in performing aircraft parametric studies in a quick and cost efficient manner. Problem formulation and data development were performed by the Boeing Vertol Company and reflect present preliminary design technology. The computer program, written in FORTRAN IV, has a broad range of input parameters, to enable investigation of a wide variety of aircraft. User oriented features of the program include minimized input requirements, diagnostic capabilities, and various options for program flexibility.

  16. Statement of Rear Admiral Nevin P. Carr, Jr., United States Navy Chief of Naval Research before the Terrorism, Unconventional Threats and Capabilities Subcommittee of the House Armed Services Committee on The Fiscal year 2011 Budget Request

    DTIC Science & Technology

    2010-03-23

    foundation of our S&T portfolio by developing a broad base of scientific knowledge from which INP, FNC, and quick reaction efforts are generated...optimally tailoring experiences, in real-time, to current cognitive and physiological states of the learner. A unique human systems design approach is...efforts include modeling human responses to blast, ballistic, and blunt trauma, as well as modeling physical and cognitive effects of blast exposure and

  17. Genomics dataset of unidentified disclosed isolates.

    PubMed

    Rekadwad, Bhagwan N

    2016-09-01

    Analysis of DNA sequences is necessary for higher hierarchical classification of organisms. It gives clues about the characteristics of organisms and their taxonomic position. This dataset was chosen to find complexities in the unidentified DNA in the disclosed patents. A total of 17 unidentified DNA sequences were thoroughly analyzed. Quick response (QR) codes were generated, and the AT/GC content of the DNA sequences was analyzed. The QR codes are helpful for quick identification of isolates; the AT/GC content is helpful for studying their stability at different temperatures. Additionally, a dataset on cleavage codes and enzyme codes from the restriction digestion study, which is helpful for performing studies using short DNA sequences, was reported. The dataset disclosed here is new revelatory data for the exploration of unique DNA sequences for evaluation, identification, comparison, and analysis.
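
    The AT/GC content computation is simple enough to sketch directly; the QR-code generation itself would be done with a dedicated library and is omitted here. The function name is illustrative:

```python
# Illustrative AT/GC content calculation for a DNA sequence.

def at_gc_content(seq):
    """Return (AT%, GC%) of a DNA sequence, counting only unambiguous bases."""
    seq = seq.upper()
    counts = {base: seq.count(base) for base in "ATGC"}
    total = sum(counts.values())
    at = 100.0 * (counts["A"] + counts["T"]) / total
    gc = 100.0 * (counts["G"] + counts["C"]) / total
    return at, gc
```

    Higher GC content generally correlates with greater duplex stability at elevated temperatures, which is the property the dataset uses it for.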

  18. Reducing the cost of dietary assessment: self-completed recall and analysis of nutrition for use with children (SCRAN24).

    PubMed

    Foster, E; Hawkins, A; Delve, J; Adamson, A J

    2014-01-01

    Self-Completed Recall and Analysis of Nutrition (scran24) is a prototype computerised 24-h recall system for use with 11-16 year olds. It is based on the Multiple Pass 24-h Recall method and includes prompts and checks throughout the system for forgotten food items. The development of scran24 was informed by an extensive literature review, a series of focus groups and usability testing. The first stage of the recall is a quick list where the user is asked to input all the foods and drinks they remember consuming the previous day. The quick list is structured into meals and snacks. Once the quick list is complete, additional information is collected on each food to determine food type and to obtain an estimate of portion size using digital images of food. Foods are located within the system using a free text search, which is linked to the information entered into the quick list. A time is assigned to each eating occasion using drag and drop onto a timeline. The system prompts the user if no foods or drinks have been consumed within a 3-h time frame, or if fewer than three drinks have been consumed throughout the day. The food composition code and weight (g) of all items selected are automatically allocated and stored. Nutritional information can be generated automatically via the scran24 companion Access database. scran24 was very well received by young people and was relatively quick to complete. The accuracy and precision were close to those of similar computer-based systems currently used in dietary studies. © 2013 The Authors Journal of Human Nutrition and Dietetics © 2013 The British Dietetic Association Ltd.
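
    The 3-h gap prompt described above can be sketched as follows. This is a hypothetical reconstruction, not the scran24 source, and the function and parameter names are invented:

```python
# Hypothetical sketch of the 3-hour gap check: flag any stretch of the
# recall day with no recorded eating occasion, so the system can prompt
# the user about possibly forgotten items.

def gap_prompts(occasion_hours, day_start=7, day_end=22, max_gap=3):
    """Return (start, end) intervals longer than max_gap with no intake."""
    times = sorted(occasion_hours)
    checkpoints = [day_start] + times + [day_end]
    return [
        (a, b)
        for a, b in zip(checkpoints, checkpoints[1:])
        if b - a > max_gap
    ]
```

    A recall with occasions at 08:00, 13:00 and 19:00 would trigger prompts for the 08:00-13:00 and 13:00-19:00 stretches.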

  19. Validated Metrics of Quick Flow Improve Assessments of Streamflow Generation Processes at the Long-Term Sleepers River Research Site

    NASA Astrophysics Data System (ADS)

    Sebestyen, S. D.; Shanley, J. B.

    2015-12-01

    There are multiple approaches to quantifying the quick flow components of streamflow. Physical hydrograph separations of quick flow using recession analysis (RA) are objective, reproducible, and easily calculated for long-duration streamflow records (years to decades). However, this approach has rarely been validated to have a physical basis for interpretation. In contrast, isotopic hydrograph separation (IHS) and end member mixing analysis using multiple solutes (EMMA) have been used to identify flow components and flowpath routing through catchment soils. Nonetheless, these two approaches rely on data from brief, isolated periods (hours to weeks) during which more-intensive grab samples were analyzed, which oftentimes makes IHS and EMMA difficult to generalize beyond those windows of time. At the Sleepers River Research Watershed (SRRW) in northern Vermont, USA, we have data from multiple snowmelt events over a two-decade period and from multiple nested catchments to assess relationships among RA, IHS, and EMMA. Quick flow separations were highly correlated among the three techniques, which shows links among metrics of quick flow, water sources, and flow path routing in a small (41 ha), forested catchment (W-9). The similarity in responses validates a physical interpretation for a particular RA approach (the Eckhardt recursive RA filter). This validation provides a new tool to estimate new water inputs and flowpath routing for more and longer periods when chemical or isotopic tracers may not have been measured. At three other SRRW catchments, we found similarly strong correlations among the three techniques. Consistent responses across four catchments provide evidence to support other research at the SRRW showing that runoff generation mechanisms are similar despite differences in catchment sizes and land covers.
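
    The Eckhardt recursive filter has a published closed form (Eckhardt, 2005): b_t = [(1 - BFImax) a b_{t-1} + (1 - a) BFImax y_t] / (1 - a BFImax), constrained so baseflow never exceeds streamflow. A minimal sketch with typical textbook parameter values, not the SRRW calibration:

```python
# Eckhardt recursive digital filter for baseflow separation; quick flow
# is total streamflow minus filtered baseflow. Parameter values are
# typical illustrations (alpha ~ recession constant, BFImax ~ maximum
# baseflow index), not values calibrated for Sleepers River.

def eckhardt_quickflow(q, alpha=0.98, bfi_max=0.80):
    """Split a streamflow series q into (baseflow, quickflow) lists."""
    baseflow = [q[0] * bfi_max]       # initialize from the first value
    for y in q[1:]:
        b = ((1 - bfi_max) * alpha * baseflow[-1]
             + (1 - alpha) * bfi_max * y) / (1 - alpha * bfi_max)
        baseflow.append(min(b, y))    # baseflow cannot exceed streamflow
    quickflow = [y - b for y, b in zip(q, baseflow)]
    return baseflow, quickflow
```

    For a steady flow record the filter settles at baseflow = BFImax times streamflow, which is the behavior the constraint and the recursion are designed to produce.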

  20. Applying a multi-replication framework to support dynamic situation assessment and predictive capabilities

    NASA Astrophysics Data System (ADS)

    Lammers, Craig; McGraw, Robert M.; Steinman, Jeffrey S.

    2005-05-01

    Technological advances and emerging threats reduce the time between target detection and action to an order of a few minutes. To effectively assist with the decision-making process, C4I decision support tools must quickly and dynamically predict and assess alternative Courses Of Action (COAs) to assist Commanders in anticipating potential outcomes. These capabilities can be provided through the faster-than-real-time predictive simulation of plans that are continuously recalibrated against the real-time picture. This capability allows decision-makers to assess the effects of re-tasking opportunities, providing the decision-maker with tremendous freedom to make time-critical, mid-course decisions. This paper presents an overview and demonstrates the use of a software infrastructure that supports dynamic situation assessment and prediction (DSAP) capabilities. These DSAP capabilities are demonstrated through the use of a Multi-Replication Framework that supports (1) predictive simulations using JSAF (Joint Semi-Automated Forces); (2) real-time simulation, also using JSAF, as a state estimation mechanism; and (3) real-time C4I data updates through TBMCS (Theater Battle Management Core Systems). This infrastructure allows multiple replications of a simulation to be executed simultaneously over a grid faster-than-real-time, calibrated with live data feeds. A cost evaluator mechanism analyzes potential outcomes and prunes simulations that diverge from the real-time picture. In particular, this paper primarily serves to walk a user through the process of using the Multi-Replication Framework to provide an enhanced decision aid.

  1. Radionuclidic purity measurements for cyclotron-produced 99mTc via 100Mo(p,2n) at 18 MeV

    NASA Astrophysics Data System (ADS)

    Buckley, K.; Tanguay, J.; Hou, X.; Stothers, L.; Vuckovic, M.; Frantzen, K.; Cockburn, N.; Corsaut, J.; Dodd, M.; Goodbody, A.; Hanemaayer, V.; Hook, B.; Klug, J.; Kovacs, M.; Kumlin, J.; McDiarmid, S.; McEwan, J.; Prato, F.; Ruddock, P.; Valiant, J.; Zeisler, S.; Ruth, T.; Celler, A.; Benard, F.; Schaffer, P.

    2017-05-01

    The radionuclidic purity of cyclotron-produced 99mTc has been measured by gamma ray spectroscopy and compared to the results of a quick release test modeled after the molybdenum breakthrough test performed on generator-derived 99mTc. Excellent radionuclidic purity is reported for samples produced at BCCA during our clinical trial. The quick release test results agree well with the gamma ray analysis.

  2. Image analysis to evaluate the browning degree of banana (Musa spp.) peel.

    PubMed

    Cho, Jeong-Seok; Lee, Hyeon-Jeong; Park, Jung-Hoon; Sung, Jun-Hyung; Choi, Ji-Young; Moon, Kwang-Deog

    2016-03-01

    Image analysis was applied to examine banana peel browning. The banana samples were divided into 3 treatment groups: no treatment and normal packaging (Cont); CO2 gas exchange packaging (CO); normal packaging with an ethylene generator (ET). We confirmed that the browning of banana peels developed more quickly in the CO group than in the other groups based on a sensory test and enzyme assay. The G (green) and CIE L(∗), a(∗), and b(∗) values obtained from the image analysis sharply increased or decreased in the CO group. These colour values showed high correlation coefficients (>0.9) with the sensory test results. CIE L(∗)a(∗)b(∗) values measured with a colorimeter also showed high correlation coefficients, but comparatively lower than those from image analysis. Based on this analysis, browning of the banana occurred more quickly for CO2 gas exchange packaging, and image analysis can be used to evaluate the browning of banana peels. Copyright © 2015 Elsevier Ltd. All rights reserved.
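
    The reported coefficients are ordinary Pearson correlations between the image-derived colour values and the sensory browning scores. A self-contained sketch (the data values in the test are invented for illustration):

```python
# Pearson correlation coefficient, the statistic behind the reported
# >0.9 agreement between colour values and sensory browning scores.

def pearson(x, y):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```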

  3. Successful Completion of FY18/Q1 ASC L2 Milestone 6355: Electrical Analysis Calibration Workflow Capability Demonstration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copps, Kevin D.

    The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts’ use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today’s SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to the electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.

  4. myPhyloDB: a local web server for the storage and analysis of metagenomic data.

    PubMed

    Manter, Daniel K; Korsa, Matthew; Tebbe, Caleb; Delgado, Jorge A

    2016-01-01

    myPhyloDB v.1.1.2 is a user-friendly personal database with a browser-interface designed to facilitate the storage, processing, analysis, and distribution of microbial community populations (e.g. 16S metagenomics data). MyPhyloDB archives raw sequencing files, and allows for easy selection of project(s)/sample(s) of any combination from all available data in the database. The data processing capabilities of myPhyloDB are also flexible enough to allow the upload and storage of pre-processed data, or use the built-in Mothur pipeline to automate the processing of raw sequencing data. myPhyloDB provides several analytical (e.g. analysis of covariance, t-tests, linear regression, differential abundance (DESeq2), and principal coordinates analysis (PCoA)) and normalization (rarefaction, DESeq2, and proportion) tools for the comparative analysis of taxonomic abundance, species richness and species diversity for projects of various types (e.g. human-associated, human gut microbiome, air, soil, and water) for any taxonomic level(s) desired. Finally, since myPhyloDB is a local web-server, users can quickly distribute data between colleagues and end-users by simply granting others access to their personal myPhyloDB database. myPhyloDB is available at http://www.ars.usda.gov/services/software/download.htm?softwareid=472 and more information along with tutorials can be found on our website http://www.myphylodb.org. Database URL: http://www.myphylodb.org. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the United States.
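
    Rarefaction, one of the normalization options listed, amounts to random subsampling without replacement down to a common read depth; a generic sketch, not myPhyloDB's implementation:

```python
# Generic rarefaction: subsample each sample's taxon counts to a common
# depth so that richness/diversity comparisons are not biased by
# unequal sequencing effort.

import random

def rarefy(counts, depth, seed=0):
    """Subsample a dict of taxon -> count down to `depth` total reads."""
    pool = [taxon for taxon, n in counts.items() for _ in range(n)]
    rng = random.Random(seed)               # seeded for reproducibility
    sample = rng.sample(pool, depth)        # without replacement
    out = {taxon: 0 for taxon in counts}
    for taxon in sample:
        out[taxon] += 1
    return out
```

    In practice each sample in a project would be rarefied to the depth of the shallowest sample before computing richness or diversity.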

  5. A PDA study management tool (SMT) utilizing wireless broadband and full DICOM viewing capability

    NASA Astrophysics Data System (ADS)

    Documet, Jorge; Liu, Brent; Zhou, Zheng; Huang, H. K.; Documet, Luis

    2007-03-01

    During the last 4 years the IPI (Image Processing and Informatics) Laboratory has been developing a web-based Study Management Tool (SMT) application that allows Radiologists, Film librarians and PACS-related (Picture Archiving and Communication System) users to dynamically and remotely perform Query/Retrieve operations in a PACS network. Using a regular PDA (Personal Digital Assistant), users can remotely query a PACS archive to distribute any study to an existing DICOM (Digital Imaging and Communications in Medicine) node. This application, which has proven to be convenient for managing the study workflow [1, 2], has been extended to include a DICOM viewing capability on the PDA. With this new feature, users can take a quick view of DICOM images, giving them mobility and convenience at the same time. In addition, we are extending this application to Metropolitan-Area Wireless Broadband Networks. This feature requires Smart Phones that are capable of working as a PDA and have access to Broadband Wireless Services. With the extended application to wireless broadband technology and the preview of DICOM images, the Study Management Tool becomes an even more powerful tool for clinical workflow management.

  6. The automated Army ROTC Questionnaire (ARQ)

    NASA Technical Reports Server (NTRS)

    Young, David L. H.

    1991-01-01

    The Reserve Officer Training Corps Cadet Command (ROTCCC) takes applications for its officer training program from college students and Army enlisted personnel worldwide. Each applicant is required to complete a set of application forms prior to acceptance into the ROTC program. These forms are covered by several regulations that govern the eligibility of potential applicants and guide the applicant through the application process. Eligibility criteria change as Army regulations are periodically revised. Outdated information results in a loss of applications attributable to frustration and error. ROTCCC asked for an inexpensive and reliable way of automating their application process. After reviewing the process, it was determined that an expert system with good end user interface capabilities could be used to solve a large part of the problem. The system captures the knowledge contained within the regulations, enables the quick distribution and implementation of eligibility criteria changes, and distributes the expertise of the admissions personnel to the education centers and colleges. The expert system uses a modified version of CLIPS that was streamlined to make the most efficient use of its capabilities. A user interface with windowing capabilities provides the applicant with a simple and effective way to input his/her personal data.

  7. Rapid Response Risk Assessment in New Project Development

    NASA Technical Reports Server (NTRS)

    Graber, Robert R.

    2010-01-01

    A capability for rapidly performing quantitative risk assessments has been developed by JSC Safety and Mission Assurance for use on project design trade studies early in the project life cycle, i.e., concept development through preliminary design phases. A risk assessment tool set has been developed consisting of interactive and integrated software modules that allow a user/project designer to assess the impact of alternative design or programmatic options on the probability of mission success or other risk metrics. The risk and design trade space includes interactive options for selecting parameters and/or metrics for numerous design characteristics including component reliability characteristics, functional redundancy levels, item or system technology readiness levels, and mission event characteristics. This capability is intended for use on any project or system development with a defined mission, and an example project is used for demonstration and descriptive purposes, e.g., landing a robot on the moon. The effects of alternative design decisions on mission success (or failure) can be measured in real time on a personal computer. This capability provides a highly efficient way of quickly providing information in NASA's evolving risk-based decision environment.
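
    The way component reliabilities and redundancy levels roll up into a mission-success probability can be sketched as follows. All numbers and names are illustrative, not taken from the JSC tool set:

```python
# Series/parallel reliability roll-up: series stages must all work;
# redundant (parallel) units fail only if every copy fails.

def series(reliabilities):
    """Probability that every stage in a series chain works."""
    p = 1.0
    for r in reliabilities:
        p *= r
    return p

def parallel(r, copies):
    """Probability that at least one of `copies` identical units works."""
    return 1.0 - (1.0 - r) ** copies

# Illustrative lander: a 0.95 propulsion stage, 0.99 avionics, and
# dual-redundant 0.90 radios.
mission_success = series([0.95, 0.99, parallel(0.90, 2)])
```

    Adding the second radio raises that subsystem from 0.90 to 0.99, which is exactly the kind of redundancy trade the tool set lets a designer explore interactively.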

  8. TrackMate: An open and extensible platform for single-particle tracking.

    PubMed

    Tinevez, Jean-Yves; Perry, Nick; Schindelin, Johannes; Hoopes, Genevieve M; Reynolds, Gregory D; Laplantine, Emmanuel; Bednarek, Sebastian Y; Shorte, Spencer L; Eliceiri, Kevin W

    2017-02-15

    We present TrackMate, an open source Fiji plugin for the automated, semi-automated, and manual tracking of single particles. It offers a versatile and modular solution that works out of the box for end users, through a simple and intuitive user interface. It is also easily scriptable and adaptable, operating equally well on 1D over time, 2D over time, 3D over time, or other single and multi-channel image variants. TrackMate provides several visualization and analysis tools that aid in assessing the relevance of results. The utility of TrackMate is further enhanced through its ability to be readily customized to meet specific tracking problems. TrackMate is an extensible platform where developers can easily write their own detection, particle linking, visualization or analysis algorithms within the TrackMate environment. This evolving framework provides researchers with the opportunity to quickly develop and optimize new algorithms based on existing TrackMate modules without having to write de novo user interfaces, including visualization, analysis and exporting tools. The current capabilities of TrackMate are presented in the context of three different biological problems. First, we perform Caenorhabditis elegans lineage analysis to assess how light-induced damage during imaging impairs its early development. Our TrackMate-based lineage analysis indicates the lack of a cell-specific light-sensitive mechanism. Second, we investigate the recruitment of NEMO (NF-κB essential modulator) clusters in fibroblasts after stimulation by the cytokine IL-1 and show that photodamage can generate artifacts in the shape of TrackMate-characterized movements that confound motility analysis. Finally, we validate the use of TrackMate for quantitative lifetime analysis of clathrin-mediated endocytosis in plant cells. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
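
    The particle-linking step can be illustrated with the simplest possible linker, greedy nearest-neighbour matching between consecutive frames; TrackMate's actual linkers (e.g. LAP-based) are more sophisticated, so this is only a conceptual sketch:

```python
# Greedy nearest-neighbour frame-to-frame linking: each detection in
# frame A claims the closest unclaimed detection in frame B within a
# maximum linking distance.

def link_frames(frame_a, frame_b, max_dist=5.0):
    """Link (x, y) points in frame_a to frame_b; return index pairs."""
    links, taken = [], set()
    for i, (xa, ya) in enumerate(frame_a):
        best, best_d = None, max_dist
        for j, (xb, yb) in enumerate(frame_b):
            d = ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
            if j not in taken and d <= best_d:
                best, best_d = j, d
        if best is not None:
            links.append((i, best))
            taken.add(best)
    return links
```

    Chaining such links across all frame pairs yields track segments, the raw material for the lineage and motility analyses described above.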

  9. Automated analysis of urinary stone composition using Raman spectroscopy: pilot study for the development of a compact portable system for immediate postoperative ex vivo application.

    PubMed

    Miernik, Arkadiusz; Eilers, Yvan; Bolwien, Carsten; Lambrecht, Armin; Hauschke, Dieter; Rebentisch, Gunter; Lossin, Phillipp S; Hesse, Albrecht; Rassweiler, Jens J; Wetterauer, Ulrich; Schoenthaler, Martin

    2013-11-01

    We evaluate a compact portable system for immediate automated postoperative ex vivo analysis of urinary stone composition using Raman spectroscopy. Analysis of urinary stone composition provides essential information for the treatment and metaphylaxis of urolithiasis. Currently infrared spectroscopy and x-ray diffraction are used for urinary stone analysis. However, these methods may require complex sample preparation and costly laboratory equipment. In contrast, Raman spectrometers could be a simple and quick strategy for immediate stone analysis. Pure samples of 9 stone components and 159 human urinary calculi were analyzed by Raman spectroscopy using a microscope coupled system at 2 excitation wavelengths. Signal-to-noise ratio, peak positions and the distinctness of the acquired Raman spectra were analyzed and compared. Background fluorescence was removed mathematically. Corrected Raman spectra were used as a reference library for automated classification of native human urinary stones (50). The results were then compared to standard infrared spectroscopy. Signal-to-noise ratio was superior at an excitation wavelength of 532 nm. An automated, computer based classifier was capable of matching spectra from patient samples with those of pure stone components. Consecutive analysis of 50 human stones demonstrated 100% sensitivity and specificity compared to infrared spectroscopy (for components with more than 25% of total composition). Our pilot study indicates that Raman spectroscopy is a valid and reliable technique for determining urinary stone composition. Thus, we propose that the development of a compact and portable system based on Raman spectroscopy for immediate, postoperative stone analysis could represent an invaluable tool for the metaphylaxis of urolithiasis. Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
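
    Automated library matching of the kind described can be sketched as a correlation search over reference spectra. This is a generic illustration, not the authors' classifier; the stone names and intensity values in the test are invented:

```python
# Library matching by correlation: score an unknown Raman spectrum
# against reference spectra (after background removal) and report the
# best-matching stone component.

def best_match(spectrum, library):
    """Return the library key whose reference spectrum correlates best."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x)
               * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den if den else 0.0
    return max(library, key=lambda name: corr(spectrum, library[name]))
```

    A production classifier would also handle mixtures (stones with multiple components above the 25% threshold mentioned), typically via least-squares fitting of component combinations.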

  10. Autonomous Commanding of the WIRE Spacecraft

    NASA Technical Reports Server (NTRS)

    Prior, Mike; Walyus, Keith; Saylor, Rick

    1999-01-01

    This paper presents the end-to-end design architecture for an autonomous commanding capability to be used on the Wide Field Infrared Explorer (WIRE) mission for the uplink of command loads during unattended station contacts. The WIRE mission is the fifth and final mission of NASA's Goddard Space Flight Center Small Explorer (SMEX) series to be launched in March of 1999. Its primary mission is the targeting of deep space fields using an ultra-cooled infrared telescope. Due to its mission design, WIRE command loads are large (approximately 40 Kbytes per 24 hours) and must be performed daily. To reduce the cost of mission operations support that would be required in order to uplink command loads, the WIRE Flight Operations Team has implemented an autonomous command loading capability. This capability allows completely unattended operations over a typical two-day weekend period. The key factors driving design and implementation of this capability were: 1) Integration with already existing ground system autonomous capabilities and systems, 2) The desire to evolve autonomous operations capabilities based upon previous SMEX operations experience, 3) Integration with ground station operations - both autonomous and man-tended, 4) Low cost and quick implementation, and 5) End-to-end system robustness. A trade-off study was performed to examine these factors in light of the low-cost, higher-risk SMEX mission philosophy. The study concluded that a STOL (Spacecraft Test and Operations Language) based script, highly integrated with other scripts used to perform autonomous operations, was best suited given the budget and goals of the mission. Each of these factors is discussed to provide an overview of the autonomous operations capabilities implemented for the mission. The capabilities implemented on the WIRE mission are an example of a low-cost, robust, and efficient method for autonomous command loading when implemented with other autonomous features of the ground system.
They can be used as a design and implementation template by other small satellite missions interested in evolving toward autonomous and lower cost operations.

  11. General Mission Analysis Tool (GMAT) User's Guide (Draft)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The General Mission Analysis Tool (GMAT) is a space trajectory optimization and mission analysis system. This document is a draft of the user's guide for the tool. Included in the guide is information about Configuring Objects/Resources, Object Fields: Quick Look-up Tables, and Commands and Events.

  12. A Review of Meta-Analysis Packages in R

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.

    2017-01-01

    Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…

  13. CFD analysis of jet mixing in low NOx flametube combustors

    NASA Technical Reports Server (NTRS)

    Talpallikar, M. V.; Smith, C. E.; Lai, M. C.; Holdeman, J. D.

    1991-01-01

    The Rich-burn/Quick-mix/Lean-burn (RQL) combustor was identified as a potential gas turbine combustor concept to reduce NO(x) emissions in High Speed Civil Transport (HSCT) aircraft. To demonstrate reduced NO(x) levels, cylindrical flametube versions of RQL combustors are being tested at NASA Lewis Research Center. A critical technology needed for the RQL combustor is a method of quickly mixing by-pass combustion air with rich-burn gases. Jet mixing in a cylindrical quick-mix section was numerically analyzed. The quick-mix configuration was five inches in diameter and employed twelve radial-inflow slots. The numerical analyses were performed with an advanced, validated 3-D Computational Fluid Dynamics (CFD) code named REFLEQS. Parametric variation of jet-to-mainstream momentum flux ratio (J) and slot aspect ratio was investigated. Both non-reacting and reacting analyses were performed. Results showed mixing and NO(x) emissions to be highly sensitive to J and slot aspect ratio. Lowest NO(x) emissions occurred when the dilution jet penetrated to approximately mid-radius. The viability of using 3-D CFD analyses for optimizing jet mixing was demonstrated.
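
    The jet-to-mainstream momentum flux ratio varied in the parametric study is J = rho_jet * V_jet^2 / (rho_main * V_main^2); a one-line sketch with illustrative values, not the actual RQL test conditions:

```python
# Jet-to-mainstream momentum flux ratio J, the primary parameter
# governing dilution-jet penetration in quick-mix sections.

def momentum_flux_ratio(rho_jet, v_jet, rho_main, v_main):
    """J = (rho_jet * V_jet^2) / (rho_main * V_main^2)."""
    return (rho_jet * v_jet ** 2) / (rho_main * v_main ** 2)
```

    Higher J drives jets deeper into the mainstream; the study found the lowest NO(x) when penetration reached roughly mid-radius.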

  15. Astrophysical Research Consortium Telescope Imaging Camera (ARCTIC) facility optical imager for the Apache Point Observatory 3.5m telescope

    NASA Astrophysics Data System (ADS)

    Huehnerhoff, Joseph; Ketzeback, William; Bradley, Alaina; Dembicky, Jack; Doughty, Caitlin; Hawley, Suzanne; Johnson, Courtney; Klaene, Mark; Leon, Ed; McMillan, Russet; Owen, Russell; Sayres, Conor; Sheen, Tyler; Shugart, Alysha

    2016-08-01

    The Astrophysical Research Consortium Telescope Imaging Camera, ARCTIC, is a new optical imaging camera now in use at the Astrophysical Research Consortium (ARC) 3.5m telescope at Apache Point Observatory (APO). As a facility instrument, the design criteria broadly encompassed many current and future science opportunities, and the components were built for quick repair or replacement, to minimize down-time. Examples include a quick change shutter, filter drive components accessible from the exterior and redundant amplifiers on the detector. The detector is a Semiconductor Technology Associates (STA) device with several key properties (e.g. high quantum efficiency, low read-noise, quick readout, minimal fringing, operational bandpass 350-950 nm). Focal reducing optics (f/10.3 to f/8.0) were built to control aberrations over a 7.8' x 7.8' field, with a plate scale of 0.11" per 15 micron pixel. The instrument body and dewar were designed to be simple and robust with only two components to the structure forward of the dewar, which in turn has minimal feedthroughs and permeation areas and holds a vacuum below 10⁻⁸ Torr. A custom shutter was also designed, using pneumatics as the driving force. This device provides exceptional performance and reduces heat near the optical path. Measured performance is repeatable at the 2 ms level and offers field uniformity to the same level of precision. The ARCTIC facility imager will provide excellent science capability with robust operation and minimal maintenance for the next decade or more at APO.
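
    The quoted plate scale follows directly from the focal length and pixel size: scale [arcsec/pixel] = 206265 x pixel / focal_length. A quick check assuming 15 μm pixels (typical for large-format CCDs) and the f/8.0 reduced beam on the 3.5 m aperture:

```python
# Plate scale from focal ratio, aperture, and pixel size; the pixel
# size here is an assumed typical value, not a quoted specification.

RAD_TO_ARCSEC = 206265.0  # arcseconds per radian

def plate_scale(pixel_size_m, focal_length_m):
    """Angular size on sky subtended by one pixel, in arcsec."""
    return RAD_TO_ARCSEC * pixel_size_m / focal_length_m

scale = plate_scale(15e-6, 8.0 * 3.5)   # f/8.0 on a 3.5 m aperture
```

    This gives roughly 0.11 arcsec per pixel, consistent with the scale quoted in the abstract.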

  16. Object-based habitat mapping using very high spatial resolution multispectral and hyperspectral imagery with LiDAR data

    NASA Astrophysics Data System (ADS)

    Onojeghuo, Alex Okiemute; Onojeghuo, Ajoke Ruth

    2017-07-01

    This study investigated the combined use of multispectral/hyperspectral imagery and LiDAR data for habitat mapping across parts of south Cumbria, North West England. The methodology adopted in this study integrated spectral information contained in pansharp QuickBird multispectral/AISA Eagle hyperspectral imagery and LiDAR-derived measures with object-based machine learning classifiers and ensemble analysis techniques. Using the LiDAR point cloud data, elevation models (such as the Digital Surface Model and Digital Terrain Model raster) and intensity features were extracted directly. The LiDAR-derived measures exploited in this study included Canopy Height Model, intensity and topographic information (i.e. mean, maximum and standard deviation). These three LiDAR measures were combined with spectral information contained in the pansharp QuickBird and Eagle MNF transformed imagery for image classification experiments. A fusion of pansharp QuickBird multispectral and Eagle MNF hyperspectral imagery with all LiDAR-derived measures generated the best classification accuracies, 89.8 and 92.6% respectively. These results were generated with the Support Vector Machine and Random Forest machine learning algorithms respectively. The ensemble analysis of all three machine learning classifiers for the pansharp QuickBird and Eagle MNF fused data outputs did not significantly increase the overall classification accuracy. Results of the study demonstrate the potential of combining either very high spatial resolution multispectral or hyperspectral imagery with LiDAR data for habitat mapping.
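
    In its simplest form, the ensemble analysis referred to is a per-pixel (or per-object) majority vote over the classifiers' labels; a generic sketch with invented class labels, not the study's implementation:

```python
# Majority-vote ensemble over several classifiers' label sequences
# (e.g. per-object habitat classes from SVM, RF, and a third learner).

from collections import Counter

def majority_vote(*label_maps):
    """Combine equal-length label sequences by per-position majority."""
    return [
        Counter(votes).most_common(1)[0][0]
        for votes in zip(*label_maps)
    ]
```

    When the constituent classifiers already agree on most objects, as here, the vote changes few labels, which is consistent with the study's finding that the ensemble did not significantly improve accuracy.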

  17. On Shaft Data Acquisition System (OSDAS)

    NASA Technical Reports Server (NTRS)

    Pedings, Marc; DeHart, Shawn; Formby, Jason; Naumann, Charles

    2012-01-01

    On Shaft Data Acquisition System (OSDAS) is a rugged, compact, multiple-channel data acquisition computer system designed to record data from instrumentation while operating under extreme rotational centrifugal or gravitational acceleration forces. The system, developed for the Heritage Fuel Air Turbine Test (HFATT) program, addresses the problem of recording multiple channels of high-sample-rate data on almost any rotating test article by mounting the entire acquisition computer onboard with the turbine test article. With only a limited number of slip ring wires available for power and communication, OSDAS uses its own resources to provide independent power and amplification for each instrument. Because OSDAS is built on standard PC technology and shares code interfaces with the next-generation, real-time health monitoring system SPARTAA (Scalable Parallel Architecture for Real Time Analysis and Acquisition), the system could be expanded beyond its current capabilities, for example to provide advanced health monitoring for the test article. High-conductor-count slip rings are expensive to purchase and maintain, yet provide only a limited number of conductors for routing instrumentation off the article to a stationary data acquisition system. In addition to supporting only a small number of instruments, slip rings are prone to wear quickly and introduce noise and other undesirable characteristics into the signal data. This led to the development of a system capable of recording high-density instrumentation, at high sample rates, on the test article itself, all while under extreme rotational stress. OSDAS is a fully functional PC-based system with 48 phase-synchronized, 24-bit, high-sample-rate input channels and an onboard storage capacity of over 1/2 terabyte of solid-state storage. This recording system takes a novel approach to recording multiple channels of instrumentation: it is integrated with the test article itself, packaged in a compact, rugged form factor, consumes limited power, and operates while rotating at high turbine speeds.
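    A system recording 48 channels of 24-bit samples into a fixed 1/2 TB store must pack samples tightly rather than widening them to 32 bits. The sketch below shows one common 3-bytes-per-sample encoding with sign extension on decode; it is purely illustrative and is not OSDAS's actual on-disk format.

```python
# Illustrative packing of signed 24-bit ADC samples into 3 bytes each
# (little-endian), with lossless round-trip decoding. Not OSDAS-specific.

def pack_samples(samples):
    """Pack signed 24-bit integers into a compact byte stream."""
    out = bytearray()
    for s in samples:
        out += (s & 0xFFFFFF).to_bytes(3, "little")
    return bytes(out)

def unpack_samples(data):
    """Inverse of pack_samples, restoring the sign of each sample."""
    out = []
    for i in range(0, len(data), 3):
        v = int.from_bytes(data[i:i + 3], "little")
        if v & 0x800000:          # sign-extend 24-bit two's complement
            v -= 1 << 24
        out.append(v)
    return out

samples = [0, 1, -1, 8_388_607, -8_388_608]   # full signed 24-bit range
assert unpack_samples(pack_samples(samples)) == samples
print(len(pack_samples(samples)))             # 15 bytes for 5 samples
```

Packing this way saves 25% of storage versus 32-bit words, which is significant at 48 channels and high sample rates.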

  18. 78 FR 76711 - Royal City Charter Coach Lines Ltd.-Acquisition of Control-Quick Coach Lines Ltd. d/b/a Quick...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ... current ownership of Quick, and its wholly owned subsidiary Quick Coach Lines USA Inc. (Quick USA... passengers (MC-205116). Quick USA is a wholly owned subsidiary of Quick. When Royal acquires control of Quick, it will also obtain control of Quick USA. Quick USA is currently inactive and does not provide any...

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernandes, Ana; Pereira, Rita C.; Sousa, Jorge

    The Instituto de Plasmas e Fusão Nuclear (IPFN) has developed dedicated re-configurable modules based on field programmable gate array (FPGA) devices for several nuclear fusion machines worldwide. Moreover, new Advanced Telecommunication Computing Architecture (ATCA) based modules developed by IPFN are already included in the ITER catalogue. One of the requirements for re-configurable modules operating in future nuclear environments, including ITER, is remote update capability. Accordingly, this work presents an alternative method for remote FPGA programming to be implemented in new ATCA-based re-configurable modules. FPGAs are volatile devices, and their programming code is usually stored in dedicated flash memories for proper configuration during module power-on. The presented method can store new FPGA code in Serial Peripheral Interface (SPI) flash memories using the PCI Express (PCIe) network established on the ATCA backplane, linking data acquisition endpoints and the data switch blades. The method is based on the Xilinx Quick Boot application note, adapted to the PCIe protocol and ATCA-based modules. (authors)
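    The update flow the abstract describes — writing a new bitstream into SPI flash page by page over PCIe, then verifying before switching images — can be sketched as follows. This is a hedged illustration under stated assumptions: `write_page`/`read_page` stand in for whatever PCIe-mapped register interface the real module exposes, `UPDATE_OFFSET` is a hypothetical address for the update image (the Quick Boot scheme keeps a fallback "golden" image elsewhere in flash), and the flash itself is simulated by a dictionary.

```python
# Sketch of a Quick Boot-style flash update: program the "update" image
# region page by page, then verify by read-back before it is ever booted.
# All addresses and the page interface here are illustrative assumptions.

PAGE_SIZE = 256                  # typical SPI NOR flash page size
UPDATE_OFFSET = 0x400000         # hypothetical base of the update image

flash = {}                       # simulated flash: page address -> bytes

def write_page(addr, data):
    """Stand-in for a PCIe-mediated SPI flash page program."""
    flash[addr] = bytes(data)

def read_page(addr):
    """Stand-in for a PCIe-mediated SPI flash page read (erased = 0xFF)."""
    return flash.get(addr, b"\xff" * PAGE_SIZE)

def program_bitstream(bitstream, base=UPDATE_OFFSET):
    """Split a bitstream into pages, program them, then verify by read-back."""
    for off in range(0, len(bitstream), PAGE_SIZE):
        write_page(base + off, bitstream[off:off + PAGE_SIZE])
    # Verification pass: on any mismatch, keep booting the golden image.
    for off in range(0, len(bitstream), PAGE_SIZE):
        chunk = bitstream[off:off + PAGE_SIZE]
        if read_page(base + off)[:len(chunk)] != chunk:
            return False
    return True

bitstream = bytes(range(256)) * 3 + b"\x42"   # dummy, non-page-aligned image
print(program_bitstream(bitstream))           # True
```

Keeping the golden image untouched while the update region is programmed and verified is what makes remote updates safe in an environment where physical access is impossible.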

  20. Applications Explorer Missions (AEM): Mission planners handbook

    NASA Technical Reports Server (NTRS)

    Smith, S. R. (Editor)

    1974-01-01

    The Applications Explorer Missions (AEM) Program is a planned series of space applications missions whose purpose is to perform various tasks that require a low-cost, quick-reaction, small spacecraft in a dedicated orbit. The Heat Capacity Mapping Mission (HCMM) is the first mission of this series. The spacecraft described in this document was conceived to support a variety of applications instruments, and the HCMM instrument in particular. Maximum use of commonality has been achieved: all of the subsystems employed are taken directly, or modified, from other programs such as IUE, IMP, RAE, and Nimbus. The result is a small, versatile spacecraft. The purpose of this document, the AEM Mission Planners Handbook (AEM/MPH), is to describe the spacecraft and its capabilities in general and the HCMM in particular. The document also serves as a guide for potential users to the capabilities of the AEM spacecraft and its achievable orbits, enabling each potential user to determine the suitability of the AEM concept to their mission.
