Sample records for data plotters

  1. Subroutines GEORGE and DRASTC simplify operation of automatic digital plotter

    NASA Technical Reports Server (NTRS)

    Engel, F., III; Gray, W. H.; Richard, P. J.

    1967-01-01

    FORTRAN language subroutines enable the production of a tape for a 360-30 tape unit that controls the CALCOMP 566 Digital Incremental Plotter. This provides the plotter with instructions for graphically displaying data points with the proper scaling of axes, numbering, lettering, and tic marking.

  2. Experimental and Computational Modeling of Rarefaction Wave Eliminators Suitable for the BRL 2.44 m Shock Tube

    DTIC Science & Technology

    1983-06-01

    made directly from the oscilloscope. Final data processing was completed with the computer, printer, and plotter. Tables and plots of pressure-time... [block-diagram residue naming the data-acquisition hardware: Tektronix 4641 printer, Tektronix 4631 hard copy unit, Tektronix 4662 digital plotter, Tektronix 4052 computer]

  3. Towards Standardization in Terminal Ballistics Testing: Velocity Representation

    DTIC Science & Technology

    1976-01-01

    [garbled derivative expressions and figure-list residue; recoverable captions: 3b, Sample of plotter output: v_r versus v_s; 3c, Sample of plotter output: v_r/v_s versus v_r/av_s] ...implicit in sets of (v_s, v_r) data. A form is proposed as being sufficiently simple and versatile to usefully and realistically model

  4. High Energy Electron Radiation Degradation of Gallium Arsenide Solar Cells.

    DTIC Science & Technology

    1986-03-01

    [garbled BASIC listing and flowchart residue; recoverable fragments: a sampling/print subroutine flowchart; program remarks noting that the plotter connection diagram may be found in Ch. 2, Figure 3, and that the GPIB driver is reproduced from a reference; and menu options to plot the I-V curve on the HP 7845 plotter, plot the I-V curve on the RGB monitor, and write I-V data to floppy disk]

  5. American Jihadist Terrorism: Combating a Complex Threat

    DTIC Science & Technology

    2010-12-07

    Esposito, “Terror Raids at JFK Airport Net American Alleged Terror Plotters Headed for Somalia,” abcnews.com, June 6, 2010, http://abcnews.go.com/Blotter/terror-raids-jfk-airport-net-alleged-terror-plotters/story?id=10839045. 186 U.S. v. Mohamed Alessa and Carlos E. Almonte, Criminal Complaint... Airport Net American Alleged Terror Plotters Headed for Somalia,” ABC News, June 6, 2010, http://abcnews.go.com/Blotter/terror-raids-jfk-airport-net

  6. American Jihadist Terrorism: Combating a Complex Threat

    DTIC Science & Technology

    2010-09-20

    www.nytimes.com/2010/07/30/us/30fbi.html. 177 Richard Esposito, “Terror Raids at JFK Airport Net American Alleged Terror Plotters Headed for Somalia,” abcnews.com, June 6, 2010, http://abcnews.go.com/Blotter/terror-raids-jfk-airport-net-alleged-terror-plotters/story?id=10839045. 178 U.S. v. Mohamed... Airport Net American Alleged Terror Plotters Headed for Somalia,” ABC News, June 6, 2010, http://abcnews.go.com/Blotter/terror-raids-jfk-airport-net

  7. WCPP-THE WOLF PLOTTING AND CONTOURING PACKAGE

    NASA Technical Reports Server (NTRS)

    Masaki, G. T.

    1994-01-01

    The WOLF Contouring and Plotting Package provides the user with a complete general-purpose plotting and contouring capability. This package is a complete system for producing line printer, SC4020, Gerber, Calcomp, and SD4060 plots. The package has been designed to be highly flexible and easy to use. Any plot, from a quick simple plot (which requires only one call to the package) to highly sophisticated plots (including motion picture plots), can be easily generated with only a basic knowledge of FORTRAN and the plot commands. Anyone designing a software system that requires plotted output will find that this package offers many advantages over the standard hardware support packages available. The WCPP package is divided into a plot segment and a contour segment. The plot segment can produce output for any combination of line printer, SC4020, Gerber, Calcomp, and SD4060 plots. The line printer plots allow the user to have plots available immediately after a job is run, at low cost. Although the resolution of line printer plots is low, the quick results allow the user to judge whether a high-resolution plot of a particular run is desirable. The SC4020 and SD4060 provide high-speed, high-resolution cathode ray plots with film and hard copy output available. The Gerber and Calcomp plotters provide very high quality (publishable) plots of good resolution. Being flatbed or drum type plotters, the Gerber and Calcomp plotters are usually slow and not suited for large-volume plotting. All output for any or all of the plotters can be produced simultaneously. The types of plots supported are: linear, semi-log, log-log, polar, tabular data using the FORTRAN WRITE statement, 3-D perspective linear, and affine transformations. The labeling facility provides for horizontal labels, vertical labels, diagonal labels, vector characters of a requested size (special character fonts are easily implemented), and rotated letters. The gridding routines label the grid lines according to user specification. Special line features include multiple lines, dashed lines, and tic marks. The contour segment of this package is a collection of subroutines which can be used to produce contour plots and perform related functions. The package can contour any data which can be placed on a grid or data which is regularly spaced, including any general affine or polar grid data. The package includes routines which will grid random data. Contour levels can be specified at any values desired. Input data can be smoothed, with undefined points being acceptable where data is unreliable or unknown. Plots which are extremely large or detailed can be automatically output in parts to improve resolution or overcome plotter size limitations. The contouring segment uses the plot segment for actual plotting; thus all the features described for the plotting segment are available to the user of the contouring segment. Included with this package are two data bases for producing world map plots in Mercator projection. One data base provides just continent outlines and another provides continent outlines and national borders in great detail. This package is written in FORTRAN IV and IBM OS ASSEMBLER and has been implemented on an IBM 360 with a central memory requirement of approximately 140K 8-bit bytes. The ASSEMBLER routines are basic plotter interface routines. The WCPP package was developed in 1972.

  8. LOFT data acquisition and visual display system (DAVDS) presentation program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bullock, M.G.; Miyasaki, F.S.

    1976-03-01

    The Data Acquisition and Visual Display System (DAVDS) at the Loss-of-Fluid Test Facility (LOFT) has 742-data-channel recording capability, of which 576 channels are recorded digitally. The purpose of this computer program is to graphically present the data acquired and/or processed by the LOFT DAVDS. This program takes specially created plot data buffers of up to 1024 words and generates time history plots on the system electrostatic printer-plotter. The data can be extracted from two system input devices: magnetic disk or digital magnetic tape. Versatility has been designed into the program by providing the user three methods of scaling plots: automatic, control record, and manual. Time required to produce a plot on the system electrostatic printer-plotter varies from 30 to 90 seconds depending on the options selected. The basic computer and program details are described.

  9. OverPlotter: A Utility for Herschel Data Processing

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Mei, Y.; Schulz, B.

    2008-08-01

    The OverPlotter utility is a GUI tool written in Java to support interactive data processing (DP) and analysis for the Herschel Space Observatory within the framework of the Herschel Common Science System (HCSS) (Wieprecht et al. 2004). The tool expands upon the capabilities of the TableViewer (Zhang & Schulz 2005), now also providing the means to create additional overlays of several X/Y scatter plots within the same display area. These layers can be scaled and panned, either individually or together as one graph. Visual comparison of data with different origins and units becomes much easier. The number of available layers is not limited, except by computer memory and performance. Presentation images can be easily created by adding annotations, labeling layers, and setting colors. The tool will be especially helpful in the early phases of Herschel data analysis, when quick access to the contents of data products is important.
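
    The layered-overlay idea described above, several X/Y scatter layers sharing one display area with each layer scalable on its own, can be sketched with matplotlib. This is an illustrative analogue only, not the Java/HCSS implementation.

      # Two independently scaled scatter layers in one display area
      # (illustrative matplotlib sketch, not the OverPlotter code).
      import numpy as np
      import matplotlib.pyplot as plt

      x = np.linspace(0.0, 10.0, 200)
      fig, base = plt.subplots()

      # Layer 1 on the left axis.
      base.scatter(x, np.sin(x), s=8, color="tab:blue")
      base.set_ylabel("layer 1 units", color="tab:blue")

      # Layer 2, with different units, scaled independently on a twin axis
      # so both layers share the same display area.
      overlay = base.twinx()
      overlay.scatter(x, 100.0 * np.exp(-x / 5.0), s=8, color="tab:red")
      overlay.set_ylabel("layer 2 units", color="tab:red")

      base.set_xlabel("x")
      plt.show()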

  10. LANDSAT digital data for water pollution and water quality studies in Southern Scandinavia

    NASA Technical Reports Server (NTRS)

    Hellden, U.; Akersten, I.

    1977-01-01

    Spectral diagrams, illustrating the spectral characteristics of different water types, were constructed by means of simple statistical analysis of the various reflectance properties of water areas in Southern Scandinavia as registered by LANDSAT-1. There were indications that water whose spectral reproduction is dominated by chlorophyllous matter (phytoplankton) can be distinguished from water dominated by nonchlorophyllous matter. Differences in Secchi disc transparency between lakes, as well as patchiness within individual lakes, could be visualized after classification and reproduction in black and white and in color by means of line printer, Calcomp plotter (CRT), and ink jet plotter, respectively.

  11. Developments in analytical instrumentation

    NASA Astrophysics Data System (ADS)

    Petrie, G.

    The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not been associated previously with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. systems now being installed by clients, instead of the data being used mostly in the digital mapping systems operated in-house by mapping organisations. These various new hardware and software developments are reported upon and analysed in this Invited Paper presented to ISPRS Commission II at the 1988 Kyoto Congress.

  12. Linking of the BENSON graph-plotter with the Elektronika-100I computer

    NASA Technical Reports Server (NTRS)

    Valtts, I. Y.; Nikolaev, N. Y.; Popov, M. V.; Soglasnov, V. A.

    1980-01-01

    A device, developed by the Institute of Space Research of the Academy of Sciences of the USSR, for linking the Elektronika-100I computer with the BENSON graph-plotter is described. Programs are compiled which provide display of graphic and alphanumeric information. Instructions for their utilization are given.

  13. Development of medical data information systems

    NASA Technical Reports Server (NTRS)

    Anderson, J.

    1971-01-01

    Computerized storage and retrieval of medical information is discussed. Tasks which were performed in support of the project are: (1) flight crew health stabilization computer system, (2) medical data input system, (3) graphic software development, (4) lunar receiving laboratory support, and (5) Statos V printer/plotter software development.

  14. Oscilloscope used as X-Y plotter or two-dimensional analyzer

    NASA Technical Reports Server (NTRS)

    Hansen, D.; Roy, N.

    1967-01-01

    Oscilloscope used as an X-Y plotter or two-dimensional analyzer tags each point with a yes or no, depending on a third parameter. The usual square-wave pulse is replaced on the scope by a single information-bearing dot which lengthens to a dash in response to a simultaneous event.

  15. Introduction to the LRAPP Environmental-Acoustic Data Bank

    DTIC Science & Technology

    1979-06-01

    those provided by the Data Bank are also possible via the CRE&T! module. [diagram residue; recoverable labels: regional data base, component data files, plotter, tabular charts/maps]

  16. Circuit board routing attachment for Fermilab Gerber plotter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindenmeyer, C.

    1984-05-10

    A new and potentially important method of producing large circuit boards has been developed at Fermilab. A Gerber Flat Bed Plotter with an active area of 5' x 16' has been fitted with a machining head to produce a circuit board without the use of photography or chemicals. The modifications of the Gerber Plotter do not impair its use as a photoplotter or pen plotter; the machining head is merely exchanged with the standard attachments. The modifications to the program are minimal; these will be described in another report. The machining head is fitted with an air-bearing motorized spindle driven at a speed of 40,000 rpm to 90,000 rpm. The spindle is also provided with air bearings on its outside diameter, offering frictionless vertical travel guidance. Vertical travel of the spindle is driven by a spring-return, single-acting air cylinder. An adjustable hydraulic damper slows the spindle travel near the end of its downward stroke. Two programmable stops control spindle down-stroke position, and limit switches are provided for position feedback to the control system. A vacuum system collects chips at the cutter head. No lubrication or regular maintenance is required. The circuit board to be fabricated is supported on a porous plastic mat, which allows table vacuum to hold the board in place while allowing the cutters or drills to cut through the board without damaging the rubber platen of the plotter. The perimeter of the board must be covered to the limits of the table vacuum area used, to prevent excessive leakage.

  17. Function Plotters for Secondary Math Teachers. A MicroSIFT Quarterly Report.

    ERIC Educational Resources Information Center

    Weaver, Dave; And Others

    This report examines mathematical graphing utilities, or function plotters, for use in introductory algebra classes or more advanced courses. Each product selected for inclusion in this report is able to construct the graph of a given equation on the screen and serves as a utility which may be used by the student for an open-ended exploration of a…

  18. Development of Low-cost plotter for educational purposes using Arduino

    NASA Astrophysics Data System (ADS)

    Karthik, Siriparapu; Thirumal Reddy, Palwai; Marimuthu, K. Prakash

    2017-08-01

    With the development of the CAD/CAM/CAE concept, product realization time has reduced drastically. Most activities such as design, drafting, and visualization are carried out using high-end computers and commercial software. This has reduced the overall lead time to market. It is important in the current scenario to equip students with knowledge of advanced technological developments so that they can use them effectively. However, the cost associated with these systems is very high, which makes them unaffordable for students. The present work is an attempt to build a low-cost plotter by integrating some of the available software and components salvaged from scrapped electronic devices. Here the authors introduce a 3-axis G-code plotter that can execute a given G-code program in the 2D (X-Y) plane; lifting the pen and adjusting it to the base component is handled by the Z-axis. Conventional plotting devices existing to date are costly and require basic knowledge before operating. Our aim is to help students understand the working of a plotter and the usage of G-code, achieving this at a much more affordable cost. An Arduino Uno controls the stepper motors, which can accurately plot the given dimensions.
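
    As a rough illustration of the conversion such a plotter performs, the sketch below turns G0/G1 moves into per-axis step counts. It is written in Python rather than Arduino firmware, and the steps-per-mm value is a hypothetical parameter, not one taken from the paper.

      # Minimal sketch of turning 2-D G-code moves into stepper step counts,
      # the kind of conversion a low-cost G-code plotter performs.
      STEPS_PER_MM = 80.0  # assumed steps/mm for the X and Y drives (hypothetical)

      def parse_move(line):
          """Parse a 'G0/G1 X... Y...' line into target axis positions in mm."""
          words = line.upper().split()
          if not words or words[0] not in ("G0", "G1"):
              return None
          target = {}
          for word in words[1:]:
              if word[0] in "XY":
                  target[word[0]] = float(word[1:])
          return target

      position = {"X": 0.0, "Y": 0.0}
      for line in ["G1 X10.0 Y5.0", "G1 X12.5 Y5.0"]:
          move = parse_move(line)
          if move is None:
              continue
          for axis, value in move.items():
              steps = round((value - position[axis]) * STEPS_PER_MM)
              print(f"{axis}: step {steps:+d}")  # firmware would pulse the stepper here
              position[axis] = value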

  19. [The development of an intelligent four-channel aggregometer].

    PubMed

    Guan, X; Wang, M

    1998-07-01

    The paper introduces the hardware and software design of the instrument. We use an 89C52 single-chip computer as the microprocessor to control the amplifier and the A/D and D/A conversion chips, realizing sampling, data processing, printout, and supervision. The final result is printed out in the form of data and an aggregation curve on a PP40 plotter.

  20. Rapid Prototyping: State of the Art

    DTIC Science & Technology

    2003-10-23

    Rapid Prototyping SCS Solid Creation System SLM Selective Laser Melting SLP Solid Laser diode Plotter SLS Selective Laser Sintering SOAR State of the...121,000, respectively. SLP stands for Solid Laser Diode Plotter. The machines are relatively slow and parts are small, so, to date, the products have been...Gigerenzer, H., “Directed Laser Welding of Metal Matrix Composite Structures for Space Based Applications,” Triton Systems Inc., Chelmsford, MA., 1

  1. Non Contacting Evaluation of Strains and Cracking Using Optical and Infrared Imaging Techniques

    DTIC Science & Technology

    1988-08-22

    Compatible Zenith Z-386 microcomputer with plotter II. 3-D Motion Measuring System 1. Complete OPTOTRAK three-dimensional digitizing system. System includes...acquisition unit - 16 single-ended analog input channels 3. Data Analysis Package software (KINEPLOT) 4. Extra OPTOTRAK Camera (max 224 per system

  2. CERC Field Research Facility Environmental Data Summary, 1977-79.

    DTIC Science & Technology

    1982-12-01

    Motorola "Mini-Ranger," coupled to a Hewlett-Packard Mini-Computer and flatbed plotter. This positioning system was put together and operated by Frank... laminations within the core. While one diver collected the sample, the second diver recorded conditions on the bottom. This description included sediment

  3. HEATPLOT: a temperature distribution plotting program for heating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elrod, D.C.; Turner, W.D.

    1977-07-01

    HEATPLOT is a temperature distribution plotting program that may be used with HEATING5, a generalized heat conduction code. HEATPLOT is capable of drawing temperature contours (isotherms), temperature-time profiles, and temperature-distance profiles from the current HEATING5 temperature distribution or from temperature changes relative to the initial temperature distribution. Contour plots may be made for two- or three-dimensional models. Temperature-time profiles and temperature-distance profiles may be made for one-, two-, and three-dimensional models. HEATPLOT is an IBM 360/370 computer code which uses the DISSPLA plotting package. Plots may be created on the CALCOMP pen-and-ink, the CALCOMP cathode ray tube (CRT), or the EAI pen-and-ink plotters. Printer plots may be produced, or a compressed data set may be made that can be routed to any of the available plotters.

  4. Simple and fast polydimethylsiloxane (PDMS) patterning using a cutting plotter and vinyl adhesives to achieve etching results.

    PubMed

    Hyun Kim; Sun-Young Yoo; Ji Sung Kim; Zihuan Wang; Woon Hee Lee; Kyo-In Koo; Jong-Mo Seo; Dong-Il Cho

    2017-07-01

    Inhibition of polydimethylsiloxane (PDMS) polymerization could be observed when PDMS was spin-coated over vinyl substrates. The degree of polymerization, partial curing or full curing, depended on the PDMS thickness coated over the vinyl substrate. This characteristic was exploited to achieve a simple and fast PDMS patterning method using a vinyl adhesive layer patterned with a cutting plotter. The proposed patterning method showed results resembling PDMS etching. Therefore, patterning PDMS over PDMS, glass, silicon, and gold substrates was tested to compare the results with conventional etching methods. Vinyl stencils with widths ranging from 200 μm to 1500 μm were used for the procedure. To evaluate the accuracy of the cutting plotter, stencil widths designed in the AutoCAD software were compared with the actual stencil widths. Furthermore, the method's accuracy was also evaluated by comparing the widths of the actual stencils and the etched PDMS results.

  5. Particle parameter analyzing system. [x-y plotter circuits and display

    NASA Technical Reports Server (NTRS)

    Hansen, D. O.; Roy, N. L. (Inventor)

    1969-01-01

    An X-Y plotter circuit apparatus is described which displays an input pulse representing particle parameter information, which would ordinarily appear on the screen of an oscilloscope as a rectangular pulse, as a single dot positioned on the screen where the upper right-hand corner of the input pulse would have appeared. If another event occurs and it is desired to display it, the apparatus replaces the dot with a short horizontal line.

  6. Circuit For Current-vs.-Voltage Tests Of Semiconductors

    NASA Technical Reports Server (NTRS)

    Huston, Steven W.

    1991-01-01

    Circuit designed for measurement of dc current-versus-voltage characteristics of semiconductor devices. Operates in conjunction with x-y pen plotter or digital storage oscilloscope, which records data. Includes large feedback resistors to prevent high currents damaging device under test. Principal virtues: low cost, simplicity, and compactness. Also used to evaluate diodes and transistors.

  7. Rarefaction Wave Eliminator Concepts For A Large Blast/Thermal Simulator.

    DTIC Science & Technology

    1985-02-01

    hard copies of the pressure-time records. Final data processing was completed with the computer, printer, and plotter. Plots of pressure-time records...

  8. CASPER: A GENERALIZED PROGRAM FOR PLOTTING AND SCALING DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lietzke, M.P.; Smith, R.E.

    A Fortran subroutine was written to scale floating-point data and generate a magnetic tape to plot it on the Calcomp 570 digital plotter. The routine permits a great deal of flexibility, and may be used with any type of FORTRAN or FAP calling program. A simple calling program was also written to permit the user to read in data from cards and plot it without any additional programming. Both the Fortran and binary decks are available. (auth)
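
    The core scaling step such a routine performs, mapping floating-point data linearly onto a fixed plotter coordinate range, can be sketched as follows. This is illustrative Python; the original is Fortran/FAP and its calling interface is not reproduced here.

      # Linearly map data values onto a fixed plotter coordinate range.
      def scale_to_plotter(values, plot_min=0.0, plot_max=10.0):
          """Return values rescaled onto [plot_min, plot_max] plotter units."""
          lo, hi = min(values), max(values)
          if hi == lo:                       # degenerate data: park at midpoint
              return [(plot_min + plot_max) / 2.0] * len(values)
          span = (plot_max - plot_min) / (hi - lo)
          return [plot_min + (v - lo) * span for v in values]

      print(scale_to_plotter([2.5, 7.5, 12.5]))   # -> [0.0, 5.0, 10.0]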

  9. An Infrared Data Acquisition and Processing System

    DTIC Science & Technology

    1977-09-01

    [contents-list residue: Display Storage; Terminal; High Speed Printer/Plotter; Digital Tape Unit] ...In addition to the recently procured Honeywell Model 96 analog recorder, a high-density digital tape unit is planned. This unit will increase the...diagram of Figure 1 we see that a Digital Equipment Corp. (DEC) PDP-11/15 minicomputer with 28K of core memory drives the digital section of IRDAPS

  10. Experiences with semiautomatic aerotriangulation on digital photogrammetric stations

    NASA Astrophysics Data System (ADS)

    Kersten, Thomas P.; Stallmann, Dirk

    1995-12-01

    With the development of higher-resolution scanners, faster image-handling capabilities, and higher-resolution screens, digital photogrammetric workstations promise to rival conventional analytical plotters in functionality, i.e. in the degree of automation in data capture and processing, and in accuracy. The availability of high-quality digital image data and inexpensive, high-capacity, fast mass storage offers the capability to perform accurate semi-automatic or automatic triangulation of digital aerial photo blocks on digital photogrammetric workstations instead of analytical plotters. In this paper, we present our investigations and results on two photogrammetric triangulation blocks, the OEEPE (European Organisation for Experimental Photogrammetric Research) test block (scale 1:4'000) and a Swiss test block (scale 1:12'000), using digitized images. Twenty-eight images of the OEEPE test block were scanned on the Zeiss/Intergraph PS1 and the digital images were delivered with resolutions of 15 micrometers and 30 micrometers, while 20 images of the Swiss test block were scanned on the desktop publishing scanner Agfa Horizon with a resolution of 42 micrometers and on the PS1 with 15 micrometers. Measurements in the digital images were performed on the commercial Digital Photogrammetric Station Leica/Helava DPW770 and with basic hard- and software components of the Digital Photogrammetric Station DIPS II, an experimental system of the Institute of Geodesy and Photogrammetry, ETH Zurich. As a reference, the analog images of both photogrammetric test blocks were measured on analytical plotters. On DIPS II, measurements of fiducial marks and of signalized and natural tie points were performed by least-squares template and image matching, while on the DPW770 all points were measured by the cross-correlation technique. The observations were adjusted in a self-calibrating bundle adjustment. The comparisons between these results and the experiences with the functionality of the commercial and the experimental system are presented.

  11. General purpose film plotting system

    NASA Technical Reports Server (NTRS)

    Mcquillan, C.

    1977-01-01

    The general purpose film plotting system, a plot program designed to handle a majority of the data tape formats presently available under OS/360, is discussed. The convenience of this program is due to the fact that the user merely describes the format of his data set and the type of data plots he desires. It processes the input data according to the given specifications. The output is generated on a tape which yields data plots when processed by the selected plotter. A summary of each job is produced on the printer.

  12. Natural resources information system.

    NASA Technical Reports Server (NTRS)

    Leachtenauer, J. C.; Woll, A. M.

    1972-01-01

    A computer-based Natural Resources Information System was developed for the Bureaus of Indian Affairs and Land Management. The system stores, processes and displays data useful to the land manager in the decision making process. Emphasis is placed on the use of remote sensing as a data source. Data input consists of maps, imagery overlays, and on-site data. Maps and overlays are entered using a digitizer and stored as irregular polygons, lines and points. Processing functions include set intersection, union and difference and area, length and value computations. Data output consists of computer tabulations and overlays prepared on a drum plotter.
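
    The overlay operations described, set intersection, union, and difference plus area computation on stored polygons, map directly onto modern geometry libraries. A small illustrative sketch using Shapely (the parcel coordinates are hypothetical):

      # Polygon overlay operations of the kind the system performs
      # (illustrative Shapely sketch with made-up coordinates).
      from shapely.geometry import Polygon

      parcel = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])      # land parcel
      habitat = Polygon([(2, 1), (6, 1), (6, 4), (2, 4)])     # overlay region

      print(parcel.intersection(habitat).area)   # 2 x 2 overlap -> 4.0
      print(parcel.union(habitat).area)          # 12 + 12 - 4  -> 20.0
      print(parcel.difference(habitat).area)     # 12 - 4       -> 8.0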

  13. Computer-assisted photogrammetric mapping systems for geologic studies-A progress report

    USGS Publications Warehouse

    Pillmore, C.L.; Dueholm, K.S.; Jepsen, H.S.; Schuch, C.H.

    1981-01-01

    Photogrammetry has played an important role in geologic mapping for many years; however, only recently have attempts been made to automate mapping functions for geology. Computer-assisted photogrammetric mapping systems for geologic studies have been developed and are currently in use in offices of the Geological Survey of Greenland at Copenhagen, Denmark, and the U.S. Geological Survey at Denver, Colorado. Though differing somewhat, the systems are similar in that they integrate Kern PG-2 photogrammetric plotting instruments and small desk-top computers that are programmed to perform special geologic functions and operate flat-bed plotters by means of specially designed hardware and software. A z-drive capability, in which stepping motors control the z-motions of the PG-2 plotters, is an integral part of both systems. This feature enables the computer to automatically position the floating mark on computer-calculated, previously defined geologic planes, such as contacts or the base of coal beds, throughout the stereoscopic model in order to improve the mapping capabilities of the instrument and to aid in correlation and tracing of geologic units. The common goal is to enhance the capabilities of the PG-2 plotter and provide a means by which geologists can make conventional geologic maps more efficiently and explore ways to apply computer technology to geologic studies. © 1981.

  14. Multi-model stereo restitution

    USGS Publications Warehouse

    Dueholm, K.S.

    1990-01-01

    Methods are described that permit simultaneous orientation of many small-frame photogrammetric models in an analytical plotter. The multi-model software program enables the operator to move freely between the oriented models during interpretation and mapping. Models change automatically when the measuring mark is moved from one frame to another, moving to the same ground coordinates in the neighboring model. Thus, data collection and plotting can be performed continuously across model boundaries. The orientation of the models is accomplished by a bundle block adjustment. -from Author

  15. Multidisciplinary geoscientific experiments in central Europe

    NASA Technical Reports Server (NTRS)

    Bannert, D. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. Studies were carried out in the fields of geology-pedology, coastal dynamics, geodesy-cartography, geography, and data processing. In geology-pedology, a comparison of ERTS image studies with extensive ground data led to a better understanding of the relationship between vegetation, soil, bedrock, and other geologic features. Findings in linear tectonics gave better insight in orogeny and ore deposit development for prospecting. Coastal studies proved the value of ERTS images for the updating of nautical charts, as well as small scale topographic maps. A plotter for large scale high speed image generation from CCT was developed.

  16. Selected Tether Applications Cost Model

    NASA Technical Reports Server (NTRS)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  17. Geologic map of the Cochiti Dam quadrangle, Sandoval County, New Mexico

    USGS Publications Warehouse

    Dethier, David P.; Thompson, Ren A.; Hudson, Mark R.; Minor, Scott A.; Sawyer, David A.

    2011-01-01

    The mapped distribution of units is based primarily on interpretation of 1:16,000-scale, color aerial photographs taken in 1992, and 1:40,000-scale, black-and-white, aerial photographs taken in 1996. Most of the contacts on the map were transferred from the aerial photographs using a photogrammetric stereo-plotter and subsequently field checked for accuracy and revised based on field determination of allostratigraphic and lithostratigraphic units. Determination of lithostratigraphic units in volcanic deposits was aided by geochemical data, 40Ar/39Ar geochronology, aeromagnetic and paleomagnetic data. Supplemental revision of mapped contacts was based on interpretation of USGS 1-meter orthoimagery.

  18. A simulation model for wind energy storage systems. Volume 3: Program descriptions

    NASA Technical Reports Server (NTRS)

    Warren, A. W.; Edsinger, R. W.; Burroughs, J. D.

    1977-01-01

    Program descriptions, flow charts, and program listings for the SIMWEST model generation program, the simulation program, the file maintenance program, and the printer plotter program are given. For Vol 2, see .

  19. A study of digital holographic filter generation

    NASA Technical Reports Server (NTRS)

    Calhoun, M.; Ingels, F.

    1976-01-01

    Problems associated with digital computer generation of holograms are discussed, along with criteria for producing optimum digital holograms. These criteria revolve around amplitude resolution and spatial frequency limitations induced by the computer and plotter process.

  20. ADMAP (automatic data manipulation program)

    NASA Technical Reports Server (NTRS)

    Mann, F. I.

    1971-01-01

    Instructions are presented on the use of ADMAP (automatic data manipulation program), an aerospace data manipulation computer program. The program was developed to aid in processing, reducing, plotting, and publishing electric propulsion trajectory data generated by the low thrust optimization program, HILTOP. The program has the option of generating SC4020 electric plots, and therefore requires the SC4020 routines to be available at execution time (even if not used). Several general routines are present, including a cubic spline interpolation routine, an electric plotter dash-line drawing routine, and single-parameter and double-parameter sorting routines. Many routines are tailored for the manipulation and plotting of electric propulsion data, including an automatic scale selection routine, an automatic curve labelling routine, and an automatic graph titling routine. Data are accepted from either punched cards or magnetic tape.
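
    A cubic spline interpolation routine of the kind ADMAP provides as a general utility can be sketched with SciPy. This is illustrative only, not the original FORTRAN subroutine, and the sample values are made up.

      # Cubic spline interpolation of sampled trajectory data
      # (illustrative SciPy sketch).
      import numpy as np
      from scipy.interpolate import CubicSpline

      t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # sample times
      y = np.array([0.0, 0.8, 0.9, 0.1, -0.8])       # sampled quantity

      spline = CubicSpline(t, y)                     # fit the spline once
      t_fine = np.linspace(0.0, 4.0, 9)
      print(spline(t_fine))                          # smooth values for plotting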

  1. Using Geocoded Databases in Teaching Urban Historical Geography.

    ERIC Educational Resources Information Center

    Miller, Roger P.

    1986-01-01

    Provides information regarding hardware and software requirements for using geocoded databases in urban historical geography. Reviews 11 IBM and Apple Macintosh database programs and describes the pen plotter and digitizing table interface used with the databases. (JDH)

  2. Astronomy Graphics.

    ERIC Educational Resources Information Center

    Hubin, W. N.

    1982-01-01

    Various microcomputer-generated astronomy graphs are presented, including those of constellations and planetary motions. Graphs were produced on a computer-driven plotter and then reproduced for class use. Copies of the programs that produced the graphs are available from the author. (Author/JN)

  3. A Simple Huckel Molecular Orbital Plotter

    ERIC Educational Resources Information Center

    Ramakrishnan, Raghunathan

    2013-01-01

    A program is described and presented to readily plot the molecular orbitals from a Huckel calculation. The main features of the program and the scope of its applicability are discussed through some example organic molecules. (Contains 2 figures.)
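
    For readers unfamiliar with the underlying computation: a Hückel calculation reduces to diagonalizing the topology (adjacency) matrix of the pi system, giving orbital energies E = alpha + x*beta for the eigenvalues x and MO coefficients from the eigenvectors. A minimal sketch for 1,3-butadiene (illustrative; not the program described in the article):

      # Simple Hückel calculation for 1,3-butadiene (C1-C2-C3-C4 chain).
      import numpy as np

      # Adjacency (topology) matrix of the pi system.
      H = np.array([
          [0, 1, 0, 0],
          [1, 0, 1, 0],
          [0, 1, 0, 1],
          [0, 0, 1, 0],
      ], dtype=float)

      x, coeffs = np.linalg.eigh(H)   # eigenvalues x, columns = MO coefficients
      for i, xi in enumerate(x):
          # Energies come out as E = alpha + x*beta; for butadiene x = +/-1.618, +/-0.618.
          print(f"MO {i + 1}: E = alpha + {xi:+.3f} beta, c = {coeffs[:, i].round(3)}")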

  4. VAX-Gerber node link. Revision 1. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isobe, G.W.

    1985-12-01

    A communications link between the CADDE VAX 11/750 and the Gerber Photo-Plotter 4135 was desired at LLNL. The process of creating this link is discussed and the features of this project are described. 4 figs.

  5. Three-dimensional plotter technology for fabricating polymeric scaffolds with micro-grooved surfaces.

    PubMed

    Son, JoonGon; Kim, GeunHyung

    2009-01-01

    Various mechanical techniques have been used to fabricate biomedical scaffolds, including rapid prototyping (RP) devices that operate from CAD files of the target feature information. The three-dimensional (3-D) bio-plotter is one RP system that can produce design-based scaffolds with good mechanical properties for mimicking cartilage and bones. However, the scaffolds fabricated by RP have very smooth surfaces, which tend to discourage initial cell attachment. Initial cell attachment, migration, differentiation and proliferation are strongly dependent on the chemical and physical characteristics of the scaffold surface. In this study, we propose a new 3-D plotting method supplemented with a piezoelectric system for fabricating surface-modified scaffolds. The effects of the physically-modified surface on the mechanical and hydrophilic properties were investigated, and the results of cell culturing of chondrocytes indicate that this technique is a feasible new method for fabricating high-quality 3-D polymeric scaffolds.

  6. QUICK - AN INTERACTIVE SOFTWARE ENVIRONMENT FOR ENGINEERING DESIGN

    NASA Technical Reports Server (NTRS)

    Schlaifer, R. S.

    1994-01-01

    QUICK provides the computer user with the facilities of a sophisticated desk calculator which can perform scalar, vector and matrix arithmetic, propagate conic orbits, determine planetary and satellite coordinates and perform other related astrodynamic calculations within a Fortran-like environment. QUICK is an interpreter, thereby eliminating the need to use a compiler or a linker to run QUICK code. QUICK capabilities include options for automated printing of results, the ability to submit operating system commands on some systems, and access to a plotting package (MASL) and a text editor without leaving QUICK. Mathematical and programming features of QUICK include the ability to handle arbitrary algebraic expressions, the capability to define user functions in terms of other functions, built-in constants such as pi, direct access to useful COMMON areas, matrix capabilities, extensive use of double precision calculations, and the ability to automatically load user functions from a standard library. The MASL (Multi-mission Analysis Software Library) plotting package, included in the QUICK package, is a set of FORTRAN 77 compatible subroutines designed to facilitate the plotting of engineering data by allowing programmers to write plotting device independent applications. Its universality lies in the number of plotting devices it puts at the user's disposal. The MASL package of routines has proved very useful and easy to work with, yielding good plots for most new users on the first or second try. The functions provided include routines for creating histograms, "wire mesh" surface plots and contour plots as well as normal graphs with a large variety of axis types. The library has routines for plotting on cartesian, polar, log, mercator, cyclic, calendar, and stereographic axes, and for performing automatic or explicit scaling. The lengths of the axes of a plot are completely under the control of the program using the library. Programs written to use the MASL subroutines can be made to output to the Calcomp 1055 plotter, the Hewlett-Packard 2648 graphics terminal, the HP 7221, 7475 and 7550 pen plotters, the Tektronix 40xx and 41xx series graphics terminals, the DEC VT125/VT240 graphics terminals, the QMS 800 laser printer, the Sun Microsystems monochrome display, the Ridge Computers monochrome display, the IBM/PC color display, or a "dumb" terminal or printer. Programs using this library can be written so that they always use the same type of plotter or they can allow the choice of plotter type to be deferred until after program execution. QUICK is written in RATFOR for use on Sun4 series computers running SunOS. No source code is provided. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in ASCII format is included on the distribution medium. QUICK was developed in 1991 and is a copyrighted work with all copyright vested in NASA.

  7. Clinical application of a light-pen computer system for quantitative angiography

    NASA Technical Reports Server (NTRS)

    Alderman, E. L.

    1975-01-01

    The paper describes an angiographic analysis system which uses a video disk for recording and playback, a light-pen for data input, minicomputer processing, and an electrostatic printer/plotter for hardcopy output. The method is applied to quantitative analysis of ventricular volumes, sequential ventriculography for assessment of physiologic and pharmacologic interventions, analysis of instantaneous time sequence of ventricular systolic and diastolic events, and quantitation of segmental abnormalities. The system is shown to provide the capability for computation of ventricular volumes and other measurements from operator-defined margins by greatly reducing the tedium and errors associated with manual planimetry.

  8. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual. [NURE program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.

    1981-01-01

    A Principal Components Analysis (PCA) has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.
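
    The core of such a PCA run, centering the variates, eigendecomposing their covariance matrix, and projecting onto the leading components, can be sketched in a few lines. This is illustrative NumPy with synthetic data, not the DEC-10 program.

      # Principal components of multichannel data via the covariance matrix
      # (illustrative sketch with synthetic data).
      import numpy as np

      rng = np.random.default_rng(0)
      data = rng.normal(size=(500, 4))            # 500 samples x 4 channels

      centered = data - data.mean(axis=0)         # center each variate
      cov = np.cov(centered, rowvar=False)
      eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues

      order = np.argsort(eigvals)[::-1]           # sort components by variance
      scores = centered @ eigvecs[:, order]       # principal-component scores
      explained = eigvals[order] / eigvals.sum()
      print("variance explained per component:", explained.round(3))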

  9. The QDP/PLT user's guide

    NASA Technical Reports Server (NTRS)

    Tennant, Allyn F.

    1991-01-01

    PLT is a high-level plotting package. A programmer can create a default plot suited for the data being displayed. At run time, users can then interact with the plot, overriding any or all of these defaults. The user is also provided the capability to fit functions to the displayed data. This ability to display, interact with, and fit the data makes PLT a useful tool in the analysis of data. The Quick and Dandy Plotter (QDP) program will read ASCII text files that contain PLT commands and data. Thus, QDP provides an easy way to use the PLT software, and QDP files provide a convenient way to exchange data. The QDP/PLT software is written in standard FORTRAN 77 and has been ported to VAX VMS, SUN UNIX, IBM AIX, NeXT NextStep, and MS-DOS systems.

  10. Effect of support flexibility and damping on the dynamic response of a single mass flexible rotor in elastic bearings

    NASA Technical Reports Server (NTRS)

    Kirk, R. G.; Gunter, E. J.

    1972-01-01

    A steady state analysis of the shaft and the bearing housing motion was made by assuming synchronous precession of the system. The conditions under which the support system would act as a dynamic vibration absorber at the rotor critical speed were studied; plots of the rotor and support amplitudes, phase angles, and forces transmitted were evaluated by the computer, and the performance curves were automatically plotted by a CalComp plotter unit. Curves are presented on the optimization of the support housing characteristics to attenuate the rotor unbalance response over the entire rotor speed range. The complete transient motion including rotor unbalance was examined by integrating the equations of motion numerically using a modified fourth order Runge-Kutta procedure, and the resulting whirl orbits were plotted by the CalComp plotter unit. The results of the transient analysis are discussed with regards to the design optimization procedure derived from the steady-state analysis.
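
    A classical fourth-order Runge-Kutta step of the kind used for the transient integration can be sketched as follows. This is a generic illustration with a simple damped oscillator; the paper's modified scheme and rotor model are not reproduced.

      # Classical RK4 step for y' = f(t, y), applied to a lightly damped
      # oscillator y'' + 0.1 y' + y = 0 written as a first-order system.
      import numpy as np

      def rk4_step(f, t, y, h):
          """Advance the state y one step of size h."""
          k1 = f(t, y)
          k2 = f(t + h / 2.0, y + h / 2.0 * k1)
          k3 = f(t + h / 2.0, y + h / 2.0 * k2)
          k4 = f(t + h, y + h * k3)
          return y + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

      f = lambda t, y: np.array([y[1], -0.1 * y[1] - y[0]])
      y = np.array([1.0, 0.0])        # initial displacement and velocity
      for step in range(100):
          y = rk4_step(f, step * 0.1, y, 0.1)
      print(y)                        # state after t = 10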

  11. Development of a data management front end for use with a LANDSAT based information system. [assessing gypsy moth defoliation damage in Pennsylvania

    NASA Technical Reports Server (NTRS)

    Turner, B. J. (Principal Investigator)

    1982-01-01

    A user-friendly front end was constructed to facilitate access to the LANDSAT mosaic data base supplied by JPL and to process both LANDSAT and ancillary data. Archival and retrieval techniques were developed to efficiently handle this data base and make it compatible with the requirements of the Pennsylvania Bureau of Forestry. Procedures are ready for: (1) forming the forest/nonforest mask in ORSER compressed map format using GSFC-supplied classification procedures; (2) registering data from a new (defoliated) scene to the mask (which may involve mosaicking if the area encompasses two LANDSAT scenes); (3) producing a masked new data set using the MASK program; (4) analyzing this data set to produce a map showing degrees of defoliation, output on the Versatec plotter; and (5) producing color composite maps by a diazo-type process.

  12. User's manual for THPLOT, A FORTRAN 77 Computer program for time history plotting

    NASA Technical Reports Server (NTRS)

    Murray, J. E.

    1982-01-01

    A general purpose FORTRAN 77 computer program (THPLOT) for plotting time histories using Calcomp pen plotters is described. The program is designed to read a time history data file and to generate time history plots for selected time intervals and/or selected data channels. The capabilities of the program are described. The card input required to define the plotting operation is described and examples of card input and the resulting plotted output are given. The examples are followed by a description of the printed output, including both normal output and error messages. Lastly, implementation of the program is described. A complete listing of the program with reference maps produced by the CDC FTN 5.0 compiler is included.

  13. Some Automated Cartography Developments at the Defense Mapping Agency.

    DTIC Science & Technology

    1981-01-01

    on a pantographic router, creating a laminate step model which was moulded in plaster for carving into a terrain model. This section will trace DMA’s...offering economical automation. Precision flatbed Concord plotters were brought into DMA with sufficiently programmable control computers to perform these

  14. Laboratory Connections: Review of Two Commercial Interfacing Packages.

    ERIC Educational Resources Information Center

    Powers, Michael H.

    1989-01-01

    Evaluates two Apple II interfacing packages designed to measure pH: (1) "Experiments in Chemistry" by HRM Software and (2) "Voltage Plotter III" by Vernier Software. Provides characteristics and screen dumps of each package. Reports both systems are suitable for high school or beginning college laboratories. (MVL)

  15. A statistical data analysis and plotting program for cloud microphysics experiments

    NASA Technical Reports Server (NTRS)

    Jordan, A. J.

    1981-01-01

    The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.

  16. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  17. Oklahoma's Mobile Computer Graphics Laboratory.

    ERIC Educational Resources Information Center

    McClain, Gerald R.

    This Computer Graphics Laboratory houses an IBM 1130 computer, U.C.C. plotter, printer, card reader, two key punch machines, and seminar-type classroom furniture. A "General Drafting Graphics System" (GDGS) is used, based on repetitive use of basic coordinate and plot generating commands. The system is used by 12 institutions of higher education…

  18. Quality Improvement: Does the Air Force Systems Command Practice What It Preaches

    DTIC Science & Technology

    1990-03-01

    without his assistance in getting supplies, computers, and plotters. Another special thanks goes to my committee chairman, Dr Stephen Blank, who provided...N.J.: Prentice-Hall, 1986), 166. 5. Ibid., 181. 6. Sidney Siegel, Nonparametric Statistics for the Behavioral Sciences (New York: McGraw-Hill, 1956

  19. Project Solo; Newsletter Number Seven.

    ERIC Educational Resources Information Center

    Pittsburgh Univ., PA. Project Solo.

    The current curriculum modules under development at Project Solo are listed. The modules are grouped under the subject matter that they are designed to teach--algebra II, biology, calculus, chemistry, computer science, 12th grade math, physics, social science. Special programs written for use on the Hewlett-Packard Plotter are listed that may be…

  20. U.S. Strategic Communication Policy Toward the South American Andean Ridge

    DTIC Science & Technology

    2012-02-17

    military’s links to paramilitary groups. Nonetheless, just before President Alvaro Uribe visited President Bush in August 2005, the State...coup plotters only to find that Chavez was back in power. In Colombia, Uribe successfully moved to change the Colombian constitution to allow for his

  1. PPFIA1 is upregulated in liver metastasis of breast cancer and is a potential poor prognostic indicator of metastatic relapse.

    PubMed

    Yang, Jing; Wu, Ning-Ni; Huang, De-Jia; Luo, Yao-Chang; Huang, Jun-Zhen; He, Hai-Yuan; Lu, Hai-Lin; Song, Wen-Ling

    2017-07-01

    Although the oncogenic role of PPFIA1 (liprin-α1) in breast cancer has been reported, whether its dysregulation is associated with metastasis risk or survival outcomes in breast cancer patients is not clear. Our primary data showed that PPFIA1 expression was significantly higher in liver metastatic breast tumors than in the primary tumors. We then pooled previously annotated genomic data to assess the prognostic value of PPFIA1 for distant metastasis-free survival, the risk of metastatic relapse, and metastatic relapse-free survival in breast cancer patients by data mining in two large databases, Kaplan-Meier plotter and bc-GenExMiner 4.0. Results from Kaplan-Meier plotter showed that although high PPFIA1 expression was generally associated with decreased distant metastasis-free survival in estrogen receptor+ patients, subgroup analysis only confirmed a significant association in the estrogen receptor+/N- (nodal negative) group (median survival, high PPFIA1 group vs low PPFIA1 cohort: 191.21 vs 236.22 months; hazard ratio: 2.23, 95% confidence interval: 1.42-3.5, p < 0.001), but not in the estrogen receptor+/N+ (nodal positive) group (hazard ratio: 1.63, 95% confidence interval: 0.88-3.03, p = 0.12). In estrogen receptor- patients, there was no association between PPFIA1 expression and distant metastasis-free survival, whether in the Nm (nodal status mixed), N-, or N+ subgroups. In bc-GenExMiner 4.0, Nottingham Prognostic Index- and Adjuvant! Online-adjusted analyses validated the independent prognostic value of PPFIA1 for metastatic risk in estrogen receptor+/N- patients. Based on these findings, we infer that high PPFIA1 expression might be an independent prognostic indicator of increased metastatic relapse risk in patients with estrogen receptor+/N- breast cancer, but not in estrogen receptor+/N+ or estrogen receptor- patients.
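
    The kind of analysis the Kaplan-Meier plotter performs, splitting patients by expression level, estimating survival curves, and testing the difference, can be sketched with the lifelines package. The data below are synthetic and purely illustrative; no result here corresponds to the study's figures.

      # Kaplan-Meier curves and a log-rank test for two expression groups
      # (illustrative sketch with synthetic follow-up data).
      import numpy as np
      from lifelines import KaplanMeierFitter
      from lifelines.statistics import logrank_test

      rng = np.random.default_rng(1)
      t_high = rng.exponential(180.0, 80)   # follow-up times (months), group A
      e_high = rng.random(80) < 0.6         # relapse observed? group A
      t_low = rng.exponential(230.0, 80)    # follow-up times (months), group B
      e_low = rng.random(80) < 0.5          # relapse observed? group B

      km = KaplanMeierFitter()
      km.fit(t_high, event_observed=e_high, label="high expression")
      print("median survival (high):", km.median_survival_time_)

      result = logrank_test(t_high, t_low,
                            event_observed_A=e_high, event_observed_B=e_low)
      print("log-rank p =", result.p_value)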

  2. Lead isotope data bank; 2,624 samples and analyses cited

    USGS Publications Warehouse

    Doe, Bruce R.

    1976-01-01

    The Lead Isotope Data Bank (LIDB) was initiated to facilitate plotting data. Therefore, the Bank reflects data most often used in plotting rather than comprising a comprehensive tabulation of lead isotope data. Up until now, plotting was done using card decks processed by computer, with tapes plotted by a Gerber plotter and, more recently, a CRT in batch mode. Lack of a uniform format for sample identification was not a great impediment. With the increase in the size of the bank, hand sorting is becoming prohibitive, and plans are underway to put the bank into a uniform format on disk with a card backup so that it may be accessed by use of IRIS on the DEC 10 computer at the U.S.G.S. facility in Denver. Plots will be constructed on a CRT. Entry of the bank into the IRIS accessing program is scheduled for completion in FY 1976.

  3. Computer user's manual for a generalized curve fit and plotting program

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.; Beadle, B. D., II; Dolerhie, B. D., Jr.; Owen, J. W.

    1973-01-01

    A FORTRAN-coded program has been developed for generating plotted output graphs on 8-1/2 by 11-inch paper. The program is designed to be used by engineers, scientists, and non-programming personnel on any IBM 1130 system that includes a 1627 plotter. The program has been written to provide a fast and efficient method of displaying plotted data without having to generate any additional programs. Various output options are available to the program user for displaying data in four different types of formatted plots. These options include discrete, linear, continuous, and histogram graphical outputs. The manual contains information about the use and operation of this program. A mathematical description of the least squares goodness-of-fit test is presented. A program listing is also included.
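
    The least-squares fit with a simple goodness-of-fit measure that such a program computes can be sketched as follows. This is illustrative NumPy, not the IBM 1130 FORTRAN code, and the data points are made up.

      # Least-squares straight-line fit with an R^2 goodness-of-fit measure
      # (illustrative sketch).
      import numpy as np

      x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
      y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 11.0])

      coeffs = np.polyfit(x, y, deg=1)        # least-squares line fit
      y_fit = np.polyval(coeffs, x)

      ss_res = np.sum((y - y_fit) ** 2)       # residual sum of squares
      ss_tot = np.sum((y - y.mean()) ** 2)    # total sum of squares
      print("slope, intercept:", coeffs.round(3))
      print("R^2 =", 1.0 - ss_res / ss_tot)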

  4. Digital image transformation and rectification of spacecraft and radar images

    USGS Publications Warehouse

    Wu, S.S.C.

    1985-01-01

    Digital image transformation and rectification can be described in three categories: (1) digital rectification of spacecraft pictures on workable stereoplotters; (2) digital correction of radar image geometry; and (3) digital reconstruction of shaded relief maps and perspective views including stereograms. Digital rectification can make high-oblique pictures workable on stereoplotters that would otherwise not accommodate such extreme tilt angles. It also enables panoramic line-scan geometry to be used to compile contour maps with photogrammetric plotters. Rectifications were digitally processed on both Viking Orbiter and Lander pictures of Mars as well as radar images taken by various radar systems. By merging digital terrain data with image data, perspective and three-dimensional views of Olympus Mons and Tithonium Chasma, also of Mars, are reconstructed through digital image processing. © 1985.

  5. Topoclimatic aspects of developmental suitability in the metropolitan landscape

    Treesearch

    Spencer A. Joyner, Jr.; Raymond S. Bradley; Robert E. Reiter, Jr.

    1977-01-01

    A computer-based procedure for geographically identifying, rating, and ranking topoclimatic characteristics is described. The influences of topography, land use, and soils are considered and combined into a single composite topoclimate developmental suitability map drawn by a CalComp plotter. By allocating development to the most suitable topoclimate areas, the long-...

  6. Credibility of the threat from a radiological dispersal device by terrorists within the United States

    DTIC Science & Technology

    2016-06-10

    on assessing the probability of an RDD attack, otherwise known as a "dirty bomb," within the US and its territories. Currently, there is an...officials arrested plotters planning to employ a dirty bomb utilizing americium obtained from smoke detectors.7 Officials thought it extremely unlikely

  7. Simulating forest pictures by impact printers

    Treesearch

    Elliot L. Amidon; E. Joyce Dye

    1978-01-01

    Two mechanical devices that are mainly used to print computer output in text form can simulate pictures of terrain and forests. The line printer, which is available for batch processing at many computer installations, can approximate halftones by using overstruck characters to produce successively larger "dots." The printer/plotter, which is normally used as...

  8. An Architectural Design System Based on Computer Graphics.

    ERIC Educational Resources Information Center

    MacDonald, Stephen L.; Wehrli, Robert

    The recent developments in computer hardware and software are presented to inform architects of this design tool. Technical advancements in equipment include--(1) cathode ray tube displays, (2) light pens, (3) print-out and photo copying attachments, (4) controls for comparison and selection of images, (5) chording keyboards, (6) plotters, and (7)…

  9. Cool-and Unusual-CAD Applications

    ERIC Educational Resources Information Center

    Calhoun, Ken

    2004-01-01

    This article describes several very useful applications of AutoCAD that may lie outside the normal scope of application. AutoCAD commands used in this article are based on AutoCAD 2000I. The author and his students used a Hewlett Packard 750C DesignJet plotter for plotting. (Contains 5 figures and 5 photos.)

  10. Computer Exercises in Systems and Fields Experiments

    ERIC Educational Resources Information Center

    Bacon, C. M.; McDougal, J. R.

    1971-01-01

    Laboratory activities give students an opportunity to interact with computers in modes ranging from remote terminal use in laboratory experimentation to the direct hands-on use of a small digital computer with disk memory and on-line plotter, and finally to the use of a large computer under closed-shop operation. (Author/TS)

  11. Radar, target and ranging

    NASA Astrophysics Data System (ADS)

    1984-09-01

    This Test Operations Procedure (TOP) provides conventional test methods employing conventional test instrumentation for testing conventional radars. Single tests and subtests designed to test radar components, transmitters, receivers, antennas, etc., and system performance are conducted with single item instruments such as meters, generators, attenuators, counters, oscillators, plotters, etc., and with adequate land areas for conducting field tests.

  12. Analysis of a dual-reflector antenna system using physical optics and digital computers

    NASA Technical Reports Server (NTRS)

    Schmidt, R. F.

    1972-01-01

    The application of physical-optics diffraction theory to a deployable dual-reflector geometry is discussed. The methods employed are not restricted to the Conical-Gregorian antenna, but apply in a general way to dual and even multiple reflector systems. Complex vector wave methods are used in the Fresnel and Fraunhofer regions of the reflectors. Field amplitude, phase, polarization data, and time-average Poynting vectors are obtained via an IBM 360/91 digital computer. Focal region characteristics are plotted with the aid of a CalComp plotter. Comparisons between the GSFC Huygens wavelet approach, JPL measurements, and JPL computer results based on the near-field spherical wave expansion method are made wherever possible.

  13. Method and apparatus for measuring areas of photoelectric cells and photoelectric cell performance parameters

    DOEpatents

    Osterwald, C.R.; Emery, K.A.

    1984-05-29

    A laser scanning system for scanning the surface of a photovoltaic cell in a precise, stepped raster pattern includes electric current detecting and measuring equipment for sensing the current response of the scanned cell to the laser beam at each stepped irradiated spot or pixel on the cell surface. A computer is used to control and monitor the raster position of the laser scan as well as monitoring the corresponding current responses, storing this data, operating on it, and for feeding the data to a graphical plotter for producing a visual, color-coded image of the current response of the cell to the laser scan. A translation platform driven by stepper motors in precise X and Y distances holds and rasters the cell being scanned under a stationary spot-focused laser beam.
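
    The scan-and-record loop the patent describes is straightforward to sketch. The following Python fragment is a hypothetical mock-up of the control flow only; the stage-driver and current-reading stubs are invented placeholders, not the patented apparatus:

        import numpy as np
        import matplotlib.pyplot as plt

        NX, NY = 64, 64                      # raster dimensions in pixels

        def move_stage(i, j):
            # Stub: a real system would command the X/Y stepper motors here.
            pass

        def read_cell_current(i, j):
            # Stub: simulate a smooth response with an off-center hot spot;
            # a real system would read the ammeter at each pixel.
            r = np.hypot(i - 40, j - 20)
            return 1.0e-3 * np.exp(-(r / 25.0) ** 2)

        response = np.zeros((NY, NX))
        for j in range(NY):                  # step the platform row by row
            for i in range(NX):
                move_stage(i, j)
                response[j, i] = read_cell_current(i, j)

        # Color-coded image of the current response, analogous to the
        # plotter output described in the patent.
        plt.imshow(response, cmap="viridis", origin="lower")
        plt.colorbar(label="current response (A)")
        plt.show()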

  14. Method and apparatus for measuring areas of photoelectric cells and photoelectric cell performance parameters

    DOEpatents

    Osterwald, Carl R.; Emery, Keith A.

    1987-01-01

    A laser scanning system for scanning the surface of a photovoltaic cell in a precise, stepped raster pattern includes electric current detecting and measuring equipment for sensing the current response of the scanned cell to the laser beam at each stepped irradiated spot or pixel on the cell surface. A computer is used to control and monitor the raster position of the laser scan as well as monitoring the corresponding current responses, storing this data, operating on it, and for feeding the data to a graphic plotter for producing a visual, color-coded image of the current response of the cell to the laser scan. A translation platform driven by stepper motors in precise X and Y distances holds and rasters the cell being scanned under a stationary spot-focused laser beam.

  15. U.S. Army Natick Soldier Research, Development & Engineering Center Testing Facilities And Equipment. Second Edition

    DTIC Science & Technology

    2011-04-01

    [Table-of-contents fragments: Freeze Dryer; High-Pressure Processing; Microwave Digestive...; PP1 Power Platform Energy Analyzer; Quintox Gas Combustion Analyzer; FLIR Systems SC2000 Thermacam Handheld IR.] ...electronically directly to the contractor or printed on plotter paper, oak tag, or on CD; alloy steel, stainless steel, aluminum, copper and copper alloys

  16. A Comparison of Product Realization Frameworks

    DTIC Science & Technology

    1993-10-01

    software (integrated FrameMaker). Also included are BOLD for on-line documentation delivery, printer/plotter support, and network licensing support. AMPLE...are built with DSS. Documentation tools include an on-line information system (BOLD), text editing (Notepad), word processing (integrated FrameMaker)...within an application. FrameMaker is fully integrated with the Falcon Framework to provide consistent documentation capabilities within engineering

17. HP-9810A calculator programs for plotting the 2-dimensional motion of cylindrical payloads relative to the shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Wilson, S. W.

    1976-01-01

    The HP-9810A calculator programs described provide the capability to generate HP-9862A plotter displays which depict the apparent motion of a free-flying cylindrical payload relative to the shuttle orbiter body axes by projecting the payload geometry into the orbiter plane of symmetry at regular time intervals.

18. User's manual for the VAX-Gerber link software package. Revision 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isobe, G.W.

    1985-10-01

    This manual provides a user the information necessary to run the VAX-Gerber link software package. It is expected that the user already knows how to login to the VAX, and is familiar with the Gerber Photo Plotter. It is also highly desirable that the user be familiar with the full screen editor on the VAX, EDT.

  19. Power and Energy Considerations at Forward Operating Bases (FOBs)

    DTIC Science & Technology

    2010-06-16

    systems • Anticipated additional plug loads by users – Personal Computers and Gaming Devices – Coffee Pots – Refrigerators – Lights – Personal Heaters...effort was made to account for the significant amount of equipment that consumes power not on the unit’s MTOE (printers, plotters, coffee pots, etc...50 Warfighters including billeting, kitchen, laundry, shower, latrines, and new wastewater treatment system Capability/impact: Compact, lightweight

  20. Computer program for calculating and plotting fire direction and rate of spread.

    Treesearch

    James E. Eenigenburg

    1987-01-01

    Presents an analytical procedure that uses a FORTRAN 77 program to estimate fire direction and rate of spread. The program also calculates the variability of these parameters, both for subsections of the fire and for the fire as a whole. An option in the program allows users with a CALCOMP plotter to obtain a map of the fire with spread vectors.

  1. A computer graphics display and data compression technique

    NASA Technical Reports Server (NTRS)

    Teague, M. J.; Meyer, H. G.; Levenson, L. (Editor)

    1974-01-01

    The computer program discussed is intended for the graphical presentation of a general dependent variable X that is a function of two independent variables, U and V. The required input to the program is the variation of the dependent variable with one of the independent variables for various fixed values of the other. The computer program is named CRP, and the output is provided by the SD 4060 plotter. Program CRP is an extremely flexible program that offers the user a wide variety of options. The dependent variable may be presented in either a linear or a logarithmic manner. Automatic centering of the plot is provided in the ordinate direction, and the abscissa is scaled automatically for a logarithmic plot. A description of the carpet plot technique is given along with the coordinate system used in the program. Various aspects of the program logic are discussed and detailed documentation of the data card format is presented.
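
    For readers unfamiliar with the technique, a carpet plot displays X(U, V) as a family of curves in which the abscissa of each fixed-V curve is offset in proportion to V, so no explicit horizontal axis is drawn. A minimal modern sketch in Python/matplotlib (the example function and offset constant are illustrative, not taken from program CRP):

        import numpy as np
        import matplotlib.pyplot as plt

        # Example dependent variable X as a function of U and V
        def X(u, v):
            return u**2 + 3.0 * v

        u = np.linspace(0.0, 2.0, 50)
        k = 1.5                          # abscissa offset per unit of V
        for v in [0.0, 0.5, 1.0, 1.5]:
            # Carpet technique: shift each fixed-V curve horizontally
            # by k*v, so the U "axis" is implied rather than drawn.
            plt.plot(u + k * v, X(u, v), label=f"V = {v}")

        plt.ylabel("X")
        plt.xticks([])                   # no explicit abscissa scale
        plt.legend()
        plt.show()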

  2. Alaska Interim Land Cover Mapping Program; final report

    USGS Publications Warehouse

    Fitzpatrick-Lins, Katherine; Doughty, E.F.; Shasby, Mark; Benjamin, Susan

    1989-01-01

    In 1985, the U.S. Geological Survey initiated a research project to develop an interim land cover data base for Alaska as an alternative to the nationwide Land Use and Land Cover Mapping Program. The Alaska Interim Land Cover Mapping Program was subsequently created to develop methods for producing a series of land cover maps that utilized the existing Landsat digital land cover classifications produced by and for the major land management agencies for mapping the vegetation of Alaska. The program was successful in producing digital land cover classifications and statistical summaries using a common statewide classification and in reformatting these data to produce 1:250,000-scale quadrangle-based maps directly from the Scitex laser plotter. A Federal and State agency review of these products found considerable user support for the maps. Presently the Geological Survey is committed to digital processing of six to eight quadrangles each year.

  3. Van Allen Probes Science Gateway and Space Weather Data Processing

    NASA Astrophysics Data System (ADS)

    Romeo, G.; Barnes, R. J.; Weiss, M.; Fox, N. J.; Mauk, B.; Potter, M.; Kessel, R.

    2014-12-01

    The Van Allen Probes Science Gateway acts as a centralized interface to the instrument Science Operation Centers (SOCs), provides mission planning tools, and hosts a number of science related activities such as the mission bibliography. Most importantly, the Gateway acts as the primary site for processing and delivering the VAP Space Weather data to users. Over the past year, the web-site has been completely redesigned with a focus on easier navigation and improvements to existing tools such as the orbit plotter, position calculator, and magnetic footprint tool. In addition, a new data plotting facility, based on HTML5, has been added, which allows users to interactively plot Van Allen Probes summary and space weather data. The user can tailor the tool to display exactly the plot they wish to see and then share this with other users via either a URL or by QR code. Various types of plots can be created, including simple time series, data plotted as a function of orbital location, and time versus L-shell. We discuss the new Van Allen Probes Science Gateway and the Space Weather Data Pipeline.

  4. Computer-generated mineral commodity deposit maps

    USGS Publications Warehouse

    Schruben, Paul G.; Hanley, J. Thomas

    1983-01-01

    This report describes an automated method of generating deposit maps of mineral commodity information. In addition, it serves as a user's manual for the authors' mapping system. Procedures were developed which allow commodity specialists to enter deposit information, retrieve selected data, and plot deposit symbols in any geographic area within the conterminous United States. The mapping system uses both micro- and mainframe computers. The microcomputer is used to input and retrieve information, thus minimizing computing charges. The mainframe computer is used to generate map plots which are printed by a Calcomp plotter. The Selector V data base system is employed for input and retrieval on the microcomputer. A general mapping program (Genmap) was written in FORTRAN for use on the mainframe computer. Genmap can plot fifteen symbol types (for point locations) in three sizes. The user can assign symbol types to data items interactively. Individual map symbols can be labeled with a number or the deposit name. Genmap also provides several geographic boundary file and window options.

  5. Coverage by land, sea, and airplane surveys, 1900-1967.

    NASA Technical Reports Server (NTRS)

    Fabiano, E.; Cain, S. J.

    1971-01-01

    The worldwide coverage of the earth by land, sea, and aircraft magnetic surveys since the beginning of the 20th century is shown on three world maps for surface surveys spanning the periods of 1900-1930, 1930-1955, and 1955-1967, respectively, on a fourth map for ship-towed magnetometer surveys performed after 1956, and on a fifth map for 1953-1966 airborne survey data. The technique used, involving a position plotting of each measurement with a microfilm plotter, results in the appearance of heavily surveyed regions as completely darkened areas. The coverage includes measurements at about 100,000 land stations, airborne measurements at over 90,000 points, and marine measurements at over 25,000 points. The marine measurements cover over 1,000,000 km of trackline.

  6. Validation of GC and HPLC systems for residue studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, M.

    1995-12-01

    For residue studies, GC and HPLC system performance must be validated prior to and during use. One excellent measure of system performance is the standard curve and the associated chromatograms used to construct that curve. The standard curve is a model of system response to an analyte over a specific time period, and is prima facie evidence of system performance beginning at the autosampler and proceeding through the injector, column, detector, electronics, data-capture device, and printer/plotter. This tool measures the performance of the entire chromatographic system; its power negates most of the benefits associated with costly and time-consuming validation of individual system components. Other measures of instrument and method validation will be discussed, including quality control charts and experimental designs for method validation.
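
    A standard curve of the kind described is typically a least-squares calibration of detector response against analyte concentration, used both to quantify unknowns and to monitor system performance over time. A hedged sketch in Python; the numbers and units are illustrative only:

        import numpy as np

        # Calibration standards: analyte concentration (ng/mL) vs response
        conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
        resp = np.array([2.1, 10.4, 20.9, 104.8, 209.5])

        # Linear standard curve: response = slope * conc + intercept
        slope, intercept = np.polyfit(conc, resp, deg=1)
        pred = slope * conc + intercept
        r2 = 1.0 - np.sum((resp - pred) ** 2) / np.sum((resp - resp.mean()) ** 2)
        print(f"slope={slope:.4f}, intercept={intercept:.4f}, R^2={r2:.5f}")

        # Back-calculate the concentration of an unknown from its response
        unknown_resp = 57.3
        print("unknown conc:", (unknown_resp - intercept) / slope)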

7. Microcomputer computation of water quality discharges

    USGS Publications Warehouse

    Helsel, Dennis R.

    1983-01-01

    A fully prompted program (SEDQ) has been developed to calculate daily and instantaneous water quality (QW) discharges. It is written in a version of BASIC, and requires inputs of gage heights, a discharge rating curve, shifts, and water quality concentration information. Concentration plots may be modified interactively using the display screen. Semi-logarithmic plots of concentration and water quality discharge are output to the display screen, and optionally to plotters. A summary table of data is also output. SEDQ could be a model program for micro and minicomputer systems likely to be in use within the Water Resources Division, USGS, in the near future. The daily discharge-weighted mean concentration is one output from SEDQ. It is defined in this report, differentiated from the currently used mean concentration, and designated the 'equivalent concentration.' (USGS)
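
    A discharge-weighted mean concentration weights each concentration value by the water discharge at the time it was sampled. The standard formula, sum(Q_i * C_i) / sum(Q_i), is shown below for illustration; consult the report for SEDQ's exact definition of the 'equivalent concentration':

        import numpy as np

        def discharge_weighted_mean(q, c):
            """Discharge-weighted mean concentration: sum(Q*C) / sum(Q)."""
            q = np.asarray(q, dtype=float)
            c = np.asarray(c, dtype=float)
            return np.sum(q * c) / np.sum(q)

        # Instantaneous discharges (cfs) and concentrations (mg/L) over a day
        q = [120.0, 340.0, 580.0, 410.0, 190.0]
        c = [12.0, 35.0, 60.0, 41.0, 18.0]
        print(discharge_weighted_mean(q, c))   # high flows dominate the mean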

  8. Engineering simulation development and evaluation of the two-segment noise abatement approach conducted in the B-727-222 flight simulator

    NASA Technical Reports Server (NTRS)

    Nylen, W. E.

    1974-01-01

    Profile modification as a means of reducing ground level noise from jet aircraft in the landing approach is evaluated. A flight simulator was modified to incorporate the cockpit hardware which would be in the prototype airplane installation. The two-segment system operational and aircraft interface logic was accurately emulated in software. Programs were developed to permit data to be recorded in real time on the line printer, a 14-channel oscillograph, and an x-y plotter. The two-segment profile and procedures which were developed are described with emphasis on operational concepts and constraints. The two-segment system operational logic and the flight simulator capabilities are described. The findings influenced the ultimate system design and aircraft interface.

  9. Photogrammetry of Apollo 15 photography, part C

    NASA Technical Reports Server (NTRS)

    Wu, S. S. C.; Schafer, F. J.; Jordan, R.; Nakata, G. M.; Derick, J. L.

    1972-01-01

    In the Apollo 15 mission, a mapping camera system, a 61-cm optical-bar high-resolution panoramic camera, and a laser altimeter were used. The panoramic camera is described; it has several distortion sources, such as the cylindrical shape of the negative film surface, the scanning action of the lens, the image motion compensator, and the spacecraft motion. Film products were processed on a specifically designed analytical plotter.

  10. Tactile communication using a CO(2) flux stimulation for blind or deafblind people.

    PubMed

    da Cunha, Jose Carlos; Bordignon, Luiz Alberto; Nohama, Percy

    2010-01-01

    This paper describes a tactile stimulation system for producing nonvisual image patterns for blind or deafblind people. The stimulator delivers a pulsatile CO(2) flux directed at the user's skin through a needle coupled to a 2-D tactile plotter. The flux-tactile plotter operates with two step motors mounted on a wooden structure, controlled by a program developed to produce alphanumerical characters and geometric figures of different sizes and speeds, which will be used to investigate the psychophysical properties of this kind of tactile communication. CO(2) is provided by a cylinder that delivers a stable flux, which is converted to a pulsatile mode through a high-frequency solenoid valve that can chop it at up to 1 kHz. Also, the system temperature is controlled by a Peltier-based device. Tests on the prototype indicate that the system is a valuable tool for investigating the psychophysical properties of the skin in response to stimulation by a CO(2) jet, allowing quantitative and qualitative analysis as a function of stimulation parameters. With the system developed, it was possible to plot the proposed geometric figures, triangles, rectangles, and octagons, in different sizes and at different speeds, and to verify the control of the frequency of the CO(2) jet stimuli.

  11. Immigrant Integration: A Missing Component of Homeland Security Strategy and Policy

    DTIC Science & Technology

    2010-03-01

    Kobach, 2007). JFK airport in New York (Kobach, 2007) The four JFK terrorists include two nationals of Guyana, one of Trinidad, and one former...words of the terrorist themselves. In one conversation taped by the FBI, Defreitas (the lead plotter of the thwarted attack at JFK airport in...another recorded conversation with his conspirators in May 2007, Defreitas compared the plot to attack JFK airport with the September 11, 2001

  12. U.S.-China Counterterrorism Cooperation: Issues for U.S. Policy

    DTIC Science & Technology

    2010-07-08

    2005 and 2006 raised U.S. concerns, despite the SCO's claim to be a counterterrorism group. In addition to Mongolia, the countries of India, Pakistan... Pakistan to counter terrorists and the Taliban increased after the attack in Mumbai, India, in November 2008. Pakistan's Interior Minister confirmed...in February 2009 that some of the plotters were in Pakistan. The CIA reportedly brokered intelligence-sharing between India and Pakistan.130 Also in

  13. HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis

    USGS Publications Warehouse

    Sloto, Ronald A.; Crouse, Michele Y.

    1996-01-01

    HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge, and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
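
    Of the three methods, the local-minimum method is the easiest to sketch: each day whose discharge is the lowest within a centered window is marked as a base-flow point, and base flow between those points is linearly interpolated. A simplified illustration in Python; the window handling in the real HYSEP program follows its published interval rules, which this sketch does not reproduce:

        import numpy as np

        def local_minimum_baseflow(q, half_window):
            """Simplified local-minimum hydrograph separation.

            q: daily mean discharge series; half_window: days on each
            side to examine when testing for a local minimum.
            """
            q = np.asarray(q, dtype=float)
            n = len(q)
            mins = [i for i in range(n)
                    if q[i] == q[max(0, i - half_window):
                                 i + half_window + 1].min()]
            # Connect the local minima by straight lines to form the
            # base flow, then cap it at the observed discharge.
            base = np.interp(np.arange(n), mins, q[mins])
            return np.minimum(base, q)

        q = [10, 9, 8, 30, 80, 45, 25, 15, 12, 11, 10, 28, 60, 33, 20, 14]
        print(local_minimum_baseflow(q, 3).round(1))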

  14. Producing Alaska interim land cover maps from Landsat digital and ancillary data

    USGS Publications Warehouse

    Fitzpatrick-Lins, Katherine; Doughty, Eileen Flanagan; Shasby, Mark; Loveland, Thomas R.; Benjamin, Susan

    1987-01-01

    In 1985, the U.S. Geological Survey initiated a research program to produce 1:250,000-scale land cover maps of Alaska using digital Landsat multispectral scanner data and ancillary data, and to evaluate the potential of establishing a statewide land cover mapping program using this approach. The geometrically corrected and resampled Landsat pixel data are registered to a Universal Transverse Mercator (UTM) projection, along with arc-second digital elevation model data used as an aid in the final computer classification. Area summaries of the land cover classes are extracted by merging the Landsat digital classification files with the U.S. Bureau of Land Management's Public Land Survey digital file. Registration of the digital land cover data is verified and control points are identified so that a laser plotter can produce screened film separates for printing the classification data at map scale directly from the digital file. The final land cover classification is retained both as a color map at 1:250,000 scale registered to the U.S. Geological Survey base map, with area summaries by township and range on the reverse, and as a digital file where it may be used as a category in a geographic information system.

  15. MOLECULAR DESIGNER: an interactive program for the display of protein structure on the IBM-PC.

    PubMed

    Hannon, G J; Jentoft, J E

    1985-09-01

    A BASIC interactive graphics program has been developed for the IBM-PC which utilizes the graphics capabilities of that computer to display and manipulate protein structure from coordinates. Structures may be generated from typed files, or from Brookhaven National Laboratories' Protein Data Bank data tapes. Once displayed, images may be rotated, translated and expanded to any desired size. Figures may be viewed as ball-and-stick or space-filling models. Calculated multiple-point perspective may also be added to the display. Docking manipulations are possible since more than a single figure may be displayed and manipulated simultaneously. Further, stereo images and red/blue three-dimensional images may be generated using the accompanying DESIPLOT program and an HP-7475A plotter. A version of the program is also currently available for the Apple Macintosh. Full implementation on the Macintosh requires 512 K and at least one disk drive. Otherwise this version is essentially identical to the IBM-PC version described herein.

  16. Grid-coordinate generation program

    USGS Publications Warehouse

    Cosner, Oliver J.; Horwich, Esther

    1974-01-01

    This program description of the grid-coordinate generation program is written for computer users who are familiar with digital aquifer models. The program computes the coordinates for a variable grid used in the 'Pinder Model' (a finite-difference aquifer simulator) for input to the CalComp GPCP (general purpose contouring program). The program adjusts the y-value by a user-supplied constant in order to transpose the origin of the model grid from the upper left-hand corner to the lower left-hand corner of the grid. The user has the options of (1) choosing the boundaries of the plot; (2) adjusting the z-values (altitudes) by a constant; (3) deleting superfluous z-values; and (4) subtracting the simulated surfaces from each other to obtain the decline. Output of this program includes the fixed-format CNTL data cards and the other data cards required for input to GPCP. The output from GPCP then is used to produce a potentiometric map or a decline map by means of the CalComp plotter.

  17. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweezy, Jeremy Ed

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE-formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron and gamma transport with multi-temperature treatment, static eigenvalue (k-eff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.

  18. Refactoring DIRT

    NASA Astrophysics Data System (ADS)

    Amarnath, N. S.; Pound, M. W.; Wolfire, M. G.

    The Dust InfraRed ToolBox (DIRT - a part of the Web Infrared ToolShed, or WITS, located at http://dustem.astro.umd.edu) is a Java applet for modeling astrophysical processes in circumstellar shells around young and evolved stars. DIRT has been used by the astrophysics community for about 4 years. DIRT uses results from a number of numerical models of astrophysical processes, and has an AWT based user interface. DIRT has been refactored to decouple data representation from plotting and curve fitting. This makes it easier to add new kinds of astrophysical models, use the plotter in other applications, migrate the user interface to Swing components, and modify the user interface to add functionality (for example, SIRTF tools). DIRT is now an extension of two generic libraries, one of which manages data representation and caching, and the second of which manages plotting and curve fitting. This project is an example of refactoring with no impact on user interface, so the existing user community was not affected.

  19. A revised version of Graphic Normative Analysis Program (GNAP) with examples of petrologic problem solving

    USGS Publications Warehouse

    Stuckless, J.S.; VanTrump, G.

    1979-01-01

    A revised version of the Graphic Normative Analysis Program (GNAP) has been developed to allow maximum flexibility in the evaluation of chemical data by the occasional computer user. GNAP calculates CIPW norms, Thornton and Tuttle's differentiation index, Barth's cations, Niggli values, and values for variables defined by the user. Calculated values can be displayed graphically in X-Y plots or ternary diagrams. Plotting can be done on a line printer or CalComp plotter with either weight percent or mole percent data. Modifications in the original program give the user some control over normative calculations for each sample. The number of user-defined variables that can be created from the data has been increased from ten to fifteen. Plotting and calculations can be based on the original data, data adjusted to sum to 100 percent, or data adjusted to sum to 100 percent without water. Analyses for which norms were previously not computable are now computed with footnotes that show excesses or deficiencies in oxides (or volatiles) not accounted for by the norm. This report contains a listing of the computer program, an explanation of the use of the program, and two sample problems.
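
    Ternary diagrams of the kind GNAP draws place each three-component analysis (normalized to 100 percent) inside an equilateral triangle. The standard coordinate conversion is easy to state; here is an illustrative Python version with made-up sample compositions:

        import numpy as np
        import matplotlib.pyplot as plt

        def ternary_xy(a, b, c):
            """Map components (a, b, c) to 2-D ternary plot coordinates.

            Vertices: A at (0, 0), B at (1, 0), C at (0.5, sqrt(3)/2).
            """
            total = a + b + c
            x = (b + 0.5 * c) / total
            y = (np.sqrt(3) / 2.0) * c / total
            return x, y

        # Hypothetical normative proportions for a few samples
        samples = [(40.0, 35.0, 25.0), (10.0, 60.0, 30.0), (33.0, 33.0, 34.0)]
        xs, ys = zip(*(ternary_xy(*s) for s in samples))
        plt.plot([0, 1, 0.5, 0], [0, 0, np.sqrt(3) / 2, 0], "k-")  # frame
        plt.scatter(xs, ys)
        plt.axis("equal")
        plt.axis("off")
        plt.show()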

  20. Space shuttle: Aerodynamic characteristics of a composite booster/040A orbiter launch configuration with fin and booster body configuration effect contribution

    NASA Technical Reports Server (NTRS)

    Ainsworth, R. W.; Johnson, J. C.; Watts, L. L.

    1972-01-01

    An investigation was made of the fin configuration and booster body configuration effects on a composite booster/040A orbiter launch configuration. Aerodynamic performance and stability characteristics in pitch and yaw were obtained. Configurations tested included two stepped cylindrical bodies of different lengths with a conical nose, four fin shapes of various sizes and aspect ratios mounted in different positions around the base of the bodies, two base flare angles, and three 040A orbiter configurations. The orbiter variations included a tailless configuration and two tail sizes. A tailless booster launch configuration with deflected petals (expanded flare sectors) was also tested. The model scale was 0.003366. Data were converted to coefficient form in near real time, punched on cards, and tabulated. The cards, used in conjunction with a Benson-Lehner plotter, provided plotted data. At the end of the test, tabulated input forms were completed for the SADSAC computer program to aid in publishing the final test data report.

  1. Photogrammetry of the Viking-Lander imagery.

    USGS Publications Warehouse

    Wu, S.S.C.; Schafer, F.J.

    1982-01-01

    We have solved the problem of photogrammetric mapping from the Viking Lander photography in two ways: 1) by converting the azimuth and elevation scanning imagery to the equivalent of a frame picture by means of computerized rectification; and 2) by interfacing a high-speed, general-purpose computer to the AS-11A analytical plotter so that all computations of corrections can be performed in real time during the process of model orientation and map compilation. Examples are presented of photographs and maps of Earth and Mars. -from Authors

  2. Blast Noise Prediction. Volume II. BNOISE 3.2 Computer Program Description and Program Listing.

    DTIC Science & Technology

    1981-03-01

    [Garbled OCR fragment omitted.] ...to which the point (XMIN,YMIN) will correspond. SCLE card, Format (A4,2X,G8.3), where PSCALF (columns 7-14) is the plotter scale factor

  3. An Assessment of the Shipboard Training Effectiveness of the Integrated Damage Control Training Technology (IDCTT) Version 3.0

    DTIC Science & Technology

    1998-03-01

    damage control actions in an assigned area of the ship. Reports are received from the On Scene Leader (OSL) and Investigators. Simultaneously, the RPL...control location. A phone talker and plotter will perform in unison with their counterparts in DCC. Key members of the repair party, the OSL and...the obligation of the On Scene Leader (OSL). This experienced petty officer is tasked with directing the ATL's actions and informing the RPL of repair

  4. The Cooperative Engagement Capability CEC Transforming Naval Anti-air Warfare

    DTIC Science & Technology

    2007-01-01

    E-2C Aircraft Acquisition Options,” MR-1517-NAVY (Santa Monica: RAND, 2002), 10. 33 Aegis is not an acronym. The ὰιγίς (Greek) or ægis (Latin) was...the shield of the mythological god Zeus (Jupiter) and thus represents a sure defense. 34 For an overview of USN surface (not air) AAW...real time, without significant delay. In World War II CICs, radar operators, plotters, CIC evaluators, and FDOs acted as “animation artists

  5. Operating System For Numerically Controlled Milling Machine

    NASA Technical Reports Server (NTRS)

    Ray, R. B.

    1992-01-01

    OPMILL program is operating system for Kearney and Trecker milling machine providing fast easy way to program manufacture of machine parts with IBM-compatible personal computer. Gives machinist "equation plotter" feature, which plots equations that define movements and converts equations to milling-machine-controlling program moving cutter along defined path. System includes tool-manager software handling up to 25 tools and automatically adjusts to account for each tool. Developed on IBM PS/2 computer running DOS 3.3 with 1 MB of random-access memory.
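
    The "equation plotter" idea, sampling a parametric equation and emitting machine moves along the resulting path, can be illustrated with a small hypothetical sketch. The G-code-style output here is generic and invented for illustration, not the OPMILL or Kearney and Trecker command set:

        import numpy as np

        def equation_toolpath(fx, fy, t0, t1, steps, feed=200.0):
            """Sample x(t), y(t) and emit generic linear moves along it."""
            lines = ["G21 (mm)", "G90 (absolute)"]
            for t in np.linspace(t0, t1, steps):
                lines.append(f"G1 X{fx(t):.3f} Y{fy(t):.3f} F{feed:.0f}")
            return "\n".join(lines)

        # Example: cut one turn of a 20 mm radius circle
        path = equation_toolpath(lambda t: 20 * np.cos(t),
                                 lambda t: 20 * np.sin(t),
                                 0.0, 2 * np.pi, 36)
        print(path[:120], "...")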

  6. A close-range photogrammetric technique for mapping neotectonic features in trenches

    USGS Publications Warehouse

    Fairer, G.M.; Whitney, J.W.; Coe, J.A.

    1989-01-01

    Close-range photogrammetric techniques and newly available computerized plotting equipment were used to map exploratory trench walls that expose Quaternary faults in the vicinity of Yucca Mountain, Nevada. Small-scale structural, lithologic, and stratigraphic features can be rapidly mapped by the photogrammetric method. This method is more accurate and significantly more rapid than conventional trench-mapping methods, and the analytical plotter is capable of producing cartographic definition of high resolution when detailed trench maps are necessary. -from Authors

  7. Computer program for plotting and fairing wind-tunnel data

    NASA Technical Reports Server (NTRS)

    Morgan, H. L., Jr.

    1983-01-01

    A detailed description of the Langley computer program PLOTWD, which plots and fairs experimental wind-tunnel data, is presented. The program was written for use primarily on the Langley CDC computer and CALCOMP plotters. The fundamental operating features of the program are that the input data are read and written to a random-access file for use during program execution, that the data for a selected run can be sorted and edited to delete duplicate points, and that the data can be plotted and faired using tension splines, least-squares polynomials, or least-squares cubic-spline curves. The most noteworthy feature of the program is the simplicity of the user-supplied input requirements. Several subroutines are also included that can be used to draw grid lines, zero lines, axis scale values and labels, and legends. A detailed description of the program's operational features and of each subprogram is presented. The general application of the program is also discussed, together with the input and output for two typical plot types. A listing of the program code, a user guide, and an output description are presented in appendices. The program has been in use at Langley for several years and has proven to be both easy to use and versatile.
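
    Fairing of the kind PLOTWD performs, smoothing scattered wind-tunnel points with a least-squares cubic spline, has a close modern analogue in SciPy. A brief sketch with made-up data; the smoothing factor s is the knob that trades closeness of fit against fairness of the curve:

        import numpy as np
        from scipy.interpolate import UnivariateSpline
        import matplotlib.pyplot as plt

        # Noisy "wind-tunnel" data: lift coefficient vs angle of attack
        alpha = np.linspace(-4.0, 12.0, 17)
        cl = 0.11 * alpha + 0.25 + np.random.default_rng(1).normal(0, 0.02, 17)

        # Least-squares cubic smoothing spline; larger s = smoother fairing
        spline = UnivariateSpline(alpha, cl, k=3, s=0.01)

        a_fine = np.linspace(alpha.min(), alpha.max(), 200)
        plt.plot(alpha, cl, "o", label="measured")
        plt.plot(a_fine, spline(a_fine), "-", label="faired curve")
        plt.xlabel("angle of attack (deg)")
        plt.ylabel("CL")
        plt.legend()
        plt.show()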

  8. Multichannel seismic-reflection data collected in 1980 in the eastern Chukchi Sea

    USGS Publications Warehouse

    Grantz, Arthur; Mann, Dennis M.; May, Steven D.

    1986-01-01

    The U.S. Geological Survey (USGS) collected approximately 2,652 km of 24-channel seismic-reflection data in early September 1980 over the continental shelf in the eastern Chukchi Sea (Fig. 1). The profiles were collected on the USGS Research Vessel S.P. Lee. The seismic energy source consisted of a tuned array of five airguns with a total volume of 1213 cubic inches of air compressed to approximately 1900 psi. The recording system consisted of a 24-channel, 2400-meter-long streamer with a group interval of 100 m and a GUS (Global Universal Science) model 4200 digital recording instrument. Shots were fired every 50 meters. Navigational control for the survey was provided by a Magnavox integrated navigation system using Transit satellites and Doppler sonar augmented by Loran C (Rho-Rho). A 2-millisecond sampling rate was used in the field; the data were later resampled to 4 milliseconds during the demultiplexing process. A record length of 8 seconds was used. Processing was done at the USGS Pacific Marine Geology Multichannel Processing Center in Menlo Park, California, in the sequence: editing-demultiplexing, velocity analysis, CDP stacking, deconvolution-filtering, and plotting on an electrostatic plotter. Plate 1 is a trackline chart showing shotpoint navigation.

  9. Operational experience in underwater photogrammetry

    NASA Astrophysics Data System (ADS)

    Leatherdale, John D.; John Turner, D.

    Underwater photogrammetry has become established as a cost-effective technique for inspection and maintenance of platforms and pipelines for the offshore oil industry. A commercial service based in Scotland operates in the North Sea, USA, Brazil, West Africa and Australia. 70 mm cameras and flash units are built for the purpose and analytical plotters and computer graphics systems are used for photogrammetric measurement and analysis of damage, corrosion, weld failures and redesign of underwater structures. Users are seeking simple, low-cost systems for photogrammetric analysis which their engineers can use themselves.

  10. Quadrifilar Helical Antenna Array for Line-of-Sight Communications Above the Ocean Surface

    DTIC Science & Technology

    2007-06-25

    placing the copper-covered sheet into a mechanical plotter and using a diamond scribe to cut the edges. [Figure residue omitted.] ...soldering of the cable to the hole and to avoid any possible radio frequency (RF) ground loops that may form. However, because it was determined that...prevent any RF ground loops that may be produced that could induce undesirable currents along the brass tube. Figure 4-9 is a closeup view of an

  11. The Shock and Vibration Bulletin: Proceedings on the Symposium on ShocK and Vibration (52nd) Held in New Orleans, Louisiana on 26-28 October 1981. Part 3. Environmental Testing and Simulation, Flight Environments.

    DTIC Science & Technology

    1982-05-01

    [Extraction-garbled passage: the recoverable content describes a shock-test instrumentation chain in which the test item's response motion passes through analog signal conditioning to a multichannel tape recorder and a shock spectrum analyzer, with output to a plotter; the exciter drive signal is then generated from the recorded signal, with care taken to avoid excessive actuator stroke.]

  12. Differentially expressed and survival-related proteins of lung adenocarcinoma with bone metastasis.

    PubMed

    Yang, Mengdi; Sun, Yi; Sun, Jing; Wang, Zhiyu; Zhou, Yiyi; Yao, Guangyu; Gu, Yifeng; Zhang, Huizhen; Zhao, Hui

    2018-04-01

    Despite recent advances in targeted and immune-based therapies, the poor prognosis of lung adenocarcinoma (LUAD) with bone metastasis (BM) remains a challenge. First, two-dimensional gel electrophoresis (2-DE) was used to detect proteins that were differentially expressed in LUAD with BM, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was used to identify these proteins. Second, the Cancer Genome Atlas (TCGA) was used to identify mutations in these differentially expressed proteins, and Kaplan-Meier plotter (KM Plotter) was used to generate survival curves for the analyzed cases. Immunohistochemistry (IHC) was used to check the expression of the proteins in 28 patients with BM and nine patients with LUAD. Lastly, the results were analyzed with respect to clinical features and patients' follow-up. We identified a number of matched proteins from 2-DE. High expression of enolase 1 (ENO1) (HR = 1.67, log-rank P = 1.9E-05), ribosomal protein lateral stalk subunit P2 (RPLP2) (HR = 1.77, log-rank P = 2.9E-06), and NME/NM23 nucleoside diphosphate kinase 2 (NME1-NME2) (HR = 2.65, log-rank P = 3.9E-15) were all significantly associated with poor survival (P < 0.05). Further, ENO1 was upregulated (P = 0.0004) and calcyphosine (CAPS1) was downregulated (P = 5.34E-07) in TCGA LUAD RNA-seq expression data. IHC revealed that prominent ENO1 staining (OR = 7.5, P = 0.034) and low levels of CAPS1 staining (OR = 0.01, P < 0.0001) were associated with BM incidence. Finally, we found that LUAD patients with high expression of ENO1 and RPLP2 had worse overall survival. This is the first study in which the genes ENO1, RPLP2, NME1-NME2, and CAPS1 have been associated with disease severity and progression in LUAD patients with BM. Thus, with this study, we have identified potential biomarkers and therapeutic targets for this disease.

  13. [Experiences with an anesthesia protocol written by computer].

    PubMed

    Karliczek, G F; Brenken, U; van den Broeke, J J; Mooi, B; de Geus, A F; Wiersma, G; Oosterhaven, S

    1988-04-01

    Since December 1983, we have used a computer system for charting and data logging in cardiac and thoracic anesthesia. These computers, designed as stand-alone units, were developed at our hospital and are based on Motorola 6809 microprocessor systems. All measurements derived from anesthetic monitoring, the ventilator, and the heart-lung machine are automatically sampled at regular intervals and stored for later data management. Laboratory results are automatically received from the hospital computer system. The user communicates with the system via a terminal and a keyboard; this also facilitates the entering of all comments, medications, infusions, and fluid losses. All data are continuously displayed on an A3-format anesthetic chart using a multi-pen, flat-bed plotter. The operation of the system has proved to be simple and needs less time than charting by hand, while the result, the display on the chart, is far clearer and more complete than any handwritten document. Up to now, 3,200 operations (corresponding to 12,500 anesthetic hours) have been documented. The failure rate of the system, defined as an interruption of the documentation for more than 30 min, is 2.1%. Further development of the system is discussed. A database for processing the stored data has been developed and is currently being tested.

  14. Autocorrelation techniques for soft photogrammetry

    NASA Astrophysics Data System (ADS)

    Yao, Wu

    In this thesis, research is carried out on image processing, image matching searching strategies, feature type and image matching, and optimal window size in image matching. For comparison, the soft photogrammetry package SoftPlotter is used. Two aerial photographs from the Iowa State University campus high flight 94 are scanned into digital format. In order to create a stereo model from them, interior orientation, single-photograph rectification, and stereo rectification are performed. Two new image matching methods, multi-method image matching (MMIM) and unsquare window image matching, are developed and compared. MMIM is used to determine the optimal window size in image matching. Twenty-four check points from four different types of ground features are used for checking the results from image matching. Comparison among these four types of ground feature shows that the methods developed here improve the speed and the precision of image matching. A process called direct transformation is described and compared with the multiple steps in image processing. The results from image processing are consistent with those from SoftPlotter. A modified LAN image header is developed and used to store information about the stereo model and image matching. A comparison is also made between cross-correlation image matching (CCIM), least difference image matching (LDIM), and least squares image matching (LSIM). The quality of image matching in relation to ground features is compared using two methods developed in this study, the coefficient surface for CCIM and the difference surface for LDIM. To reduce the amount of computation in image matching, the best-track searching algorithm, developed in this research, is used instead of the whole-range searching algorithm.
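
    Cross-correlation image matching (CCIM), one of the methods compared above, slides a template window over a search area and scores each offset with the normalized cross-correlation coefficient. A compact numpy illustration with synthetic imagery (not the thesis's implementation):

        import numpy as np

        def ncc(window, template):
            """Normalized cross-correlation of two equal-size patches."""
            w = window - window.mean()
            t = template - template.mean()
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            return (w * t).sum() / denom if denom else 0.0

        def match_template(search, template):
            """Return the (row, col) offset in `search` maximizing NCC."""
            th, tw = template.shape
            sh, sw = search.shape
            scores = np.full((sh - th + 1, sw - tw + 1), -1.0)
            for r in range(scores.shape[0]):
                for c in range(scores.shape[1]):
                    scores[r, c] = ncc(search[r:r + th, c:c + tw], template)
            return np.unravel_index(np.argmax(scores), scores.shape), scores.max()

        rng = np.random.default_rng(2)
        search = rng.random((40, 40))
        template = search[12:20, 25:33].copy()   # known true offset (12, 25)
        print(match_template(search, template))  # -> ((12, 25), ~1.0)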

  15. Van Allen Probes Science Gateway: A Centralized Data Access Point

    NASA Astrophysics Data System (ADS)

    Romeo, G.; Barnes, R. J.; Ukhorskiy, A. Y.; Sotirelis, T.; Stephens, G. K.; Kessel, R.; Potter, M.

    2015-12-01

    The Van Allen Probes Science Gateway acts as a centralized interface to the instrument Science Operation Centers (SOCs), provides mission planning tools, and hosts a number of science related activities such as the mission bibliography. Most importantly, the Gateway acts as the primary site for processing and delivering the Van Allen Probes Space Weather data to users. Over the past years, the web-site has been completely redesigned with a focus on easier navigation and improvements to existing tools such as the orbit plotter, position calculator, and magnetic footprint tool. In addition, a new data plotting facility, based on HTML5, has been added, which allows users to interactively plot Van Allen Probes science and space weather data. The user can tailor the tool to display exactly the plot they wish to see and then share this with other users via either a URL or by QR code. Various types of plots can be created, including simple time series, data plotted as a function of orbital location, and time versus L-shell, with the capability of visualizing data from both probes (A and B) on the same plot. In cooperation with all Van Allen Probes instrument SOCs, the Science Gateway will soon be able to serve higher level data products (Level 3) and to visualize them via the above-mentioned HTML5 interface. Users will also be able to create customized CDF files on the fly.

  16. HYDES: A generalized hybrid computer program for studying turbojet or turbofan engine dynamics

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.

    1974-01-01

    This report describes HYDES, a hybrid computer program capable of simulating one-spool turbojet, two-spool turbojet, or two-spool turbofan engine dynamics. HYDES is also capable of simulating two- or three-stream turbofans with or without mixing of the exhaust streams. The program is intended to reduce the time required for implementing dynamic engine simulations. HYDES was developed for running on the Lewis Research Center's Electronic Associates (EAI) 690 Hybrid Computing System and satisfies the 16384-word core-size and hybrid-interface limits of that machine. The program could be modified for running on other computing systems. The use of HYDES to simulate a single-spool turbojet and a two-spool, two-stream turbofan engine is demonstrated. The form of the required input data is shown and samples of output listings (teletype) and transient plots (x-y plotter) are provided. HYDES is shown to be capable of performing both steady-state design and off-design analyses and transient analyses.

  17. A study of real-time computer graphic display technology for aeronautical applications

    NASA Technical Reports Server (NTRS)

    Rajala, S. A.

    1981-01-01

    The development, simulation, and testing of an algorithm for anti-aliasing vector drawings are discussed. The pseudo anti-aliasing line drawing algorithm is an extension of Bresenham's algorithm for computer control of a digital plotter. The algorithm produces a series of overlapping line segments where the display intensity shifts from one segment to the other in this overlap (transition region). In this algorithm, the length of the overlap and the intensity shift are essentially constant because the transition region is an aid to the eye in integrating the segments into a single smooth line.
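
    Bresenham's algorithm, which the paper extends, steps one pixel at a time along the major axis and uses an integer error term to decide when to step along the minor axis. A minimal Python rendition of the classic form follows; the paper's overlapping-segment anti-aliasing is a modification of this core loop, not reproduced here:

        def bresenham(x0, y0, x1, y1):
            """Classic integer Bresenham line: yields the pixels on it."""
            dx, dy = abs(x1 - x0), -abs(y1 - y0)
            sx = 1 if x0 < x1 else -1
            sy = 1 if y0 < y1 else -1
            err = dx + dy
            while True:
                yield x0, y0
                if x0 == x1 and y0 == y1:
                    return
                e2 = 2 * err
                if e2 >= dy:            # step along x
                    err += dy
                    x0 += sx
                if e2 <= dx:            # step along y
                    err += dx
                    y0 += sy

        print(list(bresenham(0, 0, 6, 3)))
        # [(0, 0), (1, 1), (2, 1), (3, 2), (4, 2), (5, 3), (6, 3)]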

  18. Extension of a simplified computer program for analysis of solid-propellant rocket motors

    NASA Technical Reports Server (NTRS)

    Sforzini, R. H.

    1973-01-01

    A research project to develop a computer program for the preliminary design and performance analysis of solid propellant rocket engines is discussed. The following capabilities are included as computer program options: (1) treatment of wagon wheel cross sectional propellant configurations alone or in combination with circular perforated grains, (2) calculation of ignition transients with the igniter treated as a small rocket engine, (3) representation of spherical circular perforated grain ends as an alternative to the conical end surface approximation used in the original program, and (4) graphical presentation of program results using a digital plotter.

  19. Biomedical microfluidic devices by using low-cost fabrication techniques: A review.

    PubMed

    Faustino, Vera; Catarino, Susana O; Lima, Rui; Minas, Graça

    2016-07-26

    One of the most popular methods of fabricating biomedical microfluidic devices is soft lithography. However, fabrication of the moulds used to produce microfluidic devices, such as SU-8 moulds, usually requires a cleanroom environment that can be quite costly. Therefore, many efforts have been made to develop low-cost alternatives for the fabrication of microstructures that avoid the use of cleanroom facilities. Recently, low-cost techniques that require no cleanroom facilities and achieve aspect ratios greater than 20 have been gaining popularity in the biomedical research community for fabricating such SU-8 moulds. In these techniques, ultraviolet (UV) exposure equipment commonly used in the printed circuit board (PCB) industry replaces the more expensive and less available mask aligner that has been used over the last 15 years for SU-8 patterning. Alternatively, non-lithographic low-cost techniques, owing to their suitability for large-scale production, have attracted the interest of the industrial and research communities in developing simple, rapid, and low-cost microfluidic structures. These alternative techniques include print-and-peel (PAP) methods, laserjet, solid ink, cutting plotters, and micromilling, which use equipment available in almost all laboratories and offices. An example is the xurography technique, which uses a cutting plotter machine and adhesive vinyl films to generate the master moulds used to fabricate microfluidic channels. In this review, we present a selection of the most recent lithographic and non-lithographic low-cost techniques for fabricating microfluidic structures, with a focus on the features and limitations of each technique. Only microfabrication methods that do not require a cleanroom are considered. Additionally, potential applications of these microfluidic devices in biomedical engineering are presented with some illustrative examples.

20. LION4; LION; three-dimensional temperature distribution program. [CDC6600,7600; UNIVAC1108; IBM360,370; FORTRAN IV and ASCENT (CDC6600,7600), FORTRAN IV (UNIVAC1108A,B and IBM360,370)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binney, E.J.

    LION4 is a computer program for calculating one-, two-, or three-dimensional transient and steady-state temperature distributions in reactor and reactor plant components. It is used primarily for thermal-structural analyses. It utilizes finite difference techniques with first-order forward difference integration and is capable of handling a wide variety of bounding conditions. Heat transfer situations accommodated include forced and free convection in both reduced and fully-automated temperature-dependent forms, coolant flow effects, a limited thermal radiation capability, a stationary or stagnant fluid gap, a dual-dependency (temperature difference and temperature level) heat transfer, an alternative heat transfer mode comparison and selection facility combined with a heat flux direction sensor, and any form of time-dependent boundary temperatures. The program, which handles time- and space-dependent internal heat generation, can also provide temperature-dependent material properties with limited non-isotropic properties. User-oriented capabilities available include temperature means with various weightings and a complete heat flow rate surveillance system. Machine versions: CDC6600,7600; UNIVAC1108; IBM360,370. Languages: FORTRAN IV and ASCENT (CDC6600,7600), FORTRAN IV (UNIVAC1108A,B and IBM360,370). Operating systems: SCOPE (CDC6600,7600), EXEC8 (UNIVAC1108A,B), OS/360,370 (IBM360,370). The CDC6600 version plotter routine LAPL4 is used to produce the input required by the associated CalComp plotter for graphical output. The IBM360 version requires 350K for execution and one additional input/output unit besides the standard units.

  1. An interactive program to display user-generated or file-based maps on a personal computer monitor

    USGS Publications Warehouse

    Langer, W.H.; Stephens, R.W.

    1987-01-01

    PC MAP-MAKER is an Advanced BASIC program written to provide users of IBM XT, IBM AT, and compatible computers with a straightforward, flexible method of displaying geographical data on a color or monochrome PC (personal computer) monitor. Data can be political boundaries such as State and county boundaries; natural curvilinear features such as rivers, drainage areas, and geological contacts; and points such as well locations and mineral localities. Essentially any point defined by a latitude and longitude, and any line defined by a series of latitude and longitude values, can be displayed using the program. PC MAP-MAKER allows users to view tabular data from U.S. Geological Survey files such as WATSTORE (National Water Data Storage and Retrieval System) in map format in much less time than sending the data to a line plotter requires. The screen image can be saved to disk for recall at a later date, and hard copies can be printed with a dot matrix printer. The program is user-friendly, using menus and prompts to guide user input. It is fully documented and structured to allow users to tailor the program to their specific needs. The documentation includes a tutorial designed to introduce users to the capabilities of the program, using the State of Colorado as a demonstration map area. (Author's abstract)

  2. DIRT: The Dust InfraRed Toolbox

    NASA Astrophysics Data System (ADS)

    Pound, M. W.; Wolfire, M. G.; Mundy, L. G.; Teuben, P. J.; Lord, S.

    We present DIRT, a Java applet geared toward modeling a variety of processes in envelopes of young and evolved stars. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. The computing cluster for the database is described in the accompanying paper by Teuben et al. (2000). A typical user query will return about 50-100 models, which the user can then interactively filter as a function of 8 model parameters (e.g., extinction, size, flux, luminosity). A flexible, multi-dimensional plotter (Figure 1) allows users to view the models, rotate them, tag specific parameters with color or symbol size, and probe individual model points. For any given model, auxiliary plots such as dust grain properties, radial intensity profiles, and the flux as a function of wavelength and beamsize can be viewed. The user can fit observed data to several models simultaneously and see the results of the fit; the best fit is automatically selected for plotting. The URL for this project is http://dustem.astro.umd.edu.

  3. Engineering studies of vectorcardiographs in blood pressure measuring systems, appendix 1

    NASA Technical Reports Server (NTRS)

    Mark, R. G.

    1975-01-01

    A small, portable, relatively inexpensive computer system was developed for on-line use in clinical or laboratory situations. The system features an integrated hardware-software package that permits use of all peripherals, such as analog-to-digital converter, oscilloscope, plotter, digital bus, with an interpreter constructed around the BASIC programming language. The system is conceptually similar to the LINC system developed in 1962, but is more compact and powerful due to intervening advances in integrated circuit technology. A description of the hardware of the system was given. A reference manual, user manual, and programming guides were also presented. Finally, a stereo display system for vectorcardiograms was described.

  4. A high precision ultrasonic system for vibration measurements

    NASA Astrophysics Data System (ADS)

    Young, M. S.; Li, Y. C.

    1992-11-01

    A microcomputer-aided ultrasonic system that can be used to measure the vibratory displacements of an object is presented. A pair of low cost 40-kHz ultrasonic transducers is used to transmit ultrasound toward an object and receive the ultrasound reflected from the object. The relative motion of the object modulates the phase angle difference between the transmitted and received ultrasound signals. A single-chip microcomputer-based phase detector was designed to record and analyze the phase shift information which is then sent to a PC-AT microcomputer for processing. We have developed an ingenious method to reconstruct the relative motion of an object from the acquired data of the phase difference changes. A digital-plotter-based experiment was also designed for testing the performance of the whole system. The measured accuracy of the system in the reported experiments is within ±0.4 mm and the theoretical maximal measurable speed of the object is 89.6 cm/s. The main advantages of this ultrasonic vibration measurement system are high resolution, low cost, noncontact measurement, and easy installation.
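
    The displacement reconstruction rests on a simple relation: because the beam travels to the object and back, a round trip of 2d changes the phase by one full cycle per half wavelength of motion, so d = phi * lambda / (4*pi). A minimal sketch, assuming 40 kHz in air at roughly 20 C (the sample phases are invented):

        # Convert an unwrapped phase-shift record to displacement for a
        # 40-kHz reflected ultrasound beam; numbers are illustrative.
        import numpy as np

        c, f = 343.0, 40_000.0   # speed of sound (m/s, assumed) and frequency
        lam = c / f              # wavelength, ~8.6 mm

        def displacement_mm(phase_rad):
            """Unwrapped phase shift (radians) -> displacement (mm)."""
            return phase_rad * lam / (4.0 * np.pi) * 1000.0

        phases = np.unwrap(np.array([0.0, 1.2, 2.9, -2.8, -1.0]))
        print(displacement_mm(phases))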

  5. The Mars Analysis Correction Data Assimilation (MACDA): A reference atmospheric reanalysis

    NASA Astrophysics Data System (ADS)

    Montabone, Luca; Read, Peter; Lewis, Stephen; Steele, Liam; Holmes, James; Valeanu, Alexandru

    2016-07-01

    The Mars Analysis Correction Data Assimilation (MACDA) dataset version 1.0 contains the reanalysis of fundamental atmospheric and surface variables for the planet Mars covering a period of about three Martian years (late MY 24 to early MY 27). This has been produced by data assimilation of retrieved thermal profiles and column dust optical depths from NASA's Mars Global Surveyor/Thermal Emission Spectrometer (MGS/TES), which have been assimilated into a Mars global climate model (MGCM) using the Analysis Correction scheme developed at the UK Meteorological Office. The MACDA v1.0 reanalysis is publicly available, and the NetCDF files can be downloaded from the archive at the Centre for Environmental Data Analysis/British Atmospheric Data Centre (CEDA/BADC). The variables included in the dataset can be visualised using an ad-hoc graphical user interface (the "MACDA Plotter") at http://macdap.physics.ox.ac.uk/. MACDA is an ongoing collaborative project, and work is currently being undertaken to produce version 2.0 of the Mars atmospheric reanalysis. One of the key improvements is the extension of the reanalysis period to nine Martian years (MY 24 through MY 32), with the assimilation of NASA's Mars Reconnaissance Orbiter/Mars Climate Sounder (MRO/MCS) retrievals of thermal and dust opacity profiles. MACDA 2.0 is also going to be based on an improved version of the underlying MGCM and an updated scheme to fully assimilate (radiatively active) tracers, such as dust and water ice.
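
    Since the v1.0 files are distributed as NetCDF, a minimal hedged sketch of inspecting one with the netCDF4 Python library follows; the file name and the variable name "temp" are assumptions for illustration, not documented identifiers of the dataset:

        # Open a (hypothetical) MACDA v1.0 NetCDF file and list its variables;
        # "temp" is an assumed name for the temperature field.
        from netCDF4 import Dataset

        with Dataset("macda_v1_example.nc") as nc:
            print(list(nc.variables))          # discover the actual names
            temp = nc.variables["temp"][:]     # assumed temperature variable
            print(temp.shape)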

  6. GASPLOT - A computer graphics program that draws a variety of thermophysical property charts

    NASA Technical Reports Server (NTRS)

    Trivisonno, R. J.; Hendricks, R. C.

    1977-01-01

    A FORTRAN V computer program, written for the UNIVAC 1100 series, is used to draw a variety of precision thermophysical property charts on the Calcomp plotter. In addition to the program (GASPLOT), which requires 15,160 (decimal) storage locations, a thermophysical properties routine is needed to produce plots. The program is designed so that any two of the state variables, the derived variables, or the transport variables may be plotted as the ordinate-abscissa pair with as many as five parametric variables. The parameters may be temperature, pressure, density, enthalpy, and entropy. Each parameter may have as many as 49 values, and the range of the variables is limited only by the thermophysical properties routine.

  7. Computer package for the design and optimization of absorption air conditioning system operated by solar energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sofrata, H.; Khoshaim, B.; Megahed, M.

    1980-12-01

    In this paper a computer package for the design and optimization of the simple Li-Br absorption air conditioning system, operated by solar energy, is developed in order to study its performance. This was necessary, as a first step, before carrying out any computations regarding the dual system (1-3). The computer package can examine any parameter which may control the system, namely the generator, evaporator, condenser, and absorber temperatures and the pumping factor. The output may be tabulated and also fed to the graph plotter. The flow chart of the programme is explained in a straightforward way and a typical example is included.

  8. Nova Centauri 2013 = PNV J13544700-5909080

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2013-12-01

    Announces the discovery of V1369 Cen = Nova Cen 2013 = PNV J13544700-5909080 by John Seach (Chatsworth Island, NSW, Australia) at unfiltered magnitude 5.5 on 2013 December 02.692 UT. Low-resolution spectra obtained by Locke on Dec. 03.3776 UT and by Kaufman on Dec. 03.621 UT show strong Hα and Hβ emission lines, indicating the object is a nova. Announced on IAU CBAT Central Bureau Electronic Telegram 3732 (Daniel W. E. Green, ed.). Finder charts with sequences may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and observations.

  9. Modern Processing Capabilities of Analog Data from Documentation of the Great Omayyad Mosque in Aleppo, Syria, Damaged in Civil War

    NASA Astrophysics Data System (ADS)

    Pavelka, K.; Šedina, J.; Raeva, P.; Hůlková, M.

    2017-08-01

    In 1999, a large project for the documentation of the Great Omayyad mosque in Aleppo, Syria was conducted under UNESCO. At the end of the last century, analogue cameras such as the Zeiss UMK and the RolleiMetric system were still in use; digital cameras and automatic digital data processing were just starting to rise, and laser scanning was not yet relevant. In this situation, photogrammetric measurement relied on stereo technology for complicated objects and on single-image technology for creating photoplans. Hundreds of photogrammetric images were taken. Because data processing was carried out on digital stereo plotters or workstations, all analogue photos had to be converted to digital form using a photogrammetric scanner. The outputs were adequate for the end of the last century. Now, 19 years later, the photogrammetric materials still exist, but the technology and processing are completely different, and the original measurement is historical and quite obsolete. It was therefore decided to explore the possibilities of reprocessing the historical materials. Why? In the last few years there has been civil war in Syria, and the above-mentioned monument was severely damaged; the existing historical materials therefore provide a unique opportunity for possible future reconstruction. This paper describes the completion of the existing materials, their evaluation, and the possibilities of new processing with today's technologies.

  10. SN 2017eaw in NGC 6946 (PSN J20344424+6011359)

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-05-01

    AAVSO Alert Notice 577 announces and reports on the discovery of SN 2017eaw (PSN J20344424+6011359) in NGC 6946 by Patrick Wiggins (Tooele, UT) at unfiltered CCD magnitude 12.8 on 2017 May 14.2383 UT. Spectra indicating the object is a pre-maximum Type II supernova were reported by Y.-C. Cheng et al. (ATel #10374), by D. Xiang et al. (ATel #10376), and by L. Tomasella et al. on behalf of the NUTS collaboration (ATel #10377). Information on the probable progenitor red supergiant was given by R. Khan (ATel #10373), and by S. Van Dyk et al., who, using HST ACS/WFC archival data, reported the probable progenitor to be a red supergiant located approximately 6.4 WFC pixels (about 0.3 arcsec) to the southwest of the discovery position (ATel #10378). A. Kong and K. Li reported that Chandra archival X-ray data (2001-2012) show no X-ray emission correlated with the supernova position; Swift TOO observations from May 14 (two observations 8 hours apart) show X-ray emission present and increasing (ATel #10380). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  11. Database-driven web interface automating gyrokinetic simulations for validation

    NASA Astrophysics Data System (ADS)

    Ernst, D. R.

    2010-11-01

    We are developing a web interface to connect plasma microturbulence simulation codes with experimental data. The website automates the preparation of gyrokinetic simulations utilizing plasma profile and magnetic equilibrium data from TRANSP analysis of experiments, read from MDSplus over the internet. This database-driven tool saves user sessions, allowing searches of previous simulations, which can be restored to repeat the same analysis for a new discharge. The website includes a multi-tab, multi-frame, publication-quality Java plotter, Webgraph, developed as part of this project. Input files can be uploaded as templates and edited with context-sensitive help. The website creates inputs for GS2 and GYRO using a well-tested and verified back-end, in use for several years for the GS2 code [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)]. A centralized web site has the advantage that users receive bug fixes instantaneously, while avoiding the duplicated effort of local compilations. Possible extensions to the database to manage run outputs, toward prototyping for the Fusion Simulation Project, are envisioned. Much of the web development utilized support from the DoE National Undergraduate Fellowship program [e.g., A. Suarez and D. R. Ernst, http://meetings.aps.org/link/BAPS.2005.DPP.GP1.57].

  12. LONGLIB - A GRAPHICS LIBRARY

    NASA Technical Reports Server (NTRS)

    Long, D.

    1994-01-01

    This library is a set of subroutines designed for vector plotting to CRT's, plotters, dot matrix, and laser printers. LONGLIB subroutines are invoked by program calls similar to standard CALCOMP routines. In addition to the basic plotting routines, LONGLIB contains an extensive set of routines to allow viewport clipping, extended character sets, graphic input, shading, polar plots, and 3-D plotting with or without hidden line removal. LONGLIB capabilities include surface plots, contours, histograms, logarithm axes, world maps, and seismic plots. LONGLIB includes master subroutines, which are self-contained series of commonly used individual subroutines. When invoked, the master routine will initialize the plotting package, and will plot multiple curves, scatter plots, log plots, 3-D plots, etc. and then close the plot package, all with a single call. Supported devices include VT100 equipped with Selanar GR100 or GR100+ boards, VT125s, VT240s, VT220 equipped with Selanar SG220, Tektronix 4010/4014 or 4107/4109 and compatibles, and Graphon GO-235 terminals. Dot matrix printer output is available by using the provided raster scan conversion routines for DEC LA50, Printronix printers, and high or low resolution Trilog printers. Other output devices include QMS laser printers, Postscript compatible laser printers, and HPGL compatible plotters. The LONGLIB package includes the graphics library source code, an on-line help library, scan converter and meta file conversion programs, and command files for installing, creating, and testing the library. The latest version, 5.0, is significantly enhanced and has been made more portable. Also, the new version's meta file format has been changed and is incompatible with previous versions. A conversion utility is included to port the old meta files to the new format. Color terminal plotting has been incorporated. LONGLIB is written in FORTRAN 77 for batch or interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985, and last updated in 1988.
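
    The "master subroutine" idea (one call that initializes the plot package, draws several curves, and closes it) translates naturally to other plotting stacks. Here is a minimal sketch in Python, with matplotlib standing in for LONGLIB's vector devices; the function name and defaults are invented for illustration:

        # One self-contained call: open, plot every (x, y, label) curve,
        # label the axes, and close, in the spirit of a master routine.
        import numpy as np
        import matplotlib.pyplot as plt

        def master_plot(curves, title="", logy=False, outfile="plot.png"):
            fig, ax = plt.subplots()
            for x, y, label in curves:
                ax.plot(x, y, label=label)
            if logy:
                ax.set_yscale("log")      # log axis option, as in LONGLIB
            ax.set_title(title)
            ax.legend()
            fig.savefig(outfile)
            plt.close(fig)

        x = np.linspace(0.1, 10, 200)
        master_plot([(x, np.sin(x), "sin"), (x, np.exp(-x), "decay")],
                    title="two curves, one call")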

  13. Vasodilator-Stimulated Phosphoprotein (VASP) depletion from breast cancer MDA-MB-231 cells inhibits tumor spheroid invasion through downregulation of Migfilin, β-catenin and urokinase-plasminogen activator (uPA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gkretsi, Vasiliki; Stylianou, Andreas; Stylianopoulos, Triantafyllos, E-mail: tstylian@ucy.ac.cy

    A hallmark of cancer cells is their ability to invade surrounding tissues and form metastases. Cell-extracellular matrix (ECM)-adhesion proteins are crucial in metastasis, connecting tumor ECM with the actin cytoskeleton, thus enabling cells to respond to mechanical cues. Vasodilator-stimulated phosphoprotein (VASP) is an actin-polymerization regulator which interacts with the cell-ECM adhesion protein Migfilin and regulates cell migration. We compared VASP expression in MCF-7 and MDA-MB-231 breast cancer (BC) cells and found that the more invasive MDA-MB-231 cells overexpress VASP. We then utilized a 3-dimensional (3D) approach to study metastasis in MDA-MB-231 cells using a system that considers mechanical forces exerted by the ECM. We prepared 3D collagen I gels of increasing concentration, imaged them by atomic force microscopy, and used them to either embed cells or tumor spheroids, in the presence or absence of VASP. We show, for the first time, that VASP silencing downregulated Migfilin, β-catenin and urokinase plasminogen activator both in 2D and 3D, suggesting a matrix-independent mechanism. Tumor spheroids lacking VASP demonstrated impaired invasion, indicating VASP's involvement in metastasis, which was corroborated by the Kaplan-Meier plotter showing high VASP expression to be associated with poor remission-free survival in lymph node-positive BC patients. Hence, VASP may be a novel BC metastasis biomarker. - Highlights: • More invasive MDA-MB-231 overexpress VASP compared to MCF-7 breast cancer cells. • We prepared 3D collagen I gels of increasing concentration and characterized them. • VASP silencing downregulated Migfilin, β-catenin and uPA both in 2D and 3D culture. • Tumor spheroids lacking VASP demonstrated impaired invasion. • Kaplan-Meier plotter shows association of high VASP expression with poor survival.

  14. A graphics-oriented personal computer-based microscope charting system for neuroanatomical and neurochemical studies.

    PubMed

    Tourtellotte, W G; Lawrence, D T; Getting, P A; Van Hoesen, G W

    1989-07-01

    This report describes a computerized microscope charting system based on the IBM personal computer or compatible. Stepping motors are used to control the movement of the microscope stage and to encode its position by hand manipulation of a joystick. Tissue section contours and the location of cells labeled with various compounds are stored by the computer, plotted at any magnification and manipulated into composites created from several charted sections. The system has many advantages: (1) it is based on an industry standardized computer that is affordable and familiar; (2) compact and commercially available stepping motor microprocessors control the stage movement. These controllers increase reliability, simplify implementation, and increase efficiency by relieving the computer of time consuming control tasks; (3) the system has an interactive graphics interface allowing the operator to view the image during data collection. Regions of the graphics display can be enlarged during the charting process to provide higher resolution and increased accuracy; (4) finally, the digitized data are stored at 0.5 micron resolution and can be routed directly to a multi-pen plotter or exported to a computer-aided design (CAD) program to generate a publication-quality montage composed of several computerized chartings. The system provides a useful tool for the acquisition and qualitative analysis of data representing stained cells or chemical markers in tissue. The modular design, together with data storage at high resolution, allows for potential analytical enhancements involving planimetric, stereologic and 3-D serial section reconstruction.

  15. Writing filter processes for the SAGA editor, appendix G

    NASA Technical Reports Server (NTRS)

    Kirslis, Peter A.

    1985-01-01

    The SAGA editor provides a mechanism by which separate processes can be invoked during an editing session to traverse portions of the parse tree being edited. These processes, termed filter processes, read, analyze, and possibly transform the parse tree, returning the result to the editor. By defining new commands with the editor's user-defined command facility, which invoke filter processes, authors of filters can provide complex operations as simple commands. A tree plotter, a pretty printer, and a Pascal tree transformation program have already been written using this facility. The filter processes are introduced, the parse tree structure is described, and the library interface made available to the programmer is presented. Also discussed is how to compile and run filter processes. Examples are presented to illustrate aspects of each of these areas.

  16. Observations of V694 Mon (MWC 560) requested for Chandra campaign

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-02-01

    Dr. Jeno Sokoloski (Columbia University) and Mr. Adrian Lucy (graduate student, Columbia University) have requested AAVSO observations of the jet-driving symbiotic star V694 Mon (MWC 560), which is in outburst, in support of upcoming Chandra observations to investigate the state of the inner accretion disk during this outburst. Beginning now and continuing through April 2016, Sokoloski writes, "multi-band photometry (UBVRI, but especially UBV), spectroscopy, and minute-time-resolution light curves of the optical flickering are requested. Series of exposures in B or V will be very interesting." Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  17. The effect of support flexibility and damping on the dynamic response of a single mass flexible rotor in elastic bearings

    NASA Technical Reports Server (NTRS)

    Kirk, R. G.; Gunter, E. J.

    1972-01-01

    The dynamic unbalance response and transient motion of the single mass Jeffcott rotor in elastic bearings mounted on damped, flexible supports are discussed. A steady state analysis of the shaft and the bearing housing motion was made by assuming synchronous precession of the system. The conditions under which the support system would act as a dynamic vibration absorber at the rotor critical speed were studied. Plots of the rotor and support amplitudes, phase angles, and forces transmitted were evaluated by the computer and the performance curves were plotted by an automatic plotter unit. Curves are presented on the optimization of the support housing characteristics to attenuate the rotor synchronous unbalance response.
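
    For orientation, the classical rigid-support limit of the Jeffcott rotor's synchronous unbalance response has a closed form, X/e = r^2 / sqrt((1 - r^2)^2 + (2*zeta*r)^2) with r the speed ratio w/wn. The sketch below evaluates it; the damped flexible support studied in the paper adds further degrees of freedom, and the damping ratios here are illustrative:

        # Dimensionless whirl amplitude X/e vs. speed ratio for the
        # single-mass Jeffcott rotor, rigid-support limit.
        import numpy as np

        def unbalance_amplitude(speed_ratio, zeta):
            r = np.asarray(speed_ratio)
            return r**2 / np.sqrt((1 - r**2) ** 2 + (2 * zeta * r) ** 2)

        r = np.linspace(0.0, 3.0, 7)
        for zeta in (0.05, 0.25):          # assumed damping ratios
            print(zeta, np.round(unbalance_amplitude(r, zeta), 2))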

  18. LiPD and CSciBox: A Case Study in Why Data Standards are Important for Paleoscience

    NASA Astrophysics Data System (ADS)

    Weiss, I.; Bradley, E.; McKay, N.; Emile-Geay, J.; de Vesine, L. R.; Anderson, K. A.; White, J. W. C.; Marchitto, T. M., Jr.

    2016-12-01

    CSciBox [1] is an integrated software system that helps geoscientists build and evaluate age models. Its user chooses from a number of built-in analysis tools, composing them into an analysis workflow and applying it to paleoclimate proxy datasets. CSciBox employs modern database technology to store both the data and the analysis results in an easily accessible and searchable form, and offers the user access to the computational toolbox, the data, and the results via a graphical user interface and a sophisticated plotter. Standards are a staple of modern life, and underlie any form of automation. Without data standards, it is difficult, if not impossible, to construct effective computer tools for paleoscience analysis. The LiPD (Linked Paleo Data) framework [2] enables the storage of both data and metadata in systematic, meaningful, machine-readable ways. LiPD has been a primary enabler of CSciBox's goals of usability, interoperability, and reproducibility. Building LiPD capabilities into CSciBox's importer, for instance, eliminated the need to ask the user about file formats, variable names, relationships between columns in the input file, etc. Building LiPD capabilities into the exporter facilitated the storage of complete details about the input data (provenance, preprocessing steps, etc.) as well as full descriptions of any analyses that were performed using the CSciBox tool, along with citations to appropriate references. This comprehensive collection of data and metadata, which is all linked together in a semantically meaningful, machine-readable way, not only completely documents the analyses and makes them reproducible. It also enables interoperability with any other software system that employs the LiPD standard. [1] www.cs.colorado.edu/~lizb/cscience.html [2] McKay & Emile-Geay, Climate of the Past 12:1093 (2016)
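
    The idea of data and metadata travelling together in one machine-readable record can be sketched as below; the field names are illustrative of the spirit of LiPD, not the actual LiPD schema:

        # Write a self-describing record: measurements plus provenance,
        # processing steps, and citations in one JSON file. Hypothetical names.
        import json

        record = {
            "dataSetName": "ExampleLake.d18O",
            "archiveType": "lake sediment",
            "paleoData": {"depth_cm": [0, 10, 20],
                          "d18O_permil": [-2.1, -2.4, -2.0]},
            "provenance": {
                "source": "original investigator files",
                "processing": ["age model built in CSciBox",
                               "outliers flagged"],
                "citations": ["McKay & Emile-Geay, Climate of the Past "
                              "12:1093 (2016)"],
            },
        }
        with open("example_record.json", "w") as fh:
            json.dump(record, fh, indent=2)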

  19. WORM - WINDOWED OBSERVATION OF RELATIVE MOTION

    NASA Technical Reports Server (NTRS)

    Bauer, F.

    1994-01-01

    The Windowed Observation of Relative Motion, WORM, program is primarily intended for the generation of simple X-Y plots from data created by other programs. It allows the user to label, zoom, and change the scale of various plots. Three dimensional contour and line plots are provided, although with more limited capabilities. The input data can be in binary or ASCII format, although all data must be in the same format. A great deal of control over the details of the plot is provided, such as gridding, size of tick marks, colors, log/semilog capability, time tagging, and multiple and phase plane plots. Many color and monochrome graphics terminals and hard copy printer/plotters are supported. The WORM executive commands, menu selections and macro files can be used to develop plots and tabular data, query the WORM Help library, retrieve data from input files, and invoke VAX DCL commands. WORM generated plots are displayed on local graphics terminals and can be copied using standard hard copy capabilities. Some of the graphics features of WORM include: zooming and dezooming various portions of the plot; plot documentation including curve labeling and function listing; multiple curves on the same plot; windowing of multiple plots and insets of the same plot; displaying a specific point on a curve; and spinning the curve left, right, up, and down. WORM is written in PASCAL for interactive execution and has been implemented on a DEC VAX computer operating under VMS 4.7 with a virtual memory requirement of approximately 392K of 8 bit bytes. It uses the QPLOT device independent graphics library included with WORM. It was developed in 1988.

  20. Rework of the ERA software system: ERA-8

    NASA Astrophysics Data System (ADS)

    Pavlov, D.; Skripnichenko, V.

    2015-08-01

    The software system that has been powering many products of the IAA for decades has undergone a major rework. ERA has capabilities for: processing tables of observations of different kinds, fitting parameters to observations, and integrating equations of motion of the Solar system bodies. ERA comprises a domain-specific language called SLON, tailored for astronomical tasks. SLON provides a convenient syntax for reductions of observations, choosing IAU standards to use, and applying rules for filtering observations or selecting parameters for fitting. Also, ERA includes a table editor and a graph plotter. ERA-8 has a number of improvements over previous versions, such as: integration of the Solar system and TT-TDB with an arbitrary number of asteroids; the option to use different ephemerides (including DE and INPOP); and an integrator with 80-bit floating point. The code of ERA-8 has been completely rewritten from Pascal to C (for numerical computations) and Racket (for running SLON programs and managing data). ERA-8 is portable across major operating systems. The format of tables in ERA-8 is based on SQLite. The SPICE format has been chosen as the main format for ephemerides in ERA-8.

  1. C-MOS bulk metal design handbook [LSI standard cell circuits]

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1977-01-01

    The LSI standard cell array technique was used in the fabrication of more than 20 CMOS custom arrays. This technique consists of a series of computer programs and design automation techniques referred to as the Computer Aided Design And Test (CADAT) system that automatically translate a partitioned logic diagram into a set of instructions for driving an automatic plotter which generates precision mask artwork for complex LSI arrays of CMOS standard cells. The standard cell concept for producing LSI arrays begins with the design, layout, and validation of a group of custom circuits called standard cells. Once validated, these cells are given identification or pattern numbers and are permanently stored. To use one of these cells in a logic design, the user calls for the desired cell by pattern number. The Place, Route in Two Dimensions (PR2D) computer program is then used to automatically generate the metallization and/or tunnels to interconnect the standard cells into the required function. Data sheets that describe the function, artwork, and performance of each of the standard cells, the general procedure for implementation of logic in CMOS standard cells, and additional detailed design information are presented.

  2. FORTRAN 4 computer program for calculating critical speeds of rotating shafts

    NASA Technical Reports Server (NTRS)

    Trivisonno, R. J.

    1973-01-01

    A FORTRAN 4 computer program, written for the IBM DCS 7094/7044 computer, that calculates the critical speeds of rotating shafts is described. The shaft may include bearings, couplings, extra masses (nonshaft mass), and disks for the gyroscopic effect. Shear deflection is also taken into account, and provision is made in the program for sections of the shaft that are tapered. The boundary conditions at the ends of the shaft can be fixed (deflection and slope equal to zero) or free (shear and moment equal to zero). The fixed end condition enables the program to calculate the natural frequencies of cantilever beams. Instead of using the lumped-parameter method, the program uses continuous integration of the differential equations of beam flexure across different shaft sections. The advantages of this method over the usual lumped-parameter method are less data preparation and better approximation of the distribution of the mass of the shaft. A main feature of the program is the nature of the output. The Calcomp plotter is used to produce a drawing of the shaft with superimposed deflection curves at the critical speeds, together with all pertinent information related to the shaft.

  3. FAST User Guide

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Clucas, Jean; McCabe, R. Kevin; Plessel, Todd; Potter, R.; Cooper, D. M. (Technical Monitor)

    1994-01-01

    The Flow Analysis Software Toolkit, FAST, is a software environment for visualizing data. FAST is a collection of separate programs (modules) that run simultaneously and allow the user to examine the results of numerical and experimental simulations. The user can load data files, perform calculations on the data, visualize the results of these calculations, construct scenes of 3D graphical objects, and plot, animate and record the scenes. Computational Fluid Dynamics (CFD) visualization is the primary intended use of FAST, but FAST can also assist in the analysis of other types of data. FAST combines the capabilities of such programs as PLOT3D, RIP, SURF, and GAS into one environment with modules that share data. Sharing data between modules eliminates the drudgery of transferring data between programs. All the modules in the FAST environment have a consistent, highly interactive graphical user interface. Most commands are entered by pointing and clicking. The modular construction of FAST makes it flexible and extensible. The environment can be custom configured and new modules can be developed and added as needed. The following modules have been developed for FAST: VIEWER, FILE IO, CALCULATOR, SURFER, TOPOLOGY, PLOTTER, TITLER, TRACER, ARCGRAPH, GQ, SURFERU, SHOTET, and ISOLEVU. A utility is also included to make the inclusion of user defined modules in the FAST environment easy. The VIEWER module is the central control for the FAST environment. From VIEWER, the user can change object attributes, interactively position objects in three-dimensional space, define and save scenes, create animations, spawn new FAST modules, add additional view windows, and save and execute command scripts. The FAST User Guide uses text and FAST MAPS (graphical representations of the entire user interface) to guide the user through the use of FAST. Chapters include: Maps, Overview, Tips, Getting Started Tutorial, a separate chapter for each module, file formats, and system administration.

  4. Assessment of Effectiveness of Geologic Isolation Systems. Variable thickness transient ground-water flow model. Volume 2. Users' manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisenauer, A.E.

    1979-12-01

    A system of computer codes developed and adapted to aid in the preparation and evaluation of ground-water model input, together with auxiliary programs for use in modeling major ground-water aquifers, is described. The ground-water model is interactive, rather than a batch-type model. Interactive models have been demonstrated to be superior to batch models in the ground-water field. For example, looking through reams of numerical lists can be avoided with the much superior graphical output forms or summary-type numerical output. The system of computer codes permits the flexibility to develop rapidly the model-required data files from engineering data and geologic maps, as well as to manipulate efficiently the voluminous data generated. Central to these codes is the Ground-water Model, which, given the boundary value problem, produces either the steady-state or transient time plane solutions. A sizeable part of the codes available provides rapid evaluation of the results. Besides contouring the new water potentials, the model allows graphical review of streamlines of flow, travel times, and detailed comparisons of surfaces or points at designated wells. Use of the graphics scopes provides immediate, but temporary, displays which can be used for evaluation of input and output and which can be reproduced easily on hard copy devices, such as a line printer, Calcomp plotter, and image photographs.

  5. Microcomputer aided tracking (MCAT)

    NASA Astrophysics Data System (ADS)

    Mays, A. B.; Cross, D. C.; Walters, J. L.

    1983-07-01

    The goal of the MCAT project was to investigate the effectiveness of operator-initiated tracks followed by automatic tracking. Adding this capability to a display was intended to relieve operator overload and fatigue which result when the operator is limited to grease-pencil tracking. MCAT combines several microprocessors and a microcomputer-driven PPI (Plan Position Indicator) with graphics capability. The operator is required to make the initial detection, and MCAT then performs automatic detection and tracking in a limited area centered around the detection. This approach was chosen because it is far less costly than a full-up auto detect and track approach. MCAT is intended for use in a non-NTDS (Naval Tactical Data System) environment where operator aids are minimal at best. There are approximately 200 non-NTDS ships in today's Navy. Each of these ships has a combat information center (CIC) which includes numerous PPIs (typically SPA-25s, SPA-66s, SPA-50s) and various manual means (e.g., air summary plotboards, NC-2 plotters) of producing summary plots and performing calculations (e.g., maneuvering board paper) pertinent to tracks in progress. The operator's duties are time-consuming and there are many things that could be done via computer control and graphics displays that the non-NTDS operator must now do manually. Because there is much manual information handling, accumulation of data is slow and there is a large probability of error.

  6. US Topo: Topographic Maps for the Nation

    USGS Publications Warehouse

    Hytes, Patricia L.

    2009-01-01

    US Topo is the next generation of topographic maps from the U.S. Geological Survey (USGS). Arranged in the familiar 7.5-minute quadrangle format, digital US Topo maps are designed to look and feel (and perform) like the traditional paper topographic maps for which the USGS is so well known. In contrast to paper-based maps, US Topo maps provide modern technical advantages that support faster, wider public distribution and enable basic, on-screen geographic analysis for all users. US Topo maps are available free on the Web. Each map quadrangle is constructed in GeoPDF format from key layers of geographic data (orthoimagery, roads, geographic names, topographic contours, and hydrographic features) found in The National Map. US Topo quadrangles can be printed from personal computers or plotters as complete, full-sized maps or in customized sections, in a user-desired specific format. Paper copies of the maps can also be purchased from the USGS Store. Download links and a user's guide are featured on the US Topo Web site. US Topo users can turn geographic data layers on and off as needed; they can zoom in and out to highlight specific features or see a broader area. File size for each digital 7.5-minute quadrangle, about 15-20 megabytes, is suitable for most users. Associated electronic tools for geographic analysis are available free for download.

  7. Laboratory manual: mineral X-ray diffraction data retrieval/plot computer program

    USGS Publications Warehouse

    Hauff, Phoebe L.; VanTrump, George

    1976-01-01

    The Mineral X-Ray Diffraction Data Retrieval/Plot Computer Program--XRDPLT (VanTrump and Hauff, 1976a) is used to retrieve and plot mineral X-ray diffraction data. The program operates on a file of mineral powder diffraction data (VanTrump and Hauff, 1976b) which contains two-theta or 'd' values, and intensities, chemical formula, mineral name, identification number, and mineral group code. XRDPLT is a machine-independent Fortran program which operates in time-sharing mode on a DEC System 10 computer and the Gerber plotter (Evenden, 1974). The program prompts the user to respond from a time-sharing terminal in a conversational format with the required input information. The program offers two major options: retrieval only; retrieval and plot. The first option retrieves mineral names, formulas, and groups from the file by identification number, by the mineral group code (a classification by chemistry or structure), or by searches based on the formula components. For example, it enables the user to search for minerals by major groups (e.g., feldspars, micas, amphiboles, oxides, phosphates, carbonates), by elemental composition (e.g., Fe, Cu, Al, Zn), or by a combination of these (e.g., all copper-bearing arsenates). The second option retrieves as the first, but also plots the retrieved two-theta and intensity values as diagrammatic X-ray powder patterns on mylar sheets or overlays. These plots can be made using scale combinations compatible with chart recorder diffractograms and 114.59 mm powder camera films. The overlays are then used to separate or sieve out unrelated minerals until unknowns are matched and identified.
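
    The first retrieval option amounts to sieving a record file by required formula components. A hedged sketch of that logic follows; the tiny record list is an invented stand-in for the powder-diffraction file, not its real contents or layout:

        # Retrieve all minerals whose formulas contain a required element set,
        # e.g. all copper-bearing arsenates. Records are hypothetical.
        minerals = [
            {"name": "olivenite", "elements": {"Cu", "As", "O", "H"}},
            {"name": "malachite", "elements": {"Cu", "C", "O", "H"}},
            {"name": "scorodite", "elements": {"Fe", "As", "O", "H"}},
        ]

        def retrieve(required):
            return [m["name"] for m in minerals if required <= m["elements"]]

        print(retrieve({"Cu", "As"}))   # -> ['olivenite']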

  8. Development of a High Strength Isothermally Heat-Treated Nodular Iron Road Wheel Arm

    DTIC Science & Technology

    1985-03-31

    The [...]-capacity load cell was calibrated using a Satec Universal Test System and a Hewlett-Packard X-Y plotter to record the calibration curve. (The remainder of the record is illegible OCR residue from tabulated test data.)

  9. Transparent electrodes made with ultrasonic spray coating technique for flexible heaters

    NASA Astrophysics Data System (ADS)

    Wroblewski, G.; Krzemiński, J.; Janczak, D.; Sowiński, J.; Jakubowska, M.

    2017-08-01

    Transparent electrodes are one of the basic elements of various electronic components. The paper presents preliminary results related to a novel method of ultrasonic spray coating used for the fabrication of transparent flexible electrodes. Experiments were conducted by means of a specially made laboratory setup composed of an ultrasonic spray generator and an XYZ plotter. In the first part of the paper, diverse solvents were used to determine the crucial technological parameters, such as atomization voltage and fluid flow velocity. Afterwards, a paint containing carbon nanotubes suspended in a two-solvent system was prepared and deposited on polyethylene terephthalate foil. Thickness, roughness, and electrical measurements were performed to determine the influence of the technological parameters of ultrasonic spray coating on the thickness, roughness, sheet resistance, and optical transmission of the fabricated samples.

  10. Water-based alkyl ketene dimer ink for user-friendly patterning in paper microfluidics.

    PubMed

    Hamidon, Nurul Nadiah; Hong, Yumiao; Salentijn, Gert Ij; Verpoorte, Elisabeth

    2018-02-13

    We propose the use of water-based alkyl ketene dimer (AKD) ink for fast and user-friendly patterning of paper microfluidic devices either manually or using an inexpensive XY-plotter. The ink was produced by dissolving hydrophobic AKD in chloroform and emulsifying the solution in water. The emulsification was performed in a warm water bath, which led to an increased rate of evaporation of the chloroform. Subsequent cooling led to the final product, an aqueous suspension of fine AKD particles. The effects of surfactant and AKD concentrations, emulsification procedure, and cooling approach on final ink properties are presented, along with an optimized protocol for its formulation. This hydrophobic agent was applied onto paper using a plotter pen, after which the paper was heated to allow spreading of AKD molecules and chemical bonding with cellulose. A paper surface patterned with the ink (10 g L⁻¹ AKD) yielded a contact angle of 135.6° for water. Unlike organic solvent-based solutions of AKD, this AKD ink does not require a fume hood for its use. Moreover, it is compatible with plastic patterning tools, due to the effective removal of chloroform in the production process to less than 2% of the total volume. Furthermore, this water-based ink is easy to prepare and use. Finally, the AKD ink can also be used for the fabrication of so-called selectively permeable barriers for use in paper microfluidic networks. These are barriers that stop the flow of water through paper, but are permeable to solvents with lower surface energies. We applied the AKD ink to confine and preconcentrate sample on paper, and demonstrated the use of this approach to achieve higher detection sensitivities in paper spray ionization-mass spectrometry (PSI-MS). Our patterning approach can be employed outside of the analytical lab or machine workshop for fast prototyping and small-scale production of paper-based analytical tools, for use in limited-resource labs or in the field.

  11. Automated preparation method for colloidal crystal arrays of monodisperse and binary colloid mixtures by contact printing with a pintool plotter.

    PubMed

    Burkert, Klaus; Neumann, Thomas; Wang, Jianjun; Jonas, Ulrich; Knoll, Wolfgang; Ottleben, Holger

    2007-03-13

    Photonic crystals and photonic band gap materials with periodic variation of the dielectric constant in the submicrometer range exhibit unique optical properties such as opalescence, optical stop bands, and photonic band gaps. As such, they represent attractive materials for the active elements in sensor arrays. Colloidal crystals, which are 3D gratings leading to Bragg diffraction, are one potential precursor of such optical materials. They have gained particular interest in many technological areas as a result of their specific properties and ease of fabrication. Although basic techniques for the preparation of regular patterns of colloidal crystals on structured substrates by self-assembly of mesoscopic particles are known, the efficient fabrication of colloidal crystal arrays by simple contact printing has not yet been reported. In this article, we present a spotting technique used to produce a microarray comprising up to 9600 single addressable sensor fields of colloidal crystal structures with dimensions down to 100 μm on a microfabricated substrate in different formats. Both monodisperse colloidal crystals and binary colloidal crystal systems were prepared by contact printing of polystyrene particles in aqueous suspension. The array morphology was characterized by optical light microscopy and scanning electron microscopy, which revealed regularly ordered crystalline structures for both systems. In the case of binary crystals, the influence of the concentration ratio of the large and small particles in the printing suspension on the obtained crystal structure was investigated. The optical properties of the colloidal crystal arrays were characterized by reflection spectroscopy. To examine the stop bands of the colloidal crystal arrays in a high-throughput fashion, an optical setup based on a CCD camera was realized that allowed the simultaneous readout of all of the reflection spectra of several thousand sensor fields per array in parallel. In agreement with Bragg's relation, the investigated arrays exhibited strong opalescence and stop bands in the expected wavelength range, confirming the successful formation of highly ordered colloidal crystals. Furthermore, a narrow distribution of wavelength-dependent stop bands across the sensor array was achieved, demonstrating the capability of producing highly reproducible crystal spots by the contact printing method with a pintool plotter.
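
    The Bragg relation invoked above gives the expected stop-band position for normal-incidence reflection off the (111) planes of a close-packed crystal as lambda_max = 2 * d111 * n_eff. The sketch below evaluates it for assumed polystyrene spheres in air; the particle size, refractive indices, and fill fraction are illustrative, not the paper's measured parameters:

        # Estimate the (111) stop-band wavelength of a close-packed
        # colloidal crystal at normal incidence; all inputs assumed.
        import math

        def stop_band_nm(diameter_nm, n_particle=1.59, n_medium=1.0,
                         fill=0.74):
            d111 = math.sqrt(2.0 / 3.0) * diameter_nm    # (111) spacing
            n_eff = math.sqrt(fill * n_particle**2
                              + (1 - fill) * n_medium**2)
            return 2.0 * d111 * n_eff

        print(f"{stop_band_nm(300):.0f} nm")  # ~715 nm for 300-nm spheres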

  12. KA-102 Film/EO Standoff System

    NASA Astrophysics Data System (ADS)

    Turpin, Richard T.

    1984-12-01

    The KA-102 is an in-flight selectable film or electro-optic (EO) visible reconnaissance camera with a real-time data link. The lens is a 66-in., f/4 refractor with a 4° field-of-view. The focal plane is a continuous line array of 10,240 CCD elements that operates in the pushbroom mode. In the film mode, the camera uses standard 5-in.-wide 3414 or 3412 film. The EO imagery is transmitted up to 500 n.mi. to the ground station over a 75-Mbit/sec X-band data link via a relay aircraft (see Figure 1). The camera may be controlled from the ground station via an uplink or from the cockpit control panel. The 8-ft-diameter ground tracking antenna is located on high ground and linked to the ground station via a 1-mile-long, two-way fiber optic system. In the ground station the imagery is calibrated and displayed in real time on three CRTs. Selected imagery may be stored on disk and enhanced, analyzed, and annotated in near-real time. The imagery may be enhanced and magnified in real time. Hardcopy frames may be made on 8 x 10-in. Polaroid, 35-mm film, or dry silver paper. All the received image and engineering data is recorded on a high-density tape recorder. The aircraft track is recorded on a map plotter. Ground support equipment (GSE), manuals, spares, and training are included in the system. Falcon 20 aircraft were modified on a subcontract to Dynelectron, Ft. Worth.

  13. Photogrammetry of the Viking Lander imagery

    NASA Technical Reports Server (NTRS)

    Wu, S. S. C.; Schafer, F. J.

    1982-01-01

    The problem of photogrammetric mapping which uses Viking Lander photography as its basis is solved in two ways: (1) by converting the azimuth and elevation scanning imagery to the equivalent of a frame picture, using computerized rectification; and (2) by interfacing a high-speed, general-purpose computer to the analytical plotter employed, so that all correction computations can be performed in real time during the model-orientation and map-compilation process. Both the efficiency of the Viking Lander cameras and the validity of the rectification method have been established by a series of pre-mission tests which compared the accuracy of terrestrial maps compiled by this method with maps made from aerial photographs. In addition, 1:10-scale topographic maps of Viking Lander sites 1 and 2 having a contour interval of 1.0 cm have been made to test the rectification method.
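
    The first rectification step, converting an azimuth/elevation scanning geometry to an equivalent frame picture, amounts to projecting each viewing direction onto a tangent plane one focal length in front of the camera. A hedged sketch of that geometry (angles measured from the frame's optical axis; the sample values are illustrative, and this is not the paper's full real-time correction computation):

        # Project an azimuth/elevation viewing direction onto a unit-focal-
        # length tangent plane to get equivalent frame coordinates.
        import math

        def scan_to_frame(az_deg, el_deg):
            az, el = math.radians(az_deg), math.radians(el_deg)
            x = math.tan(az)                  # horizontal frame coordinate
            y = math.tan(el) / math.cos(az)   # vertical, azimuth-corrected
            return x, y

        print(scan_to_frame(10.0, 5.0))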

  14. Nova Sco 2016 No. 2 = PNV J17225112-3158349 = ASASSN-16kd

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-09-01

    AAVSO Alert Notice 550 announces the independent discovery of Nova Sco 2016 No. 2 = ASASSN-16kd = PNV J17225112-3158349 = V1656 Sco by Shigehisa Fujikawa (Kan'onji, Kagawa, Japan) at unfiltered CCD magnitude 11.6 on 2016 September 06.481 UT; and by ASAS-SN (Stanek et al., ATel #9469) at 12.13 V on 2016 September 06.00 UT. Spectroscopy indicating that Nova Sco 2016 No. 2 is a highly reddened classical Fe II-type nova was obtained by Arai and Honda (CBET 4320); by Bohlsen (ATel #9477); by Bersier et al. (ATel #9478); and by Prieto et al. (ATel #9479). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  15. CHiCP: a web-based tool for the integrative and interactive visualization of promoter capture Hi-C datasets.

    PubMed

    Schofield, E C; Carver, T; Achuthan, P; Freire-Pritchett, P; Spivakov, M; Todd, J A; Burren, O S

    2016-08-15

    Promoter capture Hi-C (PCHi-C) allows the genome-wide interrogation of physical interactions between distal DNA regulatory elements and gene promoters in multiple tissue contexts. Visual integration of the resultant chromosome interaction maps with other sources of genomic annotations can provide insight into underlying regulatory mechanisms. We have developed Capture HiC Plotter (CHiCP), a web-based tool that allows interactive exploration of PCHi-C interaction maps and integration with both public and user-defined genomic datasets. CHiCP is freely accessible from www.chicp.org and supports most major HTML5 compliant web browsers. Full source code and installation instructions are available from http://github.com/D-I-L/django-chicp. Contact: ob219@cam.ac.uk.

  16. Mission Operations Control Room Activities during STS-2 mission

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Mission Operations Control Room (MOCR) activities during STS-2 mission. President Ronald Reagan is briefed by Dr. Christopher C. Kraft, Jr., JSC Director, who points toward the orbiter spotter on the projection plotter at the front of the MOCR (39499); President Reagan joking with STS-2 astronauts during space to ground conversation (39500); Mission Specialist/Astronaut Sally K. Ride communicates with the STS-2 crew from the spacecraft communicator console (39501); Charles R. Lewis, bronze team Flight Director, monitors activity from the STS-2 crew. He is seated at the flight director console in MOCR (39502); Eugene F. Kranz, Deputy Director of Flight Operations at JSC answers a question during a press conference on Nov. 13, 1981. He is flanked by Glynn S. Lunney, Manager, Space Shuttle Program Office, JSC; and Dr. Christopher C. Kraft, Jr., Director of JSC (39503).

  17. SD-4060 OCPLT4 program, user's guide

    NASA Technical Reports Server (NTRS)

    Glazer, J.

    1973-01-01

    A brief description of the Orbit Comparison Plot (OCPLT4) program is presented, along with user information and a source program listing. In addition to correcting several errors that existed in the original program, this program incorporates the following new features: (1) For any satellite whose observations are processed by the Definitive Orbit Determination System (DODS), the orbital uncertainty estimates (OUE) can be obtained via appropriate card input with no major modification to the program. (2) All satellite-related information (e.g., plotter scales, cutoff limits, plotting frequencies) is user controlled via card input. (3) Not all components of OUE must be obtained. The user has the option of obtaining only the radial component if there is no need for the other two components. (4) The altitude and time graph formats are controlled by the user and are not stored for specific satellites.

  18. Nova Lupi 2011

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2011-08-01

    Announcement of discovery of Nova Lupi 2011 = PNV J14542000-5505030. Discovered by Nicholas Brown (Quinns Rocks, Western Australia) on 2011 Aug. 4.73 UT at unfiltered mag=10.2 (T-Max 400 film). Posted on the IAU Central Bureau for Astronomical Telegrams Transient Object Confirmation Page (TOCP) as PNV J14542000-5505030. Spectra obtained by Fred Walter (SUNY Stony Brook) 2011 August 9.0132 UT with the SMARTS 1.5m RC spectrograph at Cerro Tololo and reported in ATel #3536 confirm that the object is an Fe II nova near maximum. Initially announced in [vsnet-alert 13560] (Nicholas Brown) and in AAVSO Special Notice #247 (Arne Henden). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and observations.

  19. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  20. EE Cep observations requested for upcoming eclipse

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2014-07-01

    The AAVSO requests observations for the upcoming eclipse of EE Cephei, a long-period eclipsing variable. EE Cep has a period of 2,050 days, and shows strong variations in the eclipse light curve from one event to the next. Observations are needed to study the morphology of the upcoming eclipse, which will be used to better understand the shape of the eclipsing disk and how it precesses. Mid-eclipse is predicted to be August 23, 2014, but the early stages of the eclipse may begin as much as a month earlier. EE Cep is being observed by a number of amateur and professional astronomers using multiple telescopes at multiple wavelengths. Among these is a collaboration (see https://sites.google.com/site/eecep2014campaign/) headed by Cezary Galan at the Nicolaus Copernicus Astronomical Center in Poland; several individual AAVSO observers are already participating in this effort. The AAVSO is not currently a partner in that campaign, but all data submitted to the AAVSO will be publicly available. The AAVSO strongly encourages observers to begin following this star now, and to continue observations into October 2014 at least. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and observations.

  1. Finite difference model for aquifer simulation in two dimensions with results of numerical experiments

    USGS Publications Warehouse

    Trescott, Peter C.; Pinder, George Francis; Larson, S.P.

    1976-01-01

    The model will simulate ground-water flow in an artesian aquifer, a water-table aquifer, or a combined artesian and water-table aquifer. The aquifer may be heterogeneous and anisotropic and have irregular boundaries. The source term in the flow equation may include well discharge, constant recharge, leakage from confining beds in which the effects of storage are considered, and evapotranspiration as a linear function of depth to water. The theoretical development includes presentation of the appropriate flow equations and derivation of the finite-difference approximations (written for a variable grid). The documentation emphasizes the numerical techniques that can be used for solving the simultaneous equations and describes the results of numerical experiments using these techniques. Of the three numerical techniques available in the model, the strongly implicit procedure, in general, requires less computer time and has fewer numerical difficulties than do the iterative alternating direction implicit procedure and line successive overrelaxation (which includes a two-dimensional correction procedure to accelerate convergence). The documentation includes a flow chart, program listing, an example simulation, and sections on designing an aquifer model and requirements for data input. It illustrates how model results can be presented on the line printer and pen plotters with a program that utilizes the graphical display software available from the Geological Survey Computer Center Division. In addition the model includes options for reading input data from a disk and writing intermediate results on a disk.
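
    As a schematic illustration of the iterative-solution idea behind such models (not the model's actual SIP, ADI, or LSOR implementations), a point successive over-relaxation sweep for a homogeneous, isotropic, steady-state 2-D case might look as follows; the grid size, boundary heads, and relaxation factor are invented:

        # Point SOR for the 2-D steady-state flow equation on a uniform grid
        # with fixed-head boundaries; all values are hypothetical.
        import numpy as np

        h = np.zeros((40, 40))          # hydraulic head, m
        h[0, :] = 10.0                  # fixed-head boundary at 10 m
        omega = 1.7                     # over-relaxation factor (assumed)

        for _ in range(2000):           # SOR sweeps
            for i in range(1, 39):
                for j in range(1, 39):
                    resid = 0.25 * (h[i+1, j] + h[i-1, j]
                                    + h[i, j+1] + h[i, j-1]) - h[i, j]
                    h[i, j] += omega * resid

        print(f"head at grid centre: {h[20, 20]:.2f} m")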

  2. Request for Observations of V405 Peg

    NASA Astrophysics Data System (ADS)

    Templeton, Matthew R.

    2009-12-01

    Dr. Axel Schwope (Astrophysikalisches Institut Potsdam) requests time-series monitoring of the magnetic cataclysmic variable V405 Pegasi from 2009 December 28 through 2009 December 30. These observations are requested in support of a planned XMM-Newton observation of V405 Peg on 2009 December 29 beginning at 18:51 UT (JD 2455195.2854) and continuing for 12.5 hours. Observers are asked to provide intensive coverage during the three-day window centered on the XMM-Newton observation to provide information on the activity state of V405 Peg, to improve the orbital ephemeris, and to provide optical data that will help constrain the spectral energy distribution of this poorly understood cataclysmic variable. The primary filters for this observation are Johnson B and Cousins I, but all observations will be useful for determining the orbital ephemeris. V405 Peg may show both orbital modulation as well as changes in its activity level. The orbital period is approximately four hours, and observers are asked to obtain at least ten and preferably more data points per cycle in each filter. Please use exposure times that provide S/N of at least 20 in both the comparison and target stars, but short exposure times are preferred to detect flickering and other short-timescale variations. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  3. An anesthesia information system for monitoring and record keeping during surgical anesthesia.

    PubMed

    Klocke, H; Trispel, S; Rau, G; Hatzky, U; Daub, D

    1986-10-01

    We have developed an anesthesia information system (AIS) that supports the anesthesiologist in monitoring and recording during a surgical operation. In development of the system, emphasis was placed on providing an anesthesiologist-computer interface that can be adapted to typical situations during anesthesia and to individual user behavior. One main feature of this interface is the integration of the input and output of information. The only device for interaction between the anesthesiologist and the AIS is a touch-sensitive, high-resolution color display screen. The anesthesiologist enters information by touching virtual function keys displayed on the screen. A data window displays all data generated over time, such as automatically recorded vital signs, including blood pressure, heart rate, and rectal and esophageal temperatures, and manually entered variables, such as administered drugs and ventilator settings. The information gathered by the AIS is presented on the cathode ray tube in several pages. A main distributor page gives an overall view of the content of every work page. A one-page record of the anesthesia is automatically plotted on a multicolor digital plotter during the operation. An example of the use of the AIS is presented from a field test of the system, during which it was evaluated in the operating room without interfering with the ongoing operation. Medical staff who used the AIS imitated the anesthesiologist's recording and information search behavior but did not have responsibility for the conduct of the anesthetic.

  4. Data acquisition system of 16-channel EEG based on ATSAM3X8E ARM Cortex-M3 32-bit microcontroller and ADS1299

    NASA Astrophysics Data System (ADS)

    Toresano, L. O. H. Z.; Wijaya, S. K.; Prawito, Sudarmaji, A.; Badri, C.

    2017-07-01

    The prototype of the EEG (electroencephalogram) instrumentation system has been developed based on the 32-bit Cortex-M3 ATSAM3X8E microcontroller and the Analog Front-End (AFE) ADS1299 (Texas Instruments, USA), and consists of 16 channels of dry electrodes in the form of an EEG head-cap. The ADS1299 AFE has been designed on a double-layer PCB (Printed Circuit Board) in a daisy-chain configuration. The communication protocol of the prototype is based on SPI (Serial Peripheral Interface) and was tested using a USB SPI logic analyzer, Hantek 4032L (Qingdao Hantek Electronic, China). The acquired 16-channel data from this prototype have been successfully transferred to a PC (Personal Computer) with accuracy greater than 91%. The data acquisition system provides a time-domain display in a multi-graph plotter, a frequency-domain display based on FFT (Fast Fourier Transform) calculation, and a 16-channel brain-mapping display. The GUI (Graphical User Interface) was developed based on OpenBCI (Brain Computer Interface) using Java Processing, and data can also be stored in *.txt format. The instrumentation system has been tested in the frequency range of 1-50 Hz using a MiniSim 330 EEG Simulator (NETECH, USA). The validation process was done at frequencies of 0.1 Hz, 2 Hz, 5 Hz, and 50 Hz, and at voltage amplitudes of 10 µV, 30 µV, 50 µV, 100 µV, 500 µV, 1 mV, 2 mV, and 2.5 mV. However, the acquisition system was not optimal at a frequency of 0.1 Hz, and amplitudes above 1 mV showed differences on the order of 10 µV.
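
    For the frequency-domain display mentioned above, the core step is an FFT of each channel's time series. A minimal sketch, assuming NumPy and a simulated channel; the sampling rate and test signal are illustrative, not taken from the prototype.

    ```python
    import numpy as np

    # Sketch of the frequency-domain display: an FFT of one EEG channel.
    fs = 250.0                       # samples per second (assumed)
    t = np.arange(0, 4.0, 1.0 / fs)  # 4 s window
    # Simulated 10 Hz alpha-band signal, 50 uV amplitude, plus noise
    x = 50e-6 * np.sin(2 * np.pi * 10.0 * t) + 5e-6 * np.random.randn(t.size)

    spectrum = np.fft.rfft(x * np.hanning(x.size))   # windowed FFT
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    amplitude = 2.0 * np.abs(spectrum) / x.size      # rough amplitude scale

    peak = freqs[np.argmax(amplitude)]
    print(f"dominant component: {peak:.1f} Hz")
    ```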

  5. Mission Operations Control Room Activities during STS-2 mission

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Mission Operations Control Room (MOCR) activities during STS-2 mission. President Ronald Reagan and Dr. Christopher C. Kraft, Jr., look toward the orbiter spotter on the projection plotter at the front of the MOCR. Also present are Astronaut Daniel C. Brandenstein, seated left, and NASA Administrator James M. Beggs standing left of center. In the foreground, Dr. Hans Mark, Deputy NASA Administrator, briefs Michael Deaver, Special Assistant to President Reagan (39504); President Reagan speaks to the STS-2 crew during the second day of their mission. On hand in MOCR were NASA Administrator James M. Beggs and Deputy Administrator Hans Mark (standing behind the president but mostly out of frame) and Dr. Kraft on the right. Eugene F. Kranz, Deputy Director of Flight Operations can be seen in the background seated at the Flight Operations Directorate (FOD) console. Also present is Astronaut Daniel C. Brandenstein, seated left, who turned the communications over to Mr. Reagan (39505).

  6. Laserprinter applications in a medical graphics department.

    PubMed

    Lynch, P J

    1987-01-01

    Our experience with the Apple Macintosh and LaserWriter equipment has convinced us that lasergraphics holds much current and future promise in the creation of line graphics and typography for the biomedical community. Although we continue to use other computer graphics equipment to produce color slides and an occasional pen-plotter graphic, the most rapidly growing segment of our graphics workload is in material well-suited to production on the Macintosh/LaserWriter system. At present our goal is to integrate all of our computer graphics production (color slides, video paint graphics and monochrome print graphics) into a single Macintosh-based system within the next two years. The software and hardware currently available are capable of producing a wide range of science graphics very quickly and inexpensively. The cost-effectiveness, versatility and relatively low initial investment required to install this equipment make it an attractive alternative for cost-recovery departments just entering the field of computer graphics.

  7. Thermal Transfer Compared To The Fourteen Other Imaging Technologies

    NASA Astrophysics Data System (ADS)

    O'Leary, John W.

    1989-07-01

    A quiet revolution in the world of imaging has been underway for the past few years. The older technologies of dot matrix, daisy wheel, thermal paper and pen plotters have been increasingly displaced by laser, ink jet and thermal transfer. The net result of this revolution is improved technologies that afford superior imaging, quiet operation, plain paper usage, instant operation, and solid state components. Thermal transfer is one of the processes that incorporates these benefits. Among the imaging applications for thermal transfer are: 1. Bar code labeling and scanning. 2. New systems for airline ticketing, boarding passes, reservations, etc. 3. Color computer graphics and imaging. 4. Copying machines that copy in color. 5. Fast growing communications media such as facsimile. 6. Low cost word processors and computer printers. 7. New devices that print pictures from video cameras or television sets. 8. Cameras utilizing computer chips in place of film.

  8. Supernova 2011by in NGC 3972 = Psn J11554556+5519338

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2011-04-01

    Announces discovery of SN 2011by = PSN J11554556+5519338 by Zhangwei Jin (Ningbo, Zhejiang, China) and Xing Gao (Urumqi, Xinjiang, China) on 2011 Apr. 26.8234 UT at magnitude ~14.2 (unfiltered CCD). Spectra obtained on 2011 Apr. 27.5 UT by T. Zhang and Z. Zhou (National Astronomical Observatories of China) and X. Wang (Tsinghua Center for Astrophysics, Tsinghua University) show SN 2011by to be a type-Ia supernova about 10 days before maximum. Initially announced in IAU CBAT Central Bureau Electronic Telegrams 2708 (Daniel W. E. Green, ed.). The object was designated PSN J11554556+5519338 when posted on the Central Bureau's Transient Objects Confirmation Page (TOCP) webpage. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details, observations, and links to images.

  9. Topographic mapping of the Moon

    USGS Publications Warehouse

    Wu, S.S.C.

    1985-01-01

    Contour maps of the Moon have been compiled by photogrammetric methods that use stereoscopic combinations of all available metric photographs from the Apollo 15, 16, and 17 missions. The maps utilize the same format as the existing NASA shaded-relief Lunar Planning Charts (LOC-1, -2, -3, and -4), which have a scale of 1:2 750 000. The map contour interval is 500m. A control net derived from Apollo photographs by Doyle and others was used for the compilation. Contour lines and elevations are referred to the new topographic datum of the Moon, which is defined in terms of spherical harmonics from the lunar gravity field. Compilation of all four LOC charts was completed on analytical plotters from 566 stereo models of Apollo metric photographs that cover approximately 20% of the Moon. This is the first step toward compiling a global topographic map of the Moon at a scale of 1:5 000 000. © 1985 D. Reidel Publishing Company.

  10. Art in the Digital Age

    PubMed

    2016-01-01

    The genre of “computer art” began in the 1950s, when long exposure photography was used to capture images created by an oscilloscope manipulating electronic waves on a small fluorescent screen. Through the 1960s, most works of computer art were created using plotters and impact printers by the scientists and engineers who had access to emerging computing technology. By the 1970s, artists were learning to program, and some universities began to integrate computers into the fine arts curriculum. The widespread adoption of computers and the availability of off-the-shelf paint programs in the 1980s brought computer art to the masses. At the same time, computer graphics and special effects were beginning their takeover of the entertainment industry through Hollywood films, TV shows, and video games. By the 1990s, the term computer art was fading, and computers were becoming a mainstream part of arts and entertainment.

  11. ObsPy - A Python Toolbox for Seismology - and Applications

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Barsch, R.; MacCarthy, J.; Lecocq, T.; Koymans, M. R.; Carothers, L.; Eulenfeld, T.; Reyes, C. G.; Falco, N.; Sales de Andrade, E.

    2017-12-01

    Recent years witnessed the evolution of Python's ecosystem into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It is a Python toolbox offering: read and write support for essentially every commonly used data format in seismology, with a unified interface and automatic format detection, covering waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, SC3ML, …) and event metadata (QuakeML, ZMAP, …); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, ...); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality such as travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than eight years and is developed and used by scientists around the world, with successful applications in all branches of seismology. It nowadays also serves as the foundation for a large number of more specialized packages. The newest features include: full interoperability of SEED and StationXML/Inventory objects; access to the Nominal Response Library (NRL) for easy and quick creation of station metadata from scratch; support for the IRIS Federated Catalog Service; improved performance of the EarthWorm client; several improvements to the MiniSEED read/write module; improved plotting capabilities for PPSD (spectrograms, PSD of discrete frequencies over time, ...); and support for reading ArcLink Inventory XML, reading the Reftek data format, writing SeisComp3 ML (SC3ML), and writing the StationTXT format. This presentation will give a short overview of the capabilities of ObsPy, point out several representative or new use cases, and showcase some projects that are based on ObsPy, e.g., seismo-live.org, Seedlink-plotter, MSNoise, and others.
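
    As an illustration of the unified interface described above, here is a short ObsPy usage sketch; the station and time window are arbitrary examples, not taken from the abstract.

    ```python
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    # Fetch an hour of waveform data from the IRIS FDSN web service,
    # band-pass filter it, and plot it.
    client = Client("IRIS")
    t0 = UTCDateTime("2017-01-01T00:00:00")
    st = client.get_waveforms(network="IU", station="ANMO", location="00",
                              channel="BHZ", starttime=t0, endtime=t0 + 3600)

    st.detrend("linear")
    st.filter("bandpass", freqmin=0.1, freqmax=1.0)
    print(st)     # one Trace with its metadata
    st.plot()     # opens a matplotlib waveform plot
    ```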

  12. Antarctic Meteorite Location and Mapping Project (AMLAMP): Antarctic meteorite location map series explanatory text and user's guide to AMLAMP data

    NASA Technical Reports Server (NTRS)

    Schutt, J.; Fessler, B.; Cassidy, W. A.

    1993-01-01

    This technical report is an update to LPI Technical Report 89-02, which contained data and information that was current to May 1987. Since that time approximately 4000 new meteorites have been collected, mapped, and characterized, mainly from the numerous ice fields in the Allan Hills-David Glacier region, from the Pecora Escarpment and Moulton Escarpment in the Thiel Mountains-Patuxent region, the Wisconsin Range region, and from the Beardmore region. Meteorite location maps for ice fields from these regions have been produced and are available. This report includes explanatory texts for the maps of new areas and provides information on updates of maps of the areas covered in LPI Technical Report 89-02. Sketch maps and description of locales that have been searched and have yielded single or few meteorites are also included. The meteorite listings for all the ice fields have been updated to include any classification changes and new meteorites recovered from ice fields in the Allan Hills-David Glacier region since 1987. The text has been reorganized and minor errors in the original report have been corrected. Computing capabilities have improved immensely since the early days of this project. Current software and hardware allow easy access to data over computer networks. With various commercial software packages, the data can be used many different ways, including database creation, statistics, and mapping. The databases, explanatory texts, and the plotter files used to produce the meteorite location maps are available through a computer network. Information on how to access AMLAMP data, its formats, and ways it can be used are given in the User's Guide to AMLAMP Data section. Meteorite location maps and thematic maps may be ordered from the Lunar and Planetary Institute. Ordering information is given in Appendix A.

  13. Oncoprotein HBXIP enhances HOXB13 acetylation and co-activates HOXB13 to confer tamoxifen resistance in breast cancer.

    PubMed

    Liu, Bowen; Wang, Tianjiao; Wang, Huawei; Zhang, Lu; Xu, Feifei; Fang, Runping; Li, Leilei; Cai, Xiaoli; Wu, Yue; Zhang, Weiying; Ye, Lihong

    2018-02-23

    Resistance to tamoxifen (TAM) frequently occurs in the treatment of estrogen receptor positive (ER+) breast cancer. Accumulating evidence indicates that the transcription factor HOXB13 is of great significance in TAM resistance. However, the regulation of HOXB13 in TAM-resistant breast cancer remains largely unexplored. Here, we were interested in the potential effect of HBXIP, an oncoprotein involved in the acceleration of cancer progression, on the modulation of HOXB13 in TAM resistance of breast cancer. The Kaplan-Meier plotter cancer database and a GEO dataset were used to analyze the association between HBXIP expression and relapse-free survival. The correlation of HBXIP and HOXB13 in ER+ breast cancer was assessed by human tissue microarray. Immunoblotting analysis, qRT-PCR assay, immunofluorescence staining, Co-IP assay, ChIP assay, luciferase reporter gene assay, cell viability assay, and colony formation assay were performed to explore the possible molecular mechanism by which HBXIP modulates HOXB13. Cell viability assay, xenograft assay, and immunohistochemistry staining analysis were utilized to evaluate the effect of the HBXIP/HOXB13 axis on the facilitation of TAM resistance in vitro and in vivo. The analysis of the Kaplan-Meier plotter and the GEO dataset showed that mono-TAM-treated breast cancer patients with higher HBXIP expression levels had shorter relapse-free survivals than patients with lower HBXIP expression levels. Overexpression of HBXIP induced TAM resistance in ER+ breast cancer cells. The tissue microarray analysis revealed a positive association between the expression levels of HBXIP and HOXB13 in ER+ breast cancer patients. HBXIP elevated the HOXB13 protein level in breast cancer cells. Mechanistically, HBXIP prevented chaperone-mediated autophagy (CMA)-dependent degradation of HOXB13 via enhancement of HOXB13 acetylation at the lysine 277 residue, causing the accumulation of HOXB13. Moreover, HBXIP was able to act as a co-activator of HOXB13 to stimulate interleukin (IL)-6 transcription in the promotion of TAM resistance. Interestingly, aspirin (ASA) suppressed the HBXIP/HOXB13 axis by decreasing HBXIP expression, overcoming TAM resistance in vitro and in vivo. Our study highlights that HBXIP enhances HOXB13 acetylation to prevent HOXB13 degradation and co-activates HOXB13 in the promotion of TAM resistance of breast cancer. Therapeutically, ASA can serve as a potential candidate for reversing TAM resistance by inhibiting HBXIP expression.
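
    The relapse-free-survival comparison described above is a standard Kaplan-Meier/log-rank analysis. A minimal sketch, assuming the lifelines Python package and simulated data (not the study's cohort):

    ```python
    import numpy as np
    import matplotlib.pyplot as plt
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Split simulated patients by expression level and compare survival.
    rng = np.random.default_rng(0)
    n = 200
    high = rng.integers(0, 2, n).astype(bool)     # "high expression" group
    time = rng.exponential(60 - 25 * high)        # months to relapse (simulated)
    event = rng.random(n) < 0.7                   # True = relapse observed

    kmf = KaplanMeierFitter()
    for name, mask in [("high expression", high), ("low expression", ~high)]:
        kmf.fit(time[mask], event_observed=event[mask], label=name)
        kmf.plot_survival_function()

    res = logrank_test(time[high], time[~high], event[high], event[~high])
    print(f"log-rank p = {res.p_value:.3g}")
    plt.show()
    ```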

  14. The Mars mapper science and mission planning tool

    NASA Technical Reports Server (NTRS)

    Lo, Martin W.

    1993-01-01

    The Mars Mapper Program (MOm) is an interactive tool for science and mission design developed for the Mars Observer Mission (MO). MOm is a function of the Planning and Sequencing Element of the MO Ground Data System. The primary users of MOm are members of the science and mission planning teams. Using MOm, the user can display digital maps of Mars in various projections and resolutions ranging from 1 to 256 pixels per degree squared. The user can overlay the maps with ground tracks of the MO spacecraft (S/C) and footprints and swaths of the various instruments on-board the S/C. Orbital and instrument geometric parameters can be computed on demand and displayed on the digital map or plotted in XY-plots. The parameter data can also be saved into files for other uses. MOm is divided into 3 major processes: Generator, Mapper, Plotter. The Generator Process is the main control which spawns all other processes. The processes communicate via sockets. At any one time, only 1 copy of MOm may operate on the system. However, up to 5 copies of each of the major processes may be invoked from the Generator. MOm is developed on the Sun SPARCStation 2GX with menu driven graphical user interface (GUI). The map window and its overlays are mouse-sensitized to permit on-demand calculations of various parameters along an orbit. The program is currently under testing and will be delivered to the MO Mission System Configuration Management for distribution to the MO community in 3/93.

  15. Digital classification of Landsat data for vegetation and land-cover mapping in the Blackfoot River watershed, southeastern Idaho

    USGS Publications Warehouse

    Pettinger, L.R.

    1982-01-01

    This paper documents the procedures, results, and final products of a digital analysis of Landsat data used to produce a vegetation and land-cover map of the Blackfoot River watershed in southeastern Idaho. Resource classes were identified at two levels of detail: generalized Level I classes (for example, forest land and wetland) and detailed Levels II and III classes (for example, conifer forest, aspen, wet meadow, and riparian hardwoods). Training set statistics were derived using a modified clustering approach. Environmental stratification that separated uplands from lowlands improved discrimination between resource classes having similar spectral signatures. Digital classification was performed using a maximum likelihood algorithm. Classification accuracy was determined on a single-pixel basis from a random sample of 25-pixel blocks. These blocks were transferred to small-scale color-infrared aerial photographs, and the image area corresponding to each pixel was interpreted. Classification accuracy, expressed as percent agreement of digital classification and photo-interpretation results, was 83.0 ± 2.1 percent (0.95 probability level) for generalized (Level I) classes and 52.2 ± 2.8 percent (0.95 probability level) for detailed (Levels II and III) classes. After the classified images were geometrically corrected, two types of maps were produced of Level I and Levels II and III resource classes: color-coded maps at a 1:250,000 scale, and flatbed-plotter overlays at a 1:24,000 scale. The overlays are more useful because of their larger scale, familiar format to users, and compatibility with other types of topographic and thematic maps of the same scale.
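
    A minimal sketch of the per-pixel maximum likelihood step named above, assuming SciPy and two simulated spectral bands; the classes, training statistics, and pixels are illustrative only.

    ```python
    import numpy as np
    from scipy.stats import multivariate_normal

    # Fit a Gaussian to each class's training pixels and assign every
    # pixel to the class with the highest likelihood.
    rng = np.random.default_rng(1)
    train = {
        "conifer forest": rng.normal([40, 80], 5, size=(100, 2)),
        "wet meadow": rng.normal([70, 60], 8, size=(100, 2)),
    }
    models = {name: multivariate_normal(mean=px.mean(axis=0), cov=np.cov(px.T))
              for name, px in train.items()}

    pixels = rng.normal([50, 70], 15, size=(5, 2))   # pixels to classify
    for p in pixels:
        label = max(models, key=lambda c: models[c].logpdf(p))
        print(p.round(1), "->", label)
    ```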

  16. PLOT3D- DRAWING THREE DIMENSIONAL SURFACES

    NASA Technical Reports Server (NTRS)

    Canright, R. B.

    1994-01-01

    PLOT3D is a package of programs to draw three-dimensional surfaces of the form z = f(x,y). The function f and the boundary values for x and y are the input to PLOT3D. The surface thus defined may be drawn after arbitrary rotations. However, it is designed to draw only functions in rectangular coordinates expressed explicitly in the above form. It cannot, for example, draw a sphere. Output is by off-line incremental plotter or on-line microfilm recorder. This package, unlike other packages, will plot any function of the form z = f(x,y) and portrays continuous and bounded functions of two independent variables. With curve fitting, however, it can draw experimental data and pictures which cannot be expressed in the above form. The method used is division of the given x and y ranges into a uniform rectangular grid. The values of the supplied function at the grid points (x,y) are calculated and stored; this defines the surface. The surface is portrayed by connecting successive (y,z) points with straight-line segments for each x value on the grid and, in turn, connecting successive (x,z) points for each fixed y value on the grid. These lines are then projected by parallel projection onto the fixed yz-plane for plotting. This program has been implemented on the IBM 360/67 with an on-line CDC microfilm recorder.
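
    A modern analogue of the method described, assuming matplotlib: evaluate z = f(x, y) on a uniform rectangular grid and draw the lines of constant x and constant y as a wireframe, much as the package connected successive grid points with straight-line segments before projecting them.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    def f(x, y):
        return np.sin(np.sqrt(x**2 + y**2))   # example function (assumed)

    # Uniform rectangular grid over the given x and y ranges
    x = np.linspace(-6, 6, 40)
    y = np.linspace(-6, 6, 40)
    X, Y = np.meshgrid(x, y)
    Z = f(X, Y)

    ax = plt.figure().add_subplot(projection="3d")
    ax.view_init(elev=30, azim=-60)            # the "arbitrary rotation"
    ax.plot_wireframe(X, Y, Z, rstride=1, cstride=1, linewidth=0.5)
    plt.show()
    ```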

  17. Mars synthetic topographic mapping

    USGS Publications Warehouse

    Wu, S.S.C.

    1978-01-01

    Topographic contour maps of Mars are compiled by the synthesis of data acquired from various scientific experiments of the Mariner 9 mission, including S-band radio-occultation, the ultraviolet spectrometer (UVS), the infrared radiometer (IRR), the infrared interferometer spectrometer (IRIS) and television imagery, as well as Earth-based radar information collected at Goldstone, Haystack, and Arecibo Observatories. The entire planet is mapped at scales of 1:25,000,000 and 1:25,000,000 using Mercator, Lambert, and polar stereographic map projections. For the computation of map projections, a biaxial spheroid figure is adopted. The semimajor and semiminor axes are 3393.4 and 3375.7 km, respectively, with a polar flattening of 0.0052. For the computation of elevations, a topographic datum is defined by a gravity field described in terms of spherical harmonics of fourth order and fourth degree combined with a 6.1-mbar occultation pressure surface. This areoid can be approximated by a triaxial ellipsoid with semimajor axes of A = 3394.6 km and B = 3393.3 km and a semiminor axis of C = 3376.3 km. The semimajor axis A intersects the Martian surface at longitude 105°W. The dynamic flattening of Mars is 0.00525. The contour interval of the maps is 1 km. For some prominent features where overlapping pictures from Mariner 9 are available, local contour maps at relatively larger scales were also compiled by photogrammetric methods on stereo plotters. © 1978.
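
    The quoted polar flattening is consistent with the stated spheroid axes; as a worked check:

    ```latex
    f = \frac{a - c}{a}
      = \frac{3393.4\ \mathrm{km} - 3375.7\ \mathrm{km}}{3393.4\ \mathrm{km}}
      \approx 0.0052
    ```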

  18. Observations of CI Cam needed to support spectroscopy

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-10-01

    Kelly Gourdji and Marcella Wijngaarden (graduate students at the University of Amsterdam/Anton Pannekoek Institute for Astronomy) have requested AAVSO observers' assistance in providing optical photometry of CI Cam in support of their high-resolution spectroscopy from now through January 2017. They write: "...We are currently observing the variable star CI Cam (the B[e] optical counterpart of a HMXB system) with the HERMES spectrograph at the Mercator Telescope in La Palma. Having observed the star for three nights now, the object appears to be in outburst. In particular, H alpha was measured to be 80 times the continuum flux, and increasing between Oct. 9 and 12. This is similar to the previous outburst in 2004/5. Photometric data obtained during the 2004/5 outburst suggested an outburst duration of about 3 months and a peak brightness of 11.2 in the V band." More information is available in ATel #9634 (Wijngaarden et al.). Multiple snapshot observations per night in BVRI are requested beginning immediately and continuing through January 2017. Time series are not necessary unless requested later via an AAVSO Special Notice. Observations made using other filters will be useful as well as long as there are multiple observations in these bands. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  19. Direct Simple Shear Test Data Analysis using Jupyter Notebooks on DesignSafe-CI

    NASA Astrophysics Data System (ADS)

    Eslami, M.; Esteva, M.; Brandenberg, S. J.

    2017-12-01

    Due to the large number of files and their complex structure, managing data generated during natural hazards experiments requires scalable and specialized tools. DesignSafe-CI (https://www.designsafe-ci.org/) is a web-based research platform that provides computational tools to analyze, curate, and publish critical data for natural hazards research, making it understandable and reusable. We present a use case from a series of Direct Simple Shear (DSS) experiments in which we used DS-CI to post-process, visualize, publish, and enable further analysis of the data. Current practice in geotechnical design against earthquakes relies on the soil's plasticity index (PI) to assess liquefaction susceptibility and cyclic softening triggering procedures, although quite divergent recommendations on levels of plasticity can be found in the literature for these purposes. A series of cyclic and monotonic direct simple shear experiments was conducted on three low-plasticity fine-grained mixtures at the same plasticity index to examine the effectiveness of the PI in characterization of these types of materials. Results revealed that plasticity index is an insufficient indicator of the cyclic behavior of low-plasticity fine-grained soils, and corrections for pore fluid chemistry and clay mineralogy may be necessary for future liquefaction susceptibility and cyclic softening assessment procedures. Each monotonic or cyclic experiment contains two stages, consolidation and shear, which include time series of load, displacement, and corresponding stresses and strains, as well as equivalent excess pore-water pressure. Using the DS-CI curation pipeline we categorized the data to display and describe the experiment's structure and the files corresponding to each stage of the experiments. Two separate notebooks in Python 3 were created using the Jupyter application available in DS-CI. A data plotter aids in visualizing the experimental data in relation to the sensor from which it was generated. The analysis notebook allows combining outcomes of multiple tests, conducting diverse analyses to find critical parameters, and developing plots at arbitrary strain levels. Using the platform aids both the researchers working with the data and those reusing it.
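
    A minimal sketch of the plotting notebook's core step, with hypothetical file and column names (the actual DesignSafe-CI schema may differ): read one shear-stage time series, derive stress and strain, and plot.

    ```python
    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical export of one DSS shear stage; names are assumptions.
    df = pd.read_csv("shear_stage.csv")
    specimen_height_mm = 20.0     # specimen height (assumed)
    specimen_area_m2 = 0.008      # specimen cross-section (assumed)

    # Shear stress in kPa = load in kN / area in m^2
    df["shear_stress_kPa"] = df["load_kN"] / specimen_area_m2
    df["shear_strain_pct"] = 100.0 * df["displacement_mm"] / specimen_height_mm

    fig, ax = plt.subplots()
    ax.plot(df["shear_strain_pct"], df["shear_stress_kPa"])
    ax.set_xlabel("shear strain (%)")
    ax.set_ylabel("shear stress (kPa)")
    plt.show()
    ```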

  20. COMPUTER DATA PROCESSING SYSTEM. PROJECT ROVER, 1962

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narin, F.

    A system was created for processing large volumes of data from Project ROVER tests at the Nevada Test Site. The data are compiled as analog, frequency modulated tape, which is translated in a Packard-Bell Tape-to-Tape converter into a binary coded decimal (BCD) IBM 7090 computer input tape. This input tape, tape A5, is processed on the 7090 by the RDH-D FORTRAN-II code and its 20 FAP and FORTRAN subroutines. Outputs from the 7090 run are tape A3, which is a BCD tape used for listing on the IBM 1401 input-output computer, tape B5, which is a binary tape used as input to a Stromberg-Carlson 40/20 cathode ray tube (CRT) plotter, and tape B6, which is a binary tape used for permanent data storage and input to specialized subcodes. The information on tape B5 commands the 40/20 to write grids, data points, and other information on the face of a CRT; the information on the CRT is photographed on 35 mm film which is subsequently developed; full-size (10" x 10") plots are made from the 35 mm film on a Xerox 1824 printer. The 7090 processes a data channel in approximately 4 seconds plus 4 seconds per plot to be made on the 40/20 for that channel. Up to 4500 data and calibration points on any one channel may be processed in one pass of the RDH-D code. This system has been used to produce more than 100,000 prints on the 1824 printer from more than 10,000 different 40/20 plots. At 00 per minute of 7090 time, it costs 60 to process a typical, 3-plot data channel on the 7090; each print on the 1824 costs between 5 and 10 cents including rental, supplies, and operator time. All automatic computer stops in the codes and subroutines are accompanied by on-line instructions to the operator. Extensive redundancy checking is incorporated in the FAP tape handling subroutines. (auth)

  1. Interactive computer methods for generating mineral-resource maps

    USGS Publications Warehouse

    Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.

    1980-01-01

    Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used (batch processing and off-line plotting), the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: 1. Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and 2. The Interactive CAM (Cartographic Automatic Mapping system), which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in the batch mode and stores plotting instructions on a disk, rather than on a tape. The disk can be accessed by a CRT, and, thus, the user can view and evaluate the map output on a CRT immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can, therefore, do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.

  2. Request for regular monitoring of the symbiotic variable RT Cru

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2014-08-01

    Dr. Margarita Karovska (Harvard-Smithsonian Center for Astrophysics) and colleagues have requested AAVSO observer assistance in their campaign on the symbiotic variable RT Cru (member of a new class of hard X-ray emitting symbiotic binaries). Weekly or more frequent monitoring (B, V, and visual) beginning now is requested in support of upcoming Chandra observations still to be scheduled. "We plan Chandra observations of RT Cru in the near future that will help us understand the characteristics of the accretion onto the white dwarf in this sub-class of symbiotics. This is an important step for determining the precursor conditions for formation of a fraction of asymmetric Planetary Nebulae, and the potential of symbiotic systems as progenitors of at least a fraction of Type Ia supernovae." Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and observations.

  3. Rotordynamics on the PC: Transient Analysis With ARDS

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    1997-01-01

    Personal computers can now do many jobs that formerly required a large mainframe computer. An example is NASA Lewis Research Center's program Analysis of RotorDynamic Systems (ARDS), which uses the component mode synthesis method to analyze the dynamic motion of up to five rotating shafts. As originally written in the early 1980's, this program was considered large for the mainframe computers of the time. ARDS, which was written in Fortran 77, has been successfully ported to a 486 personal computer. Plots appear on the computer monitor via calls programmed for the original CALCOMP plotter; plots can also be output on a standard laser printer. The executable code, which uses the full array sizes of the mainframe version, easily fits on a high-density floppy disk. The program runs under DOS with an extended memory manager. In addition to transient analysis of blade loss, step turns, and base acceleration, with simulation of squeeze-film dampers and rubs, ARDS calculates natural frequencies and unbalance response.

  4. V390 Nor = Nova Normae 2007

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2007-06-01

    Nova Normae 2007 was discovered photographically by William Liller on June 15.086 UT at magnitude 9.4. Precise position measured by G. Bolt from his unfiltered CCD image of June 16.7 UT: 16:32:11.51 -45:09:13.4 (2000.0). Giorgio Di Scala reported to the AAVSO that a low-resolution spectrum indicates a nova a week or so after outburst, with strong H-alpha emission. E. Kazarovets, Sternberg Astronomical Institute, reports that N Nor 07 has been assigned the name V390 Nor. Discovery originally announced in IAU Central Bureau Electronic Telegram 982 (Daniel W. E. Green) and AAVSO Special Notice #49 (Arne Henden). Information in this Alert Notice was received at AAVSO from William Liller, Giorgio Di Scala, or via IAU Circular No. 8850, ed. Daniel W. E. Green. A chart for V390 Nor is available via the Variable Star Plotter (VSP). Go to: http://www.aavso.org/observing/charts/vsp/ and enter the name V390 NOR.

  5. Nova Scorpii 2011 = PNV J16551100-3838120

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2011-06-01

    Announces the discovery of Nova Scorpii 2011 = PNV J16551100-3838120 by John Seach (Chatsworth Island, NSW, Australia) on 2011 June 1.40 UT at magnitude 9.5 (DSLR + orange filter). Spectra by Bernard Heathcote (South Yarra, Vic, Australia) on Jun 2.4896 UT, A. Arai, T. Kajikawa, and M. Nagashima (Kyoto Sangyo University, Japan) on 2011 June 2.68 UT, and Masayuki Yamanaka and Ryosuke Itoh (Hiroshima University, Japan) on Jun 2 UT indicate a highly-reddened classical nova. Initially reported to the AAVSO by Seach and announced in AAVSO Special Notice #240 (Arne Henden) and IAU CBET 2735 (Daniel W. E. Green, ed.). The object was designated PNV J16551100-3838120 when posted on the Central Bureau's Transient Objects Confirmation Page (TOCP) webpage. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details, observations, and links to images. [Nova Sco 2011 subsequently assigned the name V1312 Sco.]

  6. The Position Control of the Surface Motor with the Poles Distribution of Triangular Lattice

    NASA Astrophysics Data System (ADS)

    Watada, Masaya; Katsuyama, Norikazu; Ebihara, Daiki

    Recently, high performance and high accuracy have been demanded of machine tools and industrial robots. Generally, when motion with many degrees of freedom is required in machine tools or industrial robots, it has been realized by using two or more motors. For example, two-dimensional positioning stages such as the X-Y plotter or the X-Y stage achieve two-dimensional motion by using one motor for each of the x and y directions. Because plural motors are used, however, the equipment becomes large and the control system complicated. To address these problems, the Surface Motor (SFM), which can drive in two directions with only one motor, has been researched. The authors have proposed an SFM designed for wide-range movement and for application to curved surfaces. In this paper, the characteristics of micro-step drive under open-loop control are shown. The introduction of closed-loop control for highly accurate positioning is also examined, and the drive characteristics under each control scheme are compared.

  7. Role of RBP2-Induced ER and IGF1R-ErbB Signaling in Tamoxifen Resistance in Breast Cancer.

    PubMed

    Choi, Hee-Joo; Joo, Hyeong-Seok; Won, Hee-Young; Min, Kyueng-Whan; Kim, Hyung-Yong; Son, Taekwon; Oh, Young-Ha; Lee, Jeong-Yeon; Kong, Gu

    2018-04-01

    Despite the benefit of endocrine therapy, acquired resistance during or after treatment still remains a major challenge in estrogen receptor (ER)-positive breast cancer. We investigated the potential role of the histone demethylase retinoblastoma-binding protein 2 (RBP2) in endocrine therapy resistance of breast cancer. Survival of breast cancer patients according to RBP2 expression was analyzed in three different breast cancer cohorts including METABRIC (n = 1980) and KM plotter (n = 1764). RBP2-mediated tamoxifen resistance was confirmed by in vitro sulforhodamine B (SRB) colorimetric and colony-forming assays, and in vivo xenograft models (n = 8 per group). RNA-seq analysis and receptor tyrosine kinase assay were performed to identify the tamoxifen resistance mechanism by RBP2. All statistical tests were two-sided. RBP2 was associated with poor prognosis to tamoxifen therapy in ER-positive breast cancer (P = .04 in HYU cohort, P = .02 in KM plotter, P = .007 in METABRIC, log-rank test). Furthermore, RBP2 expression was elevated in patients with tamoxifen-resistant breast cancer (P = .04, chi-square test). Knockdown of RBP2 conferred tamoxifen sensitivity, whereas overexpression of RBP2 induced tamoxifen resistance in vitro and in vivo (MCF7 xenograft: tamoxifen-treated control, mean [SD] tumor volume = 70.8 [27.9] mm3, vs tamoxifen-treated RBP2, mean [SD] tumor volume = 387.9 [85.1] mm3, P < .001). Mechanistically, RBP2 cooperated with ER co-activators and corepressors and regulated several tamoxifen resistance-associated genes, including NRIP1, CCND1, and IGFBP4 and IGFBP5. Furthermore, epigenetic silencing of IGFBP4/5 by the RBP2-ER-NRIP1-HDAC1 complex led to insulin-like growth factor-1 receptor (IGF1R) activation. RBP2 also increased IGF1R-ErbB crosstalk and subsequent PI3K-AKT activation via demethylase activity-independent ErbB protein stabilization. Combination treatment with tamoxifen and a PI3K inhibitor could overcome RBP2-mediated tamoxifen resistance (RBP2-overexpressing cells: % cell viability [SD], tamoxifen = 89.0 [3.8]%, vs tamoxifen with BKM120 = 41.3 [5.6]%, P < .001). RBP2 activates the ER-IGF1R-ErbB signaling cascade in multiple ways to induce tamoxifen resistance, suggesting that RBP2 is a potential therapeutic target for ER-driven cancer. © The Author 2017. Published by Oxford University Press. All rights reserved.

  8. Tools for educational access to seismic data

    NASA Astrophysics Data System (ADS)

    Taber, J. J.; Welti, R.; Bravo, T. K.; Hubenthal, M.; Frechette, K.

    2017-12-01

    Student engagement can be increased both by providing easy access to real data and by addressing newsworthy events such as recent large earthquakes. IRIS EPO has a suite of access and visualization tools that can be used for such engagement, including a set of three tools that allow students to explore global seismicity, use seismic data to determine Earth structure, and view and analyze near-real-time ground motion data in the classroom. These tools are linked to online lessons that are designed for use in middle school through introductory undergraduate classes. The IRIS Earthquake Browser allows discovery of key aspects of plate tectonics, earthquake locations (in pseudo 3D), and seismicity rates and patterns. IEB quickly displays up to 20,000 seismic events over up to 30 years, making it one of the most responsive, practical ways to visualize historical seismicity in a browser. Maps are bookmarkable and preserve state, meaning IEB map links can be shared or worked into a lesson plan. The Global Seismogram Plotter automatically creates visually clear seismic record sections from selected large earthquakes that are tablet-friendly and can also be printed for use in a classroom without computers. The plots are designed to be appropriate for use with no parameters to set, but users can also modify the plots, such as including a recording station near a chosen location. A guided exercise is provided where students use the record section to discover the diameter of Earth's outer core. Students can pick and compare phase arrival times onscreen, which is key to performing the exercise. A companion station map shows station locations and further information and is linked to the record section. jAmaSeis displays seismic data in real-time from either a local instrument and/or from remote seismic stations that stream data using standard seismic data protocols, and can be used in the classroom or as a public display. Users can filter data, fit a seismogram to travel time curves, triangulate event epicenters on a globe, estimate event magnitudes, and generate images showing seismograms and corresponding calculations. All three tools access seismic databases curated by IRIS Data Services. In addition, jAmaSeis can also access data from non-IRIS sources.

  9. Observing campaign on 5 variables in Cygnus

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2015-10-01

    Dr. George Wallerstein (University of Washington) has requested AAVSO assistance in monitoring 5 variable stars in Cygnus now through December 2015. He is working to complete the radial velocity curves for these stars, and needs optical light curves for correlation with the spectra he will be obtaining. Wallerstein writes: "I need to know the time of max or min so I can assign a phase to each spectrum. Most classical Cepheids are quite regular so once a time of max or min can be established I can derive the phase of each observation even if my obs are several cycles away from the established max or min. MZ Cyg is a type II Cepheid and they are less regular than their type I cousins." SZ Cyg, X Cyg, VX Cyg, and TX Cyg are all classical Cepheids. V and visual observations are requested. These are long-period Cepheids, so nightly observations are sufficient. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  10. IM Nor monitoring requested for HST COS observations

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-02-01

    Dr. Ed Sion (Villanova University) and colleagues have requested AAVSO observers' assistance in monitoring the symbiotic-type recurrent nova IM Nor in support of observations with the Hubble Space Telescope Cosmic Origins Spectrograph scheduled for 2017 February 13 - 17 UT. These observations are part of a study on short orbital period recurrent novae as Supernovae Type Ia progenitors. It is essential to know 24 hours prior to the HST COS observations that IM Nor is not in outburst, in order to protect the instrumentation. Also, photometry is needed throughout the HST window to ensure knowledge of the brightness of the system. Observers are asked to monitor IM Nor with nightly snapshot images (V preferred) from now through February 20, and to report their observations promptly. It will be especially important to know the brightness of IM Nor each night through February 17 UT. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  11. V5588 SGR = Nova Sagittarii 2011 No. 2 = Pnv J18102135-2305306

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2011-04-01

    Announces the discovery of Nova Sgr 2011 No. 2 = V5588 SGR = PNV J18102135-2305306 by Koichi Nishiyama (Kurume, Japan) and Fujio Kabashima (Miyaki, Japan) on ~2011 March 27.832 UT at unfiltered CCD magnitude 11.7. Spectra obtained by A. Arai, M. Nagashima, T. Kajikawa, and C. Naka (Koyama Astronomical Observatory, Kyoto Sangyo University) on Mar. 28.725 UT suggest that the object is a classical nova reddened by interstellar matter. The object was designated PNV J18102135-2305306 when posted on the Central Bureau's Transient Objects Confirmation Page (TOCP) webpage. E. Kazarovets, on behalf of the GCVS team, reports that the name V5588 Sgr has been assigned to this nova. It was initially announced in CBET 2679 (Daniel W. E. Green, ed.) and AAVSO Special Notice #237 (Waagen). Additional information published in IAU Circular 9203 (Green, ed.). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and observations.

  12. MALAT1 affects ovarian cancer cell behavior and patient survival

    PubMed Central

    Lin, Qunbo; Guan, Wencai; Ren, Weimin; Zhang, Lingyun; Zhang, Jinguo; Xu, Guoxiong

    2018-01-01

    Epithelial ovarian cancer (EOC) is one of the most lethal malignancies of the female reproductive organs. Increasing evidence has revealed that long non-coding RNAs (lncRNAs) participate in tumorigenesis. Metastasis associated lung adenocarcinoma transcript 1 (MALAT1) is an lncRNA and plays a role in various types of tumors. However, the effect of MALAT1 on cellular behavior in EOC remains unclear. The current study explored the expression of MALAT1 in ovarian cancer tissues and in EOC cell lines. Quantitative RT-PCR analysis revealed that the expression of MALAT1 was higher in human ovarian malignant tumor tissues and EOC cells than in normal ovarian tissues and non-tumorous human ovarian surface epithelial cells, respectively. Analysis of the online Kaplan-Meier Plotter database identified MALAT1 as correlated with the overall survival (OS) and progression-free survival (PFS) of patients with ovarian cancer. Furthermore, knockdown of MALAT1 by small interfering RNA (siRNA) significantly decreased EOC cell viability, migration, and invasion. Finally, dual-luciferase reporter assays demonstrated that MALAT1 interacted with miR-143-3p, a miRNA that plays a role in EOC as demonstrated in our previous study. Inhibition of MALAT1 resulted in an increase of miR-143-3p expression, leading to a decrease of CMPK protein expression. In conclusion, our results indicated that MALAT1 was overexpressed in EOC. Silencing of MALAT1 decreased EOC cell viability and inhibited EOC cell migration and invasion. These data revealed that MALAT1 may serve as a new therapeutic target of human EOC. PMID:29693187

  13. US Topo: topographic maps for the nation

    USGS Publications Warehouse

    Carswell, William J.

    2013-01-01

    US Topo is the next generation of topographic maps from the U.S. Geological Survey (USGS). Arranged in the familiar 7.5-minute quadrangle format, digital US Topo maps are designed to look and feel (and perform) like the traditional paper topographic maps for which the USGS is so well known. In contrast to paper-based maps, US Topo maps provide modern technical advantages that support faster, wider public distribution and enable basic, on-screen geographic analysis for all users. The US Topo quadrangle map has been redesigned so that map elements are visually distinguishable with the imagery turned on and off, while keeping the file size as small as possible. The US Topo map redesign includes improvements to various display factors, including symbol definitions (color, line thickness, line symbology, area fills), layer order, and annotation fonts. New features for 2013 include the following: a raster shaded relief layer, military boundaries, cemeteries and post offices, and a US Topo cartographic symbols legend as an attachment. US Topo quadrangle maps are available free on the Web. Each map quadrangle is constructed in GeoPDF® format using key layers of geographic data (orthoimagery, roads, geographic names, topographic contours, and hydrographic features) from The National Map databases. US Topo quadrangle maps can be printed from personal computers or plotters as complete, full-sized maps or in customized sections, in a user-desired specific format. Paper copies of the maps can also be purchased from the USGS Store. Download links and a user's guide are featured on the US Topo Web site. US Topo users can turn geographic data layers on and off as needed; they can zoom in and out to highlight specific features or see a broader area. The file size for each digital 7.5-minute quadrangle is about 30 megabytes. Associated electronic tools for geographic analysis are available free for download. The US Topo provides the Nation with a topographic product that users can quickly incorporate into decisionmaking, operational, or recreational activities.

  14. Sine Oculis Homeobox Homolog 1 Regulates Mitochondrial Apoptosis Pathway Via Caspase-7 In Gastric Cancer Cells.

    PubMed

    Du, Peizhun; Zhao, Jing; Wang, Jing; Liu, Yongchao; Ren, Hong; Patel, Rajan; Hu, Cheng'en; Zhang, Wenhong; Huang, Guangjian

    2017-01-01

    Sine oculis homeobox homolog 1 (Six1) is crucial in normal organ development. Recently, Six1 has been reported to display aberrant expression in various cancers and to play important roles in cancer development. However, the regulatory mechanism of Six1 in gastric cancer is largely unknown. In the current study, we found that Six1 was increased in gastric cancer tissues, and its upregulation was significantly associated with lymph node metastasis (p=0.042) and poor differentiation (p=0.039). Next, we took advantage of publicly available microarray data to assess the prognostic value of Six1 with the online K-M Plotter software in gastric cancer, which demonstrated that patients with higher Six1 expression had shorter survival times (p=0.02). To explore the underlying mechanism of Six1, we silenced its upregulation in gastric cells and examined cellular functions. Our results indicated that Six1 knock-down could decrease colony formation number and render cells sensitive to 5-Fluorouracil drug treatment. Flow cytometry analyses showed that Six1 silencing could promote apoptosis but had little effect on cell cycle transition. Following this clue, we tested mitochondrial membrane potential with a JC-1 assay, which suggested that Six1 inhibition could trigger mitochondrial apoptosis. Our subsequent results revealed that Six1 knock-down could reduce the level of the anti-apoptotic protein Bcl-2, and that caspase-7, but not caspase-3, was involved in executing the mitochondrial apoptosis pathway. Taken together, we find that Six1 has an oncogenic role in gastric cancer development, and silencing Six1 expression can promote mitochondrial apoptosis by repressing Bcl-2 and activating the executor caspase-7. These findings suggest that Six1 may become a valuable prognostic and therapeutic target in gastric cancer.

  15. Sine Oculis Homeobox Homolog 1 Regulates Mitochondrial Apoptosis Pathway Via Caspase-7 In Gastric Cancer Cells

    PubMed Central

    Du, Peizhun; Zhao, Jing; Wang, Jing; Liu, Yongchao; Ren, Hong; Patel, Rajan; Hu, Cheng'en; Zhang, Wenhong; Huang, Guangjian

    2017-01-01

    Sine oculis homeobox homolog 1 (Six1) is crucial in normal organ development. Recently, Six1 has been reported to display aberrant expression in various cancers and to play important roles in cancer development. However, the regulatory mechanism of Six1 in gastric cancer is largely unknown. In the current study, we found that Six1 was increased in gastric cancer tissues, and its upregulation was significantly associated with lymph node metastasis (p=0.042) and poor differentiation (p=0.039). Next, we took advantage of publicly available microarray data to assess the prognostic value of Six1 with the online K-M Plotter software in gastric cancer, which demonstrated that patients with higher Six1 expression had shorter survival times (p=0.02). To explore the underlying mechanism of Six1, we silenced its upregulation in gastric cells and examined cellular functions. Our results indicated that Six1 knock-down could decrease colony formation number and render cells sensitive to 5-Fluorouracil drug treatment. Flow cytometry analyses showed that Six1 silencing could promote apoptosis but had little effect on cell cycle transition. Following this clue, we tested mitochondrial membrane potential with a JC-1 assay, which suggested that Six1 inhibition could trigger mitochondrial apoptosis. Our subsequent results revealed that Six1 knock-down could reduce the level of the anti-apoptotic protein Bcl-2, and that caspase-7, but not caspase-3, was involved in executing the mitochondrial apoptosis pathway. Taken together, we find that Six1 has an oncogenic role in gastric cancer development, and silencing Six1 expression can promote mitochondrial apoptosis by repressing Bcl-2 and activating the executor caspase-7. These findings suggest that Six1 may become a valuable prognostic and therapeutic target in gastric cancer. PMID:28367243

  16. Monitoring of EPIC 204278916 requested

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-04-01

    Dr. Carlo Manara (ESA Science and Technology SCI-S, the Netherlands) and colleagues have requested AAVSO assistance in monitoring the young, disk-bearing low-mass (M type) pre-main-sequence star EPIC 204278916 (2MASS J16020757-2257467). Dr. Manara reports that this star showed "a very interesting dimming event in August-September 2014 which may be caused by transiting material (exo-comets like) (Scaringi et al., 2016MNRAS.463.2265S, https://ui.adsabs.harvard.edu/#abs/2016MNRAS.tmp.1267S/abstract). It would be very useful to know whether this event has any periodicity in order to constrain the possible scenario...The major dimming [up to 65%] we see is 1.2 mag in V, others are 0.5-0.8 mag" in V. He also notes that "the dimming event we saw lasted for some 25 days, although the most extreme event was 1 day long. Based on the noisy WASP data we have [there are] some suggestions that the event happens every 100 days, but we are not sure about it." Manara requests ongoing monitoring of this system to look for additional dimming events and to observe any that are seen, so that he and his colleagues may determine if periodicity exists in these events and to study its nature. Beginning now and continuing until further notice, nightly observations in V are requested. Weekly observations in B are also requested. If a dimming event occurs, observations in V and B at a higher cadence are requested. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  17. AR Sco observing campaign

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-08-01

    Dr. Thomas Marsh (University of Warwick) and colleagues have requested AAVSO coverage of the intriguing binary AR Sco in support of upcoming XMM-Newton observations scheduled for 2016 September 10 15:41 - September 11 02:26 UT. This fascinating binary system is the subject of an exciting paper in the July 2016 issue of Nature (Marsh et al., 2016Natur.537..374M; pre-print version at arXiv (http://arxiv.org/abs/1607.08265). Marsh writes of their research on AR Sco: "...it was down to [the amateurs [who are co-authors] on the paper that we got onto it in the first place. Coverage immediately before, after and (especially) during [the XMM observations] would be great. The most challenging aspect is the time resolution: ideally one wants a cadence < 29 seconds because of the strong harmonic of the basic 2 minute period, and the faster the better. Observers should use whatever filter (including clear/white light) is needed to allow them to match this constraint. Accurate timing is also essential - the centres of the exposures need to be known to better than ± 2 seconds, and preferably better." A page of materials on AR Sco related to the Nature paper may be found at http://deneb.astro.warwick.ac.uk/phsaap/arsco-info/ . Item #9 on that page is a YouTube video of a fascinating movie Dr. Marsh made of AR Sco from their data (https://www.youtube.com/watch?v=QJGAv2jCF4s&feature=youtu.be). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  18. The Mars Analysis Correction Data Assimilation (MACDA): A reference atmospheric reanalysis

    NASA Astrophysics Data System (ADS)

    Montabone, Luca; Lewis, Stephen R.; Steele, Liam J.; Holmes, James; Read, Peter L.; Valeanu, Alexandru; Smith, Michael D.; Kass, David; Kleinboehl, Armin; LMD Team, MGS/TES Team, MRO/MCS Team

    2016-10-01

    The Mars Analysis Correction Data Assimilation (MACDA) dataset version 1.0 contains the reanalysis of fundamental atmospheric and surface variables for the planet Mars covering a period of about three Martian years (late MY 24 to early MY 27). This four-dimensional dataset has been produced by data assimilation of retrieved thermal profiles and column dust optical depths from NASA's Mars Global Surveyor/Thermal Emission Spectrometer (MGS/TES), which have been assimilated into a Mars global climate model (MGCM) using the Analysis Correction scheme developed at the UK Meteorological Office. The MACDA v1.0 reanalysis is publicly available, and the NetCDF files can be downloaded from the archive at the Centre for Environmental Data Analysis/British Atmospheric Data Centre (CEDA/BADC). The variables included in the dataset can be visualised using an ad-hoc graphical user interface (the "MACDA Plotter") located at http://macdap.physics.ox.ac.uk/ . The first paper about MACDA reanalysis of TES retrievals appeared in 2006, although the acronym MACDA was not yet used at that time. Ten years later, MACDA v1.0 has been used by several researchers worldwide and has contributed to the advancement of knowledge about the Martian atmosphere in critical areas such as the radiative impact of water ice clouds, the solsticial pause in baroclinic wave activity, and the climatology and dynamics of polar vortices, to cite only a few. It is therefore timely to review the scientific results obtained by using this Mars reference atmospheric reanalysis, in order to understand what priorities the user community should focus on in the next decade. MACDA is an ongoing collaborative project, and work funded by the NASA MDAP Programme is currently under way to produce version 2.0 of the Mars atmospheric reanalysis. One of the key improvements is the extension of the reanalysis period to nine Martian years (MY 24 through MY 32), with the assimilation of NASA's Mars Reconnaissance Orbiter/Mars Climate Sounder (MRO/MCS) retrievals of thermal and dust opacity profiles. MACDA 2.0 is also going to be based on an improved version of the underlying MGCM and an updated scheme to fully assimilate (radiatively active) tracers, such as dust.
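
    For users who download the CEDA/BADC NetCDF files, a typical first step is to open them with a NetCDF-aware library and take a zonal mean. The sketch below uses xarray; the file name and the variable name "temp" are assumptions to be checked against the archive's documentation, not guaranteed names from the dataset.

```python
# Hedged sketch: inspect a MACDA v1.0 NetCDF file and plot a zonal-mean
# temperature cross-section. File and variable names are assumptions.
import xarray as xr

ds = xr.open_dataset("macda_v1.0_my25.nc")   # hypothetical file name
print(ds)                                    # list dimensions and variables
tzm = ds["temp"].mean(dim="lon")             # zonal-mean temperature
tzm.isel(time=0).plot()                      # latitude-height section
```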

  19. Nova Sagittarii 2014 = PNV J18250860-2236024 AND Erratum

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2014-02-01

    Details of the discovery of Nova Sagittarii 2014 (PNV J18250860-2236024) and procedures for observing and reporting observations are announced. Discovered by Sigeru Furuyama (Tone-machi, Ibaraki-ken, Japan) and reported by S. Nakano (Sumoto, Japan) at unfiltered CCD magnitude 8.7 on 2014 Jan. 26.857 UT. Coordinates: R.A. = 18 25 08.60, Decl. = -22 36 02.4 (2000.0). Nova Sgr 2014 is an Fe II-type classical nova past maximum, per low-resolution spectra obtained by A. Arai on 2014 Jan. 30.87 UT. Announced in IAU CBAT CBET 3802 (D. W. E. Green, ed.). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and observations. Also, an Erratum is reported: in AAVSO Alert Notice 496, Mati Morel (MMAT, Thornton, NSW, Australia) was credited with the discovery of the 1989 outburst of V745 Sco. The discoverer was William Liller (LIW, Vina del Mar, Chile), who observed V745 Sco on 1989 July 30.08 UT at magnitude 9.7 (PROBLICOM discovery using 2415 film with orange filter).

  20. Bioinformatics analysis of the prognostic value of Tripartite Motif 28 in breast cancer.

    PubMed

    Hao, Ling; Leng, Jun; Xiao, Ruijing; Kingsley, Tembo; Li, Xinran; Tu, Zhenbo; Yang, Xiangyong; Deng, Xinzhou; Xiong, Meng; Xiong, Jie; Zhang, Qiuping

    2017-04-01

    Tripartite motif containing 28 (TRIM28) is a transcriptional regulator acting as an essential corepressor for Krüppel-associated box zinc finger domain-containing proteins in multiple tissue and cell types. An increasing number of studies have investigated the function of TRIM28; however, its prognostic value in breast cancer (BC) remains unclear. In the present study, the expression of TRIM28 was found to be significantly higher in cancerous than in healthy tissue samples. Furthermore, it was demonstrated that TRIM28 expression was significantly correlated with several clinicopathological characteristics of patients with BC, such as p53 mutation, tumor recurrence and Elston grade of the tumor. In addition, a protein-protein interaction network was created to illustrate the interactions of TRIM28 with other proteins. The prognostic value of TRIM28 in patients with BC was investigated using the Kaplan-Meier Plotter database, which revealed that high expression of TRIM28 is a predictor of poor prognosis in patients with BC. In conclusion, the results of the present study indicate that high TRIM28 expression confers a survival disadvantage in patients with BC and that TRIM28 is a novel prognostic biomarker, in addition to being a therapeutic target for the treatment of BC.
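
    As an illustration of how such a protein-protein interaction network can be assembled programmatically, the sketch below queries the STRING REST API for TRIM28 partners and loads the edges into a graph. The endpoint and TSV column layout are assumptions to verify against STRING's current documentation; this is not the pipeline used in the study.

```python
# Hedged sketch: fetch TRIM28 interaction partners from STRING and build
# a graph. Verify the endpoint and column order against STRING's docs.
import networkx as nx
import requests

resp = requests.get("https://string-db.org/api/tsv/network",
                    params={"identifiers": "TRIM28", "species": 9606})
G = nx.Graph()
for line in resp.text.splitlines()[1:]:       # skip the TSV header row
    cols = line.split("\t")
    G.add_edge(cols[2], cols[3])              # assumed: preferredName_A/B columns
print(f"{G.number_of_nodes()} proteins, {G.number_of_edges()} interactions")
```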

  1. CI Aql monitoring needed to support HST observations

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-10-01

    Dr. Edward Sion (Villanova University) has requested AAVSO observers' assistance in monitoring the recurrent nova CI Aql in support of observations with the Hubble Space Telescope Cosmic Origins Spectrograph scheduled for October 31 - November 2, 2016, and November 3 - November 5, 2016. These observations are part of a study on short orbital period recurrent novae as Supernovae Type Ia progenitors. It is essential to know 24 hours prior to the HST COS observations that CI Aql is not in outburst, in order to protect the instrumentation. Observers are asked to keep an eye on CI Aql with nightly snapshot images (V preferred) from now until November 12, and to report their observations promptly. It will be especially important to know the brightness of CI Aql each night for October 28 through November 7 UT. Visual observations are welcome. CI Aql (Nova Aql 1917) has had recurrent outbursts in 1941 and 2000, brightening to V ~8.5. At minimum it is V ~16-16.5 or fainter. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  2. Supernova 2011fe in M101 (NGC 5457) = PSN J14030581+5416254

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2011-08-01

    The discovery is reported of Supernova 2011fe in NGC 5457 = PSN J14030581+5416254 by the Type Ia supernova science working group of the Palomar Transient Factory, Peter Nugent et al., on 2011 Aug. 24 UT at magnitude 17.2 (g-band, calibrated with respect to the USNO catalog). (Credit for an independent discovery by Mathew Marulla and Tavi Grenier was later rescinded by D. Green, Central Bureau for Astronomical Telegrams.) A spectrum obtained on 2011 Aug. 24 UT indicates that SN 2011fe is probably a Type Ia supernova at a very early phase. SN 2011fe was initially announced in ATEL #3581 (Peter Nugent et al.), AAVSO Special Notice #250 (Matthew Templeton), and Central Bureau for Astronomical Telegrams (CBAT) Electronic Telegram 2792 (Daniel W. E. Green, ed.). According to Green, the object was designated PSN J14030581+5416254 when posted on the CBAT Transient Objects Confirmation Page (TOCP) webpage. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details, observations, and links to images.

  3. Nova Sco 2011 No. 2 = PNV J16364440-4132340 = PNV J16364300-4132460

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2011-09-01

    Announcement of the discovery of Nova Sco 2011 No. 2 = PNV J16364440-4132340 = PNV J16364300-4132460. Discovered independently by John Seach (Chatsworth Island, NSW, Australia, on 2011 Sep. 06.37 UT at mag = 9.8 (DSLR)) and by Yuji Nakamura (Kameyama, Mie, Japan, on 2011 Sep. 06.4313 UT at mag = 9.7 C (CCD)). Posted on the IAU Central Bureau for Astronomical Telegrams Transient Object Confirmation Page (TOCP) as PNV J16364440-4132340 (Nakamura) and PNV J16364300-4132460 (Seach); identifications consolidated in VSX under PNV J16364440-4132340. Spectra obtained by A. Arai et al. on 2011 Sep. 7.42 UT suggest a highly reddened Fe II-type classical nova. Spectra obtained by F. Walter and J. Seron on 2011 Sep. 8.091 UT confirm a young galactic nova; they report the spectra are reminiscent of an early recurrent nova. Initially announced in AAVSO Special Notice #251 (Matthew Templeton) and IAU Central Bureau Electronic Telegram 2813 (Daniel W. E. Green, ed.). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and observations.

  4. Computerized Machine for Cutting Space Shuttle Thermal Tiles

    NASA Technical Reports Server (NTRS)

    Ramirez, Luis E.; Reuter, Lisa A.

    2009-01-01

    A report presents the concept of a machine aboard the space shuttle that would cut oversized thermal-tile blanks to precise sizes and shapes needed to replace tiles that were damaged or lost during ascent to orbit. The machine would include a computer-controlled jigsaw enclosed in a clear acrylic shell that would prevent escape of cutting debris. A vacuum motor would collect the debris into a reservoir and would hold a tile blank securely in place. A database stored in the computer would contain the unique shape and dimensions of every tile. Once a broken or missing tile was identified, its identification number would be entered into the computer, wherein the cutting pattern associated with that number would be retrieved from the database. A tile blank would be locked into a crib in the machine, the shell would be closed (proximity sensors would prevent activation of the machine while the shell was open), and a "cut" command would be sent from the computer. A blade would be moved around the crib like a plotter, cutting the tile to the required size and shape. Once the tile was cut, an astronaut would take a space walk for installation.
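
    The report describes the control flow only at a high level; a toy sketch of the lookup-then-cut logic is shown below. Every name, the tile ID, and the waypoint format are invented for illustration, and the printed moves stand in for whatever motion commands the real cutter would take.

```python
# Invented illustration of the described workflow: retrieve a tile's
# cutting pattern by ID and stream blade moves only if the shell is shut.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Tile:
    tile_id: str
    outline: List[Tuple[float, float]]   # cutter waypoints, tile coordinates

PATTERNS = {  # hypothetical database of cutting patterns keyed by tile ID
    "TILE-123": Tile("TILE-123", [(0, 0), (15.2, 0), (15.2, 15.2), (0, 15.2)]),
}

def cut(tile_id: str, shell_closed: bool) -> None:
    if not shell_closed:                 # proximity-sensor interlock
        raise RuntimeError("shell open: cut inhibited")
    for x, y in PATTERNS[tile_id].outline:
        print(f"MOVE {x:.1f} {y:.1f}")   # stand-in for plotter-style blade moves

cut("TILE-123", shell_closed=True)
```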

  5. FLOWCHART; a computer program for plotting flowcharts

    USGS Publications Warehouse

    Bender, Bernice

    1982-01-01

    The computer program FLOWCHART can be used to very quickly and easily produce flowcharts of high quality for publication. FLOWCHART centers each element or block of text that it processes on one of a set of (imaginary) vertical lines. It can enclose a text block in a rectangle, circle or other selected figure. It can draw a line connecting the midpoint of any side of any figure with the midpoint of any side of any other figure and insert an arrow pointing in the direction of flow. It can write 'yes' or 'no' next to the line joining two figures. FLOWCHART creates flowcharts using some basic plotting subroutines* which permit plots to be generated interactively and inspected on a Tektronix-compatible graphics screen or plotted in a deferred mode on a Houston Instruments 42-inch pen plotter. The size of the plot, character set and character height in inches are inputs to the program. Plots generated using the pen plotter can be up to 42 inches high--the larger size plots being directly usable as visual aids in a talk. FLOWCHART centers each block of text on an imaginary column line. (The number of columns and column width are specified as input.) The midpoint of the longest line of text within the block is defined to be the center of the block and is placed on the column line. The spacing of individual words within the block is not altered when the block is positioned. The program writes the first block of text in a designated column and continues placing each subsequent block below the previous block in the same column. A block of text may be placed in a different column by specifying the number of the column and an earlier block of text with which the new block is to be aligned. If block zero is given as the earlier block, the new text is placed in the new column continuing down the page below the previous block. Optionally a column and number of inches from the top of the page may be given for positioning the next block of text. The program will normally draw one of five types of figure to enclose a block of text: a rectangle, circle, diamond, eight-sided figure or figure with parallel sides and rounded ends. It can connect the figure with a line to the preceding figure, and place an arrow pointing toward the second figure. Text blocks not in sequence can also be connected and 'yes' or 'no' written next to any line to indicate branching. Figure 1 illustrates the various types of figures that can be drawn, spacings, connecting lines and the like. * The plotting package employed is Buplot, available on the VAX and PDP-11/70 computers at the USGS Office of Earthquake Studies, Golden, Colo. Calls to the plotting subroutines must be adjusted if some other plotting package is used.
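
    The stated centering rule translates directly into code: shift a whole block so the midpoint of its longest line falls on the column line, leaving the internal spacing of every line untouched. The sketch below assumes a fixed character width, which is an illustrative simplification rather than anything specified by the program.

```python
# Sketch of FLOWCHART's block-centering rule under an assumed fixed
# character width (inches). Lines keep their internal spacing.
def place_block(lines, column_x, char_width=0.1):
    """Return (x_start, text) pairs that center the block on column_x."""
    longest = max(len(s) for s in lines)
    left = column_x - longest * char_width / 2.0   # block's left edge
    return [(left, s) for s in lines]

for x, text in place_block(["READ INPUT", "DATA"], column_x=4.0):
    print(f"{x:5.2f}  {text}")
```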

  6. x-y-recording in transmission electron microscopy. A versatile and inexpensive interface to personal computers with application to stereology.

    PubMed

    Rickmann, M; Siklós, L; Joó, F; Wolff, J R

    1990-09-01

    An interface for IBM XT/AT-compatible computers is described which has been designed to read the actual specimen stage position of electron microscopes. The complete system consists of (i) optical incremental encoders attached to the x- and y-stage drivers of the microscope, (ii) two keypads for operator input, (iii) an interface card fitted to the bus of the personal computer, and (iv) a standard-configuration IBM XT (or compatible) personal computer optionally equipped with (v) an HP Graphic Language-controllable colour plotter. The small size of the encoders and their connection to the stage drivers by simple ribbed belts allows easy adaptation of the system to most electron microscopes. Operation of the interface card itself is supported by any high-level language available for personal computers. Owing to the modular concept of these languages, the system can be customized to various applications, and no computer expertise is needed for actual operation. The present configuration offers an inexpensive attachment which covers a wide range of applications, from a simple notebook to high-resolution (200-nm) mapping of tissue. Since section coordinates can be processed in real time, stereological estimates can be derived directly "on microscope". This is exemplified by an application in which particle numbers were determined by the disector method.
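
    At its core, such an interface scales incremental-encoder counts to stage coordinates. The sketch below shows that conversion in Python rather than the paper's high-level-language setup; the counts-per-micrometre figure is an invented placeholder, not a value from the paper.

```python
# Hedged sketch: convert raw x/y encoder counts to stage position.
COUNTS_PER_UM = 5.0   # hypothetical encoder resolution (counts per micrometre)

def stage_position(x_counts: int, y_counts: int) -> tuple:
    """Return the stage coordinates in micrometres for raw encoder counts."""
    return x_counts / COUNTS_PER_UM, y_counts / COUNTS_PER_UM

x_um, y_um = stage_position(12500, -3400)
print(f"stage at ({x_um:.1f} um, {y_um:.1f} um)")
```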

  7. Evaluation of the TRPM2 channel as a biomarker in breast cancer using public databases analysis.

    PubMed

    Sumoza-Toledo, Adriana; Espinoza-Gabriel, Mario Iván; Montiel-Condado, Dvorak

    Breast cancer is one of the most common malignancies affecting women. Recent investigations have revealed a major role of ion channels in cancer. The transient receptor potential melastatin-2 (TRPM2) is a plasma membrane and lysosomal channel with important roles in cell migration and cell death in immune cells and tumor cells. In this study, we investigated the prognostic value of the TRPM2 channel in breast cancer, analyzing public databases compiled in Oncomine™ (Thermo Fisher, Ann Arbor, MI) and the online Kaplan-Meier Plotter platform. The results revealed that TRPM2 mRNA overexpression is significant in in situ and invasive breast carcinoma compared to normal breast tissue. Furthermore, multi-gene validation using Oncomine™ showed that this channel is coexpressed with proteins related to cellular migration, transformation, and apoptosis. On the other hand, Kaplan-Meier analysis showed that low expression of TRPM2 could be used to predict poor outcome in ER- and HER2+ breast carcinoma patients. TRPM2 is a promising biomarker for the aggressiveness of breast cancer, and a potential target for the development of new therapies. Copyright © 2016 Hospital Infantil de México Federico Gómez. Published by Masson Doyma México S.A. All rights reserved.

  8. LD2SNPing: linkage disequilibrium plotter and RFLP enzyme mining for tag SNPs

    PubMed Central

    Chang, Hsueh-Wei; Chuang, Li-Yeh; Chang, Yan-Jhu; Cheng, Yu-Huei; Hung, Yu-Chen; Chen, Hsiang-Chi; Yang, Cheng-Hong

    2009-01-01

    Background: Linkage disequilibrium (LD) mapping is commonly used to evaluate markers for genome-wide association studies. Most types of LD software focus strictly on LD analysis and visualization, but lack supporting services for genotyping. Results: We developed a freeware called LD2SNPing, which provides a complete package of mining tools for genotyping and LD analysis environments. The software provides SNP ID- and gene-centric online retrievals for SNP information and tag SNP selection from dbSNP/NCBI and HapMap, respectively. Restriction fragment length polymorphism (RFLP) enzyme information for SNP genotyping is available for all SNP IDs and tag SNPs. Single and multiple SNP inputs are possible in order to perform LD analysis by online retrieval from HapMap and NCBI. An LD statistics section provides D, D', r2, δQ, ρ, and the P values of the Hardy-Weinberg Equilibrium for each SNP marker, and Chi-square and likelihood-ratio tests for the pair-wise association of two SNPs in LD calculation. Finally, 2D and 3D plots, as well as plain-text output of the results, can be selected. Conclusion: LD2SNPing thus provides a novel visualization environment for multiple SNP input, which facilitates SNP association studies. The software, user manual, and tutorial are freely available at . PMID:19500380
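
    For readers unfamiliar with the statistics listed above, the sketch below computes the basic pairwise measures D, D' (signed) and r2 from a haplotype frequency and two allele frequencies; the input numbers are illustrative, and the formulas are the standard textbook definitions rather than LD2SNPing's own code.

```python
# Standard pairwise LD measures for two biallelic SNPs; toy frequencies.
def ld_stats(p_ab, p_a, p_b):
    """Return D, signed D', and r^2 given P(AB), P(A), P(B)."""
    d = p_ab - p_a * p_b
    if d >= 0:
        d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
    d_prime = d / d_max if d_max else 0.0
    r2 = d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))
    return d, d_prime, r2

print(ld_stats(p_ab=0.30, p_a=0.40, p_b=0.50))  # -> (0.10, 0.5, ~0.167)
```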

  9. Fiberoptic Applications in Sensors and Telemetry for the Electric Power Industry

    NASA Astrophysics Data System (ADS)

    Werneck, M. M.; Silva, A. V.; Souza, N. C. C.; Miguel, M. A. L.; Beres, C.; Yugue, E. S.; Carvalho, C. C.; Maciel, F. L.; Silva-Neto, J.; Guimarães, C. R. F.; Allil, R. C. S. B.; Baliosian, J. A. G.

    2008-10-01

    This presentation features the origin and the work of the Photonics and Instrumentation Laboratory (LIF) in instrumentation, fiberoptic sensors and POF technology. LIF started its work in 1986, twenty-two years ago, with only one lecturer and a few students. The first project was the development of the first Brazilian plotter, with the purpose, at the time, of substituting for expensive imported technology. LIF today has 25 people, including students, technicians, scientists, engineers and teachers. We present here several successful projects of fiberoptic sensors using both silica and POF fibers, most of them deployed in the field, mainly for the electric power industry. Described are: an oil leakage sensor for petroleum hoses, PMMA evanescent-field sensors, temperature measurement via the ruby fluorescence phenomenon, a current sensor calibrator for 500 kV current transformers, a leakage sensor to monitor 500 kV insulators on extra-high-voltage transmission lines, etc. Many of the sensors presented here have been tested in the field, patented and transferred to industry. We also have technical collaborations with several industries in Brazil, one of them a spin-off from LIF. Our objective is to become a reference centre in POF technology in Latin America, and to this end we intend to keep producing off-the-shelf POF technology and innovative industry solutions for many areas.

  10. Supernova 2011at = PSN J09285756-1448206 in MCG -02-24-27

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2011-03-01

    Announces the discovery of SN 2011at = PSN J09285756-1448206 in MCG -02-24-27 by Lou Cox, Jack Newton, and Tim Puckett (Ellijay, GA, in the course of the Puckett Observatory Supernova Search) on 2011 March 10.214 UT at unfiltered CCD magnitude 14.5. Spectra obtained March 11.81 UT with the Swift satellite (+UVOT) by F. Bufano (Istituto Nazionale di Astrofisica (INAF), Osservatorio Astronomico di Catania), S. Benetti (INAF, Osservatorio Astronomico di Padova), and A. Pastorello (Queen's University, Belfast, et al.); and on March 12 UT with the F. L. Whipple Observatory 1.5-m telescope (+FAST) by M. Calkins (reported by G. H. Marion, Harvard-Smithsonian Center for Astrophysics (CfA), on behalf of the CfA Supernova Group) show SN 2011at to be a type-Ia supernova a few days before/around maximum. The object was designated PSN J09285756-1448206 when posted on the Central Bureau's Transient Objects Confirmation Page (TOCP) webpage. Initially announced in CBET 2676 (Daniel W. E. Green, ed.). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  11. Purification of microalgae from bacterial contamination using a disposable inertia-based microfluidic device

    NASA Astrophysics Data System (ADS)

    Godino, Neus; Jorde, Felix; Lawlor, Daryl; Jaeger, Magnus; Duschl, Claus

    2015-08-01

    Microalgae are a promising source of bioactive ingredients for the food, pharmaceutical and cosmetic industries. Every microalgae research group or production facility faces one major problem: the potential contamination of the algal culture with bacteria. Prior to the storage of the microalgae in strain collections or to cultivation in bioreactors, it is necessary to carry out laborious purification procedures to separate the microalgae from the undesired bacterial cells. In this work, we present a disposable microfluidic cartridge for the high-throughput purification of microalgae samples based on inertial microfluidics. Some of the most relevant microalgae strains are larger than the relatively small, few-micron bacterial cells, making them distinguishable by size. The inertial microfluidic cartridge was fabricated with inexpensive materials, like pressure-sensitive adhesive (PSA) and thin plastic layers, which were patterned using a simple cutting plotter. In spite of fabrication restrictions and the intrinsic difficulties of biological samples, the separation of microalgae from bacteria reached values in excess of 99%, previously only achieved using conventional high-end and high-cost lithography methods. Moreover, due to the simple and high-throughput character of the separation, it is possible to concatenate serial purification steps to exponentially decrease the absolute amount of bacteria in the final purified sample.

  12. A model for simulation of flow in singular and interconnected channels

    USGS Publications Warehouse

    Schaffranek, Raymond W.; Baltzer, R.A.; Goldberg, D.E.

    1981-01-01

    A one-dimensional numerical model is presented for simulating the unsteady flow in singular riverine or estuarine reaches and in networks of reaches composed of interconnected channels. The model is both general and flexible in that it can be used to simulate a wide range of flow conditions for various channel configurations. The channel geometry of the network to be modeled should be sufficiently simple so as to lend itself to characterization in one spatial dimension. The flow must be substantially homogeneous in density, and hydrostatic pressure must prevail everywhere in the network channels. The slope of each channel bottom ought to be mild and reasonably constant over its length so that the flow remains subcritical. The model accommodates tributary inflows and diversions and includes the effects of wind shear on the water surface as a forcing function in the flow equations. Water-surface elevations and flow discharges are computed at channel junctions, as well as at specified intermediate locations within the network channels. The one-dimensional branch-network flow model uses a four-point, implicit, finite-difference approximation of the unsteady-flow equations. The flow equations are linearized over a time step, and branch transformations are formulated that describe the relationship between the unknowns at the end points of the channels. The resultant matrix of branch-transformation equations and required boundary-condition equations is solved by Gaussian elimination using a maximum-pivot strategy. Five example applications of the flow model are illustrated. The applications cover such diverse conditions as a singular upland river reach in which unsteady flow results from hydropower regulations, coastal rivers composed of sequentially connected reaches subject to unsteady tide-driven flow, and a multiply connected network of channels whose flow is principally governed by wind tides and seiches in adjoining lakes. The report includes a listing of the FORTRAN IV computer program and a description of the input data requirements. Model supporting programs for the processing and input of initial and boundary-value data are identified, various model output formats are illustrated, and instructions are given to permit the production of graphical output using the line printer, electromechanical pen plotters, cathode-ray-tube display units, or microfilm recorders.
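
    The report names the numerical core only briefly; as a point of reference, the sketch below shows a generic dense Gaussian elimination with a maximum-pivot (partial pivoting) strategy of the kind the matrix solution step describes. It is not the USGS FORTRAN code, just a compact illustration of the algorithm.

```python
# Generic Gaussian elimination with column-maximum pivoting; illustration
# of the solver strategy named in the report, not the original code.
import numpy as np

def solve_max_pivot(a, b):
    a, b = a.astype(float).copy(), b.astype(float).copy()
    n = len(b)
    for k in range(n):
        p = k + np.argmax(np.abs(a[k:, k]))           # largest pivot in column k
        a[[k, p]], b[[k, p]] = a[[p, k]], b[[p, k]]   # row swap
        for i in range(k + 1, n):
            m = a[i, k] / a[k, k]
            a[i, k:] -= m * a[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for k in range(n - 1, -1, -1):                    # back substitution
        x[k] = (b[k] - a[k, k + 1:] @ x[k + 1:]) / a[k, k]
    return x

print(solve_max_pivot(np.array([[2.0, 1.0], [1.0, 3.0]]),
                      np.array([3.0, 5.0])))          # -> [0.8 1.4]
```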

  13. Lessons From the Largest Historic Floods Documented by the U.S. Geological Survey

    NASA Astrophysics Data System (ADS)

    Costa, J. E.

    2003-12-01

    A recent controversy over the flood risk downstream from a USGS streamgaging station in southern California that recorded a large debris flow led to the decision to closely examine a sample of the largest floods documented in the US. Twenty-nine floods that define the envelope curve of the largest rainfall-runoff floods were examined in detail, including field visits. These floods have a profound impact on local, regional, and national interpretations of potential peak discharges and flood risk. These 29 floods occurred throughout the US, from the northern Chesapeake Bay in Maryland to Kauai, Hawaii, and over time from 1935 to 1978. Methods used to compute peak discharges were slope-area (21/29), culvert computations (2/29), measurements lost or not available for study (2/29), bridge contraction, culvert flow, and flow over road (1/29), rating curve extension (1/29), current meter measurement (1/29), and rating curve and current meter measurement (1/29). While field methods and tools have improved significantly over the last 70 years (e.g., total stations, GPS, GIS, hydroacoustics, digital plotters, and computer programs like SAC and CAP), the primary methods of hydraulic analysis for indirect measurements of outstanding floods have not changed: today flow is still assumed to be 1-D and gradually varied. Unsteady or multi-dimensional flow models are rarely if ever used to determine peak discharges. Problems identified in this sample of 29 floods include debris flows misidentified as water floods, small drainage areas determined from small-scale maps and mislocated sites, high-water marks set by transient hydraulic phenomena, the possibility of disconnected flow surfaces, scour assumptions in sand channels, poor site selection, incorrect approach angle for road overflow, and missing or lost records. Each published flood magnitude was checked by applying modern computer models with original field data, or by re-calculating computations. Four of the 29 floods in this sample were found to have errors resulting in a change of the peak discharge of more than 10%.

  14. Beta Pic observations requested for BRITE-Constellation

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-01-01

    The AAVSO is part of the BRITE-Constellation Ground Based Observations Team (GBOT), supporting cutting-edge science from the BRITE-Constellation satellites and coordinating with BRITE-Constellation scientist Dr. Konstanze Zwintz (Universitaet Innsbruck) and her team. The delta Scuti star beta Pic (NSV 16683) (3.80-3.86V) is one of the BRITE stars being focused on during this season. Bet Pic is particularly interesting now because a transit of the Hill sphere of the star's planet (the region around a planet in which it dominates the attraction of satellites) is predicted to occur during 2017-2018. Ongoing observations beginning now are valuable to establish a baseline prior to the transit. The AAVSO's webpage on the BRITE target stars was updated in November with information on bet Pic from Dr. Zwintz. AAVSO observers with appropriate equipment and located at a southern enough latitude are encouraged to observe bet Pic. Its brightness makes bet Pic well suited to PEP and DSLR photometry; CCD photometry is also possible. However, great care must be taken by all observers, especially those using CCD, to avoid saturation. As the amplitude of this star is very small, visual observations are very difficult, but they are welcome. Multicolor (BVR) photometry better than 0.01 magnitude and time-series observations with a cadence of a few minutes (less than 10 minutes) are requested beginning now and continuing at least through 2017 and likely through 2018. The precision and cadence required are essential in order for the data to be most useful for studying the transit. Spectroscopists wishing to participate should submit their spectra directly to Dr. Konstanze Zwintz (konstanze.zwintz@uibk.ac.at). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  15. [THE CONDITION AND TENDENCIES OF DEVELOPMENT OF CLINICAL AND SANITARY MICROBIOLOGY IN THE RUSSIAN FEDERATION AND PROBLEM OF IMPORT SUBSTITUTION].

    PubMed

    Dyatlov, I A; Mironov, A Yu; Shepelin, A P; Aleshkin, V A

    2015-08-01

    Import substitution has become one of the strategic tasks of the national economy as a result of the prolongation of economic sanctions against the Russian Federation by the USA, EU countries, Japan and a number of other countries. Import substitution should not be limited to goods: under sanctions, when access to foreign technologies is complicated, Russia needs to substitute national designs for foreign technologies more rapidly. One direction of effective import substitution is the localization of production of laboratory equipment and consumables for clinical and sanitary microbiology on the territory of the Russian Federation and the countries of the Customs Union. In Russia, in the field of diagnostics of dangerous and socially significant infections, all the components for import substitution are available to implement gene diagnostics, immune diagnostics, bio-sensory and biochip approaches, isolation and storage of live microbial cultures, and high-tech diagnostic methods. At the same time, a national diagnostic instrument-making industry for microbiology is practically absent. The few devices of national production consist of more than 50% imported components. Microbiological laboratories therefore have to be equipped with imported open-type devices in order to apply national components. The most promising national designs to be implemented are multiplex polymerase chain reaction test systems and biochips based on national plotters and readers. The modern development of diagnostic equipment and instruments requires supplementing the national collections of bacterial and viral pathogens and working through organizational schemes for supplying the collections with strains. The presented data on the justification of the nomenclature of laboratory equipment and consumables permit the needs of clinical and sanitary microbiology for devices, growth media and consumables of national production to be satisfied in full, and import deliveries to be refused without decreasing the quality of microbiological analysis. This approach will ensure an appropriate response to emerging challenges and new biological dangers and the maintenance of the biosecurity of the Russian Federation at the proper level.

  16. NR TrA (Nova TrA 2008) monitoring in support of XMM observations

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-03-01

    Dr. Fred Walter (Stony Brook University) has requested AAVSO observers' assistance in monitoring NR TrA (Nova TrA 2008) in support of upcoming XMM-Newton observations. The XMM observations will take place 2017 March 13 06:21 through March 14 10:34 UT. Walter writes: "NR TrA (Nova TrA 2008) is a compact eclipsing system with a 5.5 hour period. It was a normal Fe II nova that, upon reaching quiescence, took on the appearance of a super-soft source in the optical high state, which suggests an extremely high mass accretion rate. The optical spectrum is dominated by hot permitted lines of O VI, N V, C IV, and He II. Some nova-like variables have similar spectra, though generally without the hot emission lines. Primary eclipse is broad - nearly 40% of the orbit - and deeper at shorter wavelengths, which suggests the eclipse of a hot accretion disk. Primary eclipse depth is about 1 mag at V. There appears to be a shallow secondary eclipse. The primary aim [of the XMM observations] is to detect and characterize the eclipse at X-ray and UV wavelengths. We will obtain low cadence BVRI/JHK observations with SMARTS/Andicam. We request AAVSO support to obtain continuous photometric time series simultaneous with the XMM observation. Any filters are acceptable, but standard Johnson B, V or Cousins R, I are preferred. Clear filters are acceptable. Time resolution better than 5 minutes and uncertainties (outside of eclipse) <0.02 mag are preferred. The best ephemeris I have is: minimum light at JD 55956.822 + 0.219109E. This is based on data from 2013-2015." Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.
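
    The quoted linear ephemeris lends itself to a quick prediction of eclipse times. The sketch below evaluates it exactly as given (keeping the notice's truncated JD convention); the start date passed in is a hypothetical example, not a value from the notice.

```python
# Predict eclipse minima from the quoted ephemeris:
# minimum light at JD 55956.822 + 0.219109 E (JD convention as in the notice).
import math

T0, P = 55956.822, 0.219109   # epoch and period (days), from the notice

def next_minima(jd_start, n=3):
    """First n predicted minima at or after jd_start (same JD convention)."""
    e = math.ceil((jd_start - T0) / P)
    return [T0 + P * (e + k) for k in range(n)]

for t in next_minima(57825.0):  # hypothetical start date
    print(f"predicted minimum: JD {t:.4f}")
```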

  17. Epsilon Aur monitoring during predicted pulsation phase

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.; Templeton, Matthew R.

    2014-09-01

    Dr. Robert Stencel (University of Denver Astronomy Program) has requested that AAVSO observers monitor epsilon Aurigae from now through the end of the observing season. "Studies of the long-term, out-of-eclipse photometry of this enigmatic binary suggest that intervals of coherent pulsation occur at roughly 1/3 of the 27.1-year orbital period. Kloppenborg et al. noted that 'stable variation patterns develop at 3,200-day intervals', implying that 'the next span of dates when such events might happen are circa JD ~2457000 (2014 December)'. These out-of-eclipse light variations often have amplitudes of ~0.1 magnitude in U, and ~0.05 in V, with characteristic timescales of 60-100 days. The AAVSO light curve data to the present may indicate that this coherent phenomenon has begun, but we encourage renewed efforts by observers...to help deduce whether these events are internal to the F star, or externally driven by tidal interaction with the companion star." Nightly observations or one observation every few days (CCD/PEP/DSLR, VUBR (amplitude too small for visual)) are requested. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. Epsilon Aur was the subject of major international campaigns and the AAVSO's Citizen Sky project as it went through the 2009-2011 eclipse of its 27.1-year cycle. Over 700 observers worldwide submitted over 20,000 multicolor observations to the AAVSO International Database for this project. Much information on eps Aur is available from the AAVSO, including material on the Citizen Sky website (http://www.aavso.org/epsilon-aurigae and http://www.citizensky.org/content/star-our-project). The Journal of the AAVSO, Volume 40, No. 2 (2012) was devoted to discussion of and research results from this event. See full Alert Notice for more details and observations.

  18. Request to monitor the CV SDSS161033 (1605-00) for HST observations AND TU Cas comparison stars

    NASA Astrophysics Data System (ADS)

    Price, Aaron

    2005-06-01

    AAVSO Alert Notice 319 contains two topics. First: Dr. Paula Szkody (University of Washington) has requested AAVSO assistance in monitoring the suspected UGWZ dwarf nova SDSS J161033 [V386 Ser] for upcoming HST observations. This campaign is similar to the one recently run on SDSS J2205 and SDSS J013132 (AAVSO Alert Notice 318). HST mission planners need to be absolutely sure that SDSS J161033 is not in outburst immediately prior to the scheduled observation; AAVSO observations will be crucial to carrying out the HST program. Nightly V observations are requested June 24-July 1 UT. We are making an unusual request in that we are asking for the FITS images themselves to be uploaded to the AAVSO's FTP site. Second: AAVSO Alert Notice 318 did not specify which stars on the TU Cas PEP chart should be used as comparison and check stars. Also, there was an error on the chart regarding the location of the "83" comparison star [the chart that is available online reflects a corrected location]. Please use the "89" and the "74" stars as your comparison and check stars, respectively. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  19. Rapid prototyping of microfluidic systems using a PDMS/polymer tape composite.

    PubMed

    Kim, Jungkyu; Surapaneni, Rajesh; Gale, Bruce K

    2009-05-07

    Rapid prototyping of microfluidic systems using a combination of double-sided tape and PDMS (polydimethylsiloxane) is introduced. PDMS is typically difficult to bond using adhesive tapes due to its hydrophobic nature and low surface energy. For this reason, PDMS is not compatible with the xurography method, which uses a knife plotter and various adhesive-coated polymer tapes. To solve these problems, a PDMS/tape composite was developed and demonstrated in microfluidic applications. The PDMS/tape composite was created by spin-coating a thin layer of PDMS over double-sided tape. The composite was then patterned to create channels using xurography and bonded to a PDMS slab. After removing the backing paper from the tape, a complete microfluidic system could be created by placing the construct onto nearly any substrate, including glass, plastic or metal-coated glass/silicon substrates. The bond strength was shown to be sufficient for the pressures that occur in typical microfluidic channels used for chemical or biological analysis. This method was demonstrated in three applications: standard microfluidic channels and reactors, a microfluidic system with an integrated membrane, and an electrochemical biosensor. The PDMS/tape composite rapid prototyping technique provides a fast and cost-effective fabrication method and can provide easy integration of microfluidic channels with sensors and other components without the need for a cleanroom facility.

  20. Request to monitor 2035-01 AE Aqr for multiwavelength campaign AND Reminder to monitor HT Cas, Z Cha, and OY Car

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2005-08-01

    AAVSO Alert Notice 326 contains two topics. First: Dr. Christopher Mauche (Lawrence Livermore National Laboratory) has requested our assistance in monitoring the novalike intermediate polar AE Aqr in support of multiwavelength (gamma-ray, X-ray, UV, optical, and radio) observations scheduled for August-September 2005. AAVSO observations, particularly CCD ones, are requested to correlate with these multiwavelength observations; visual observations are also encouraged. Second: as announced in Alert Notice 317, Drs. Christopher Mauche, Peter Wheatley, and Koji Mukai have obtained time on XMM-Newton to observe HT Cas, Z Cha, or OY Car in outburst, and they have requested our assistance in monitoring these stars closely so we can inform them promptly when any of them go into outburst. Very prompt notification is essential because of the time required to trigger the satellite and the shortness of the outbursts of the target stars. Please monitor HT Cas, OY Car, and Z Cha closely from now through at least a month after the last observing window closes, and notify Headquarters immediately if any of the target stars goes into outburst. Both visual and CCD observations are encouraged. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  1. Development and Implementation of Efficiency-Improving Analysis Methods for the SAGE III on ISS Thermal Model

    NASA Technical Reports Server (NTRS)

    Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy

    2013-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.

  2. Electromagnetic Surveying in the Mangrove Lakes Region of Everglades National Park

    NASA Astrophysics Data System (ADS)

    Whitman, D.; Price, R.; Frankovich, T.; Fourqurean, J.

    2015-12-01

    The Mangrove Lakes are an interconnected set of shallow (~1 m), brackish lake and creek systems on the southern margin of the Everglades adjacent to Florida Bay. Current efforts associated with the Comprehensive Everglades Restoration Plan (CERP) aim to increase freshwater flow into this region. This study describes preliminary results of geophysical surveys in the lakes conducted to assess changes in groundwater chemistry as part of a larger hydrologic and geochemical study in the Everglades Lakes region. Marine geophysical profiles were conducted in the Alligator Creek (West Lake) and McCormick Creek systems in May 2014. Data included marine electromagnetic (EM) profiles and soundings, water depth measurements, and surface water conductivity and salinity measurements. A GSSI Profiler EMP-400 multi-frequency EM conductivity meter continuously recorded in-phase and quadrature field components at 1, 8, and 15 kHz. The system was deployed in a flat-bottomed plastic kayak towed behind a motorized skiff. Lake water depths were continuously measured with a sounder/chart plotter which was calibrated with periodic sounding rod measurements. At periodic intervals during the survey, the profiling was stopped and surface water conductivity, temperature and salinity were recorded with a portable YSI probe on the tow boat. Over 40,000 discrete 3-frequency EM measurements were collected. The data were inverted to 2-layer models representing the water layer thickness and conductivity and the lake bottom conductivity. At spot locations, models were constrained with water depth soundings and surface water conductivity measurements. At other locations along the profiles, the water depth and conductivity were allowed to be free, but the free models were generally consistent with the constrained models. Multilayer sub-bottom models were also explored but were found to be poorly constrained. In West Lake, sub-bottom conductivities decreased from 400 mS/m in the west to 200 mS/m in the east, indicating a general W-to-E decrease in groundwater salinity. In the McCormick Creek system, sub-bottom conductivities increased from 200 mS/m at the north end of Seven Palm Lake to over 650 mS/m at the southern end of Monroe Lake, indicating a general N-to-S increase in groundwater salinity. Additional profiles are planned for August 2015.

  3. Very rare outburst of the symbiotic variable AG Peg

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2015-06-01

    The symbiotic variable AG Peg is in outburst, the first one observed since its only known outburst, which occurred in 1860-1870. Currently at visual/V magnitude 7.2 (B=7.8), it is an excellent target for visual, PEP, CCD, and DSLR observers and spectroscopists. The current outburst began after 2015 May 27 UT (T. Markham, Leek, Staffordshire, England, from the BAAVSS online database) and was underway by June 13.90 (A. Kosa-Kiss, Salonta, Romania). AG Peg has a very interesting history. Regarding the 1860-1870 outburst, data collected by E. Zinner (Merrill, 1959, S&T, 18, 9, 490) show AG Peg slowly brightening from visual magnitude 9.2 in 1821 to 8.0 in 1855, then at 6.2 in 1860 and brightening to 6.0 in 1870, then in decline at 6.8 by 1903, and continuing to decline slowly (~6.9 in 1907, 8.0 in 1920, 8.3 in 1940). Observations in the AAVSO International Database since July 1941 show that the decline has continued without interruption from an average magnitude of 7.7 to an average magnitude of 8.8-9.0 by mid-January 2015. The AAVSO data since 1941 also show the periodic 0.4-magnitude variations (~825 days) that have been present since the 1920s. Thus, after taking about 10 years to brighten from its minimum magnitude of about 9 to its maximum magnitude of 6.0, and then fading gradually over 140-145 years, AG Peg is now in outburst again. There are no observations of the 1860-1870 outburst that show the outburst's beginning. This time, however, in 2015, the opportunity is here to follow the outburst itself closely and learn just what this system does during outburst. Observations in all bands and visual observations are strongly encouraged. AG Peg is bright enough to be a very good PEP target. For spectroscopists, AG Peg has an extremely complex spectrum that undergoes substantial changes and would make a very interesting target. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. Precise observing instructions and other details are given in the full Alert Notice.

  4. Regional Ocean Products Portal: Transforming Information to Knowledge

    NASA Astrophysics Data System (ADS)

    Howard, M. K.; Kobara, S.; Gayanilo, F. C.; Baum, S. K.; Simoniello, C.; Jochens, A. E.

    2010-12-01

    Scientific visualization of complex fusions of heterogeneous 2, 3, and 4-D data sets is a challenge in most fields of geosciences, and oceanography is no exception. Despite increased computing power, dedicated graphics processing units, and more capable software, 30 years of change in the way the geophysical sciences are conducted continue to challenge our ability to present data in visually meaningful ways. Oceanography, for example, changed from a science in which a sole researcher studied a single phenomenon, e.g., ocean currents, to one in which multidisciplinary collaborative teams study complex coupled systems. In three decades we’ve moved from a time where a map of mean circulation and a coastline rendered on a pen-plotter would suffice, to one in which we require detailed dynamic views of relationships and change. We now need to visualize multiple parameters of relatively sparse observed data combined with computer-generated output on dense numerical model grids. We want parameters within ocean and atmosphere volumes rendered over detailed earth terrains with illumination and infrastructure. We want to “see” the dynamic relations between the oceans, atmosphere, land, biogeochemistry, biota, and ecosystem all at once and in context. As computational power increased, the density of the model grid points increased accordingly. The latest challenge has been due to the internet, the notion of sensor webs, and the near real-time availability of high-bandwidth interoperable standards-based data streams. Not only do we want to see it all, we want to see it now, and we want to see it the way we want, and that may change from moment to moment. Increasingly this involves 4D visualizations combined with a strong element of traditional Geographic Information System type presentation. The Gulf of Mexico Coastal Ocean Observing System Regional Association (GCOOS-RA) is one of 11 regional observing systems that comprise the non-federal part of the U.S. Integrated Ocean Observing System (IOOS). With IOOS guidance, and the cooperation of regional data providers, GCOOS-RA has established a regional interoperable system of systems which has the potential to deliver marine and coastal oceanographic, atmospheric, biogeochemical, and ecosystem-related data in an automated and largely unattended way from sensors to products. GCOOS-RA devotes 10% of its funding to Education and Outreach activities, and we have a number of modeling partners producing terabytes of output. With the interoperable parts of the data delivery system complete, our current challenge has been producing automated workflows that generate useful interactive graphical representations over the web. We have used a variety of commercial and free software packages. Some are net-enabled and can acquire remote datasets. Several are designed for 3D, including ITTVIS IDL, Unidata IDV, and IVS’s Fledermaus. This talk will present a survey of the software packages we’ve used, our successes, and remaining challenges.

  5. Target gene screening and evaluation of prognostic values in non-small cell lung cancers by bioinformatics analysis.

    PubMed

    Piao, Junjie; Sun, Jie; Yang, Yang; Jin, Tiefeng; Chen, Liyan; Lin, Zhenhua

    2018-03-20

    Non-small cell lung cancer (NSCLC) is the leading cause of cancer-related deaths worldwide. This study aims to explore the molecular mechanisms of NSCLC. A microarray dataset was obtained from the Gene Expression Omnibus (GEO) database and analyzed using GEO2R. Functional and pathway enrichment analyses were performed based on the Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) databases. Then, STRING, Cytoscape and MCODE were applied to construct the protein-protein interaction (PPI) network and screen hub genes. Subsequently, overall survival (OS) analysis of the hub genes was performed using the Kaplan-Meier plotter online tool. Moreover, miRecords was applied to predict the targets of the differentially expressed microRNAs (DEMs). A total of 228 differentially expressed genes (DEGs) were identified, and they were mainly enriched in the terms cell adhesion molecules, leukocyte transendothelial migration and ECM-receptor interaction. A PPI network was constructed, and 16 hub genes were identified, including TEK, ANGPT1, MMP9, VWF, CDH5, EDN1, ESAM, CCNE1, CDC45, PRC1, CCNB2, AURKA, MELK, CDC20, TOP2A and PTTG1. Among these genes, the expression of 14 hub genes was associated with the prognosis of NSCLC patients. Additionally, a total of 11 DEMs were also identified. Our results provide some potential underlying biomarkers for NSCLC. Further studies are required to elucidate the pathogenesis of NSCLC. Copyright © 2018 Elsevier B.V. All rights reserved.
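
    The screening step can be illustrated with a small stand-in for GEO2R's differential-expression test (GEO2R itself runs R/limma): per-gene t-tests between groups followed by Benjamini-Hochberg correction. The data below are synthetic, generated only to make the sketch runnable.

```python
# Toy differential-expression screen: t-test per gene plus BH correction.
# Synthetic data; GEO2R's actual analysis uses R/limma on real arrays.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
expr = rng.normal(size=(200, 20))       # 200 genes x 20 arrays
expr[:20, :10] += 2.0                   # plant a signal in 20 genes (tumour side)
tumour, normal = expr[:, :10], expr[:, 10:]

t, p = stats.ttest_ind(tumour, normal, axis=1)
order = np.argsort(p)
bh = p[order] * len(p) / (np.arange(len(p)) + 1)       # raw BH values
bh = np.minimum.accumulate(bh[::-1])[::-1]             # enforce monotonicity
print("genes with FDR < 0.05:", int((bh < 0.05).sum()))
```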

  6. ASASSN-17fp rebrightening event and ongoing monitoring

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-05-01

    ASASSN-17fp, discovered on 2017 April 28 and classified as a helium dwarf nova, was observed to be in outburst again on May 16 after fading 2.5 magnitudes from its original outburst. Dr. Tom Marsh (University of Warwick) and Dr. Elme Breedt (University of Cambridge) requested immediate time-series coverage. Dr. Breedt wrote: "The transient was identified as a helium dwarf nova (also known as an AM CVn star) from a spectrum taken by the PESSTO survey and reported in ATel #10334. Since then, we have been observing the target using the New Technology Telescope on La Silla in Chile. We measured a photometric period of 51 minutes in the first few nights during which the object was bright at g=16.03 (Marsh et al., ATel #10354), and then it faded to about g ~18. However last night [May 16] it brightened back to g ~16 again, apparently starting a second outburst. Time series observations during this bright state would be very valuable to determine whether the 51 min period we saw in earlier data returns, and whether it is the orbital period of the binary or related to the distortion of the accretion disc in outburst (superhumps). If the 51 min signal is the orbital period or close to it, this would be the helium dwarf nova with the longest orbital period known. Multiple successive outbursts are not uncommon in binaries like this..." Observers should continue to monitor ASASSN-17fp with nightly snapshots for two weeks after it fades, in case it rebrightens again. It appears to have faded, according to an observation in the AAVSO International Database by F.-J. Hambsch (HMB, Mol, Belgium), who observed it remotely from Chile on 2017 May 24.2252 UT at magnitude 19.944 CV ± 0.595. Continue nightly snapshots through June 6 at least, and if it brightens again, resume time series. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  7. OPMILL - MICRO COMPUTER PROGRAMMING ENVIRONMENT FOR CNC MILLING MACHINES THREE AXIS EQUATION PLOTTING CAPABILITIES

    NASA Technical Reports Server (NTRS)

    Ray, R. B.

    1994-01-01

    OPMILL is a computer operating system for a Kearney and Trecker milling machine that provides a fast and easy way to program machine part manufacture with an IBM compatible PC. The program gives the machinist an "equation plotter" feature which plots any set of equations that define axis moves (up to three axes simultaneously) and converts those equations to a machine milling program that will move a cutter along a defined path. Other supported functions include: drill with peck, bolt circle, tap, mill arc, quarter circle, circle, circle 2 pass, frame, frame 2 pass, rotary frame, pocket, loop and repeat, and copy blocks. The system includes a tool manager that can handle up to 25 tools and automatically adjusts tool length for each tool. It will display all tool information and stop the milling machine at the appropriate time. Information for the program is entered via a series of menus and compiled to the Kearney and Trecker format. The program can then be loaded into the milling machine, the tool path graphically displayed, and tool change information or the program in Kearney and Trecker format viewed. The program has a complete file handling utility that allows the user to load the program into memory from the hard disk, save the program to the disk with comments, view directories, merge a program on the disk with one in memory, save a portion of a program in memory, and change directories. OPMILL was developed on an IBM PS/2 running DOS 3.3 with 1 MB of RAM. OPMILL was written for an IBM PC or compatible 8088 or 80286 machine connected via an RS-232 port to a Kearney and Trecker Data Mill 700/C Control milling machine. It requires a "D:" drive (fixed-disk or virtual), a browse or text display utility, and an EGA or better display. Users wishing to modify and recompile the source code will also need Turbo BASIC, Turbo C, and Crescent Software's QuickPak for Turbo BASIC. IBM PC and IBM PS/2 are registered trademarks of International Business Machines. Turbo BASIC and Turbo C are trademarks of Borland International.
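
    The "equation plotter" idea, i.e. sampling user-supplied axis equations into a sequence of straight-line cutter moves, can be sketched compactly. The output below is generic pseudo G-code for illustration; OPMILL itself compiles to the Kearney and Trecker format, which is not reproduced here.

```python
# Illustration of the equation-plotter concept: sample x(t), y(t), z(t)
# into linear moves. Generic G-code-like output, not K&T format.
import math

def equation_path(fx, fy, fz, t0, t1, steps):
    """Yield (x, y, z) points sampled along the parametric equations."""
    for i in range(steps + 1):
        t = t0 + (t1 - t0) * i / steps
        yield fx(t), fy(t), fz(t)

# Example: quarter-circle arc of radius 10 at constant depth.
for x, y, z in equation_path(lambda t: 10 * math.cos(t),
                             lambda t: 10 * math.sin(t),
                             lambda t: -0.5,
                             0.0, math.pi / 2, 8):
    print(f"G01 X{x:.3f} Y{y:.3f} Z{z:.3f}")
```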

  8. Reduced expression of IQGAP2 and higher expression of IQGAP3 correlates with poor prognosis in cancers

    PubMed Central

    Kumar, Dinesh; Hassan, Md. Khurshidul; Pattnaik, Niharika; Mohapatra, Nachiketa

    2017-01-01

    The IQGAPs are a family of proteins comprising three members in humans. The expression pattern and role of IQGAP1 have been well established in many cancers, whereas those of IQGAP2 and IQGAP3 have mostly remained unexplored. We used available large datasets to explore the pan-cancer status of these two genes in silico. Here we have analysed their mRNA expression and its correlation with survivability in eight different cancers (lung, breast, gastric, brain, colorectal, prostate, liver and kidney) and their subtypes. The mRNA expression of IQGAP2 and IQGAP3 in individual cancers was analysed in two publicly available databases, Oncomine and TCGA. The prognostic value of these genes in lung, breast and gastric cancer was analysed using the Kaplan-Meier Plotter database, whereas for brain, colorectal, liver, prostate and kidney cancers the SurvExpress database was used. These results were validated by immunohistochemistry in cancer tissues (stomach, prostate, brain, colorectal). Moreover, we performed genomic alteration and promoter methylation analysis of IQGAP2 and IQGAP3 using the cBioPortal and Wanderer web tools, respectively. Most of the cancer types (lung, breast, prostate, brain, gastric, liver, kidney and colorectal) showed increased IQGAP3 mRNA expression. In contrast, IQGAP2 transcript levels were reduced across lung, breast, gastric, liver, kidney and colorectal cancers. IQGAP2 expression correlated positively with survivability, whereas IQGAP3 expression correlated inversely with survivability in most of the cancers. Collectively, enhanced IQGAP3 and reduced IQGAP2 levels were frequently observed in multiple cancers, with the former predicting poor survivability and the latter the opposite. Methylation patterns were significantly altered in most of the cancer types, and we found copy number variations and mutations in specific cancers for IQGAP2 and IQGAP3. Our in vivo (IHC) data fully confirmed the in silico findings. Hence, IQGAP2 and IQGAP3 have potential to be used as prognostic markers or therapeutic targets in specific cancers. PMID:29073199
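    The survival analyses described above follow a common recipe: dichotomize patients by expression of the gene of interest and compare Kaplan-Meier curves between the groups. A minimal sketch with the lifelines package is given below; the arrays are hypothetical stand-ins for expression values, follow-up times, and event flags from a cohort such as TCGA.

        # Sketch: Kaplan-Meier comparison of high- vs. low-expression groups,
        # the kind of analysis run by Kaplan-Meier Plotter or SurvExpress.
        # expr, time (months) and event (1 = death observed) are toy data.
        import numpy as np
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import logrank_test

        expr = np.array([5.2, 1.1, 3.8, 0.9, 4.4, 2.0, 6.1, 1.5])
        time = np.array([60., 12., 45., 8., 50., 20., 70., 15.])
        event = np.array([0, 1, 0, 1, 0, 1, 0, 1])

        high = expr > np.median(expr)  # dichotomize at the median
        kmf = KaplanMeierFitter()
        for mask, label in [(high, "high"), (~high, "low")]:
            kmf.fit(time[mask], event[mask], label=label)
            print(label, "median survival:", kmf.median_survival_time_)

        res = logrank_test(time[high], time[~high], event[high], event[~high])
        print("log-rank p =", res.p_value)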

  9. Reduced expression of IQGAP2 and higher expression of IQGAP3 correlates with poor prognosis in cancers.

    PubMed

    Kumar, Dinesh; Hassan, Md Khurshidul; Pattnaik, Niharika; Mohapatra, Nachiketa; Dixit, Manjusha

    2017-01-01

    The IQGAPs are a family of proteins comprising three members in humans. The expression pattern and role of IQGAP1 have been well established in many cancers, whereas those of IQGAP2 and IQGAP3 have mostly remained unexplored. We used available large datasets to explore the pan-cancer status of these two genes in silico. Here we have analysed their mRNA expression and its correlation with survivability in eight different cancers (lung, breast, gastric, brain, colorectal, prostate, liver and kidney) and their subtypes. The mRNA expression of IQGAP2 and IQGAP3 in individual cancers was analysed in two publicly available databases, Oncomine and TCGA. The prognostic value of these genes in lung, breast and gastric cancer was analysed using the Kaplan-Meier Plotter database, whereas for brain, colorectal, liver, prostate and kidney cancers the SurvExpress database was used. These results were validated by immunohistochemistry in cancer tissues (stomach, prostate, brain, colorectal). Moreover, we performed genomic alteration and promoter methylation analysis of IQGAP2 and IQGAP3 using the cBioPortal and Wanderer web tools, respectively. Most of the cancer types (lung, breast, prostate, brain, gastric, liver, kidney and colorectal) showed increased IQGAP3 mRNA expression. In contrast, IQGAP2 transcript levels were reduced across lung, breast, gastric, liver, kidney and colorectal cancers. IQGAP2 expression correlated positively with survivability, whereas IQGAP3 expression correlated inversely with survivability in most of the cancers. Collectively, enhanced IQGAP3 and reduced IQGAP2 levels were frequently observed in multiple cancers, with the former predicting poor survivability and the latter the opposite. Methylation patterns were significantly altered in most of the cancer types, and we found copy number variations and mutations in specific cancers for IQGAP2 and IQGAP3. Our in vivo (IHC) data fully confirmed the in silico findings. Hence, IQGAP2 and IQGAP3 have potential to be used as prognostic markers or therapeutic targets in specific cancers.

  10. Optical enhanced luminescent measurements and sequential reagent mixing on a centrifugal microfluidic device for multi-analyte point-of-care applications

    NASA Astrophysics Data System (ADS)

    Bartholomeusz, Daniel A.; Davies, Rupert H.; Andrade, Joseph D.

    2006-02-01

    A centrifugal-based microfluidic device was built with lyophilized bioluminescent reagents for measuring multiple metabolites from a sample of less than 15 μL. Microfluidic channels, reaction wells, and valves were cut in adhesive vinyl film using a knife plotter with features down to 30 μm and transferred to metalized polycarbonate compact disks (CDs). The fabrication method was simple enough to test over 100 prototypes within a few months. It also allowed enzymes to be packaged in microchannels without exposure to heat or chemicals. The valves were rendered hydrophobic using liquid phase deposition. Microchannels were patterned using soft lithography to make them hydrophilic. Reagents and calibration standards were deposited and lyophilized in different wells before being covered with another adhesive film. Sample delivery was controlled by a modified CD-ROM drive. The CD was capable of distributing 200 nL sample aliquots to 36 channels, each with a different set of reagents that mixed with the sample before initiating the luminescent reactions. Reflection of light from the metalized layer and the lens configuration allowed 20% of the available light to be collected from each channel. ATP was detected down to 0.1 μM. Creatinine, glucose, and galactose were also measured in the micromolar and millimolar ranges. Other optical-based analytical assays can easily be incorporated into the device design. The minimal sample size needed and the expandability of the device make it easier to simultaneously measure a variety of clinically relevant analytes in point-of-care settings.

  11. Spindle pole body component 25 homolog expressed by ECM stiffening is required for lung cancer cell proliferation.

    PubMed

    Jeong, Jangho; Keum, Seula; Kim, Daehwan; You, Eunae; Ko, Panseon; Lee, Jieun; Kim, Jaegu; Kim, Jung-Woong; Rhee, Sangmyung

    2018-06-12

    Accumulating evidence has shown that matrix stiffening in cancer tissue caused by the deposition of extracellular matrix (ECM) is closely related to severe tumor progression. However, much less is known about which genes are affected by matrix stiffness and how its signaling drives cancer progression. In the current research, we investigated the differential gene expression of a non-small cell lung adenocarcinoma cell line, H1299, cultured on soft (∼0.5 kPa) and stiff (∼40 kPa) matrices, mimicking the mechanical environments of normal and cancerous tissues, respectively. For integrated transcriptome analysis, the genes identified by ECM stiffening were compared with 8248 genes retrieved from The Cancer Genome Atlas Lung Adenocarcinoma (TCGA). In stiff matrix, 29 genes were significantly upregulated, while 75 genes were downregulated. Screening the hazard ratios for these genes using the Kaplan-Meier Plotter identified 8 genes most closely associated with cancer progression under matrix stiffening. Among these genes, spindle pole body component 25 homolog (SPC25) was one of the most upregulated genes in stiff matrix and tumor tissue. Knockdown of SPC25 in H1299 cells using shRNA significantly inhibited cell proliferation, with downregulation of the checkpoint protein Cyclin B1, in stiff matrix, whereas the proliferation rate in soft matrix was not affected by SPC25 silencing. Thus, our findings provide novel key molecules for studying the relationship between extracellular matrix stiffening and cancer progression. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Decreased expression of FOXF2 as new predictor of poor prognosis in stage I non-small cell lung cancer.

    PubMed

    Kong, Peng-Zhou; Li, Guang-Ming; Tian, Yin; Song, Bin; Shi, RuYi

    2016-08-23

    Expression of forkhead box F2 (FOXF2) is relatively limited to the adult lung, but its contribution to non-small cell lung cancer (NSCLC) prognosis is unclear. FOXF2 mRNA levels in NSCLC were lower than those in paired normal lung tissues (P = 0.012). The FOXF2-low patients had shorter survival times than the FOXF2-high patients (P = 0.024), especially in the stage I (P = 0.002), chemotherapy (P = 0.018), and age < 60 (P = 0.002) groups. Lower FOXF2 mRNA levels could independently predict poorer survival for patients with NSCLC (HR = 2.384, 95% CI = 1.241-4.577; P = 0.009), especially in stage I (HR = 4.367, 95% CI = 1.599-11.925; P = 0.004). Two independent datasets confirmed our findings. We examined FOXF2 mRNA levels in 84 primary NSCLC and 8 normal lung tissues using qRT-PCR. Rank-sum tests and chi-square tests were used to assess the differences among groups with various clinicopathological factors. Kaplan-Meier tests were used to compare survival status in patients with different FOXF2 mRNA levels. A Cox proportional hazards regression model was used to evaluate the predictive value of FOXF2 mRNA level in NSCLC patients. Independent validation was performed using an independent dataset (98 samples) and the online survival analysis tool Kaplan-Meier Plotter (1928 samples). Our results demonstrated that decreased FOXF2 expression is an independent predictive factor for poor prognosis of patients with NSCLC, especially in stage I NSCLC.
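    Hazard ratios such as the HR = 2.384 quoted above come from a Cox proportional hazards model. A minimal lifelines sketch follows; the data frame is hypothetical, with foxf2_low coded 1 for the FOXF2-low group so that the fitted exp(coef) is that group's hazard ratio.

        # Sketch: Cox proportional hazards regression of survival on a
        # binary FOXF2-low indicator; the toy data frame is hypothetical.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "time":      [60, 12, 45, 8, 50, 20, 70, 15],  # follow-up, months
            "event":     [0, 1, 1, 1, 0, 1, 0, 0],         # 1 = death observed
            "foxf2_low": [0, 1, 0, 1, 0, 1, 1, 0],         # 1 = low FOXF2 mRNA
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")
        print(cph.summary[["exp(coef)", "p"]])  # exp(coef) = hazard ratio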

  13. Monitoring of V380 Oph requested in support of HST observations

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2012-08-01

    On behalf of a large Hubble Space Telescope consortium of which they are members, Dr. Joseph Patterson (Columbia University, Center for Backyard Astrophysics) and Dr. Arne Henden (AAVSO) requested observations from the amateur astronomer community in support of upcoming HST observations of the novalike VY Scl-type cataclysmic variable V380 Oph. The HST observations will likely take place in September, but nightly visual observations are needed beginning immediately and continuing through at least October 2012. The astronomers plan to observe V380 Oph while it is in its current low state. Observations beginning now are needed to determine the behavior of this system at minimum and to ensure that the system is not in its high state at the time of the HST observations. V380 Oph is very faint in its low state: magnitude 17 to 19 and perhaps even fainter. Nightly snapshot observations, not time series, are requested, using whatever technique - adding frames, lengthening exposures, etc. - is necessary to measure the magnitude. It is not known whether V380 Oph is relatively inactive at minimum or has flares of one to two magnitudes; it is this behavior that is essential to learn in order to safely execute the HST observations. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details. NOTE: This campaign was subsequently cancelled when it was learned V380 Oph was not truly in its low state. See AAVSO Alert Notice 468 for details.

  14. RNA-Binding Protein Dnd1 Promotes Breast Cancer Apoptosis by Stabilizing the Bim mRNA in a miR-221 Binding Site.

    PubMed

    Cheng, Feng; Pan, Ying; Lu, Yi-Min; Zhu, Lei; Chen, Shuzheng

    2017-01-01

    RNA-binding proteins (RBPs) and miRNAs are capable of controlling processes in normal development and cancer; both can determine the fate of RNA transcripts from synthesis to decay. One such RBP, Dead end (Dnd1), is essential for regulating germ-cell viability and suppresses germ-cell tumor development, yet how it exerts its functions in breast cancer has remained unresolved. The level of Dnd1 was measured by qRT-PCR in 21 cancerous tissues paired with neighboring normal tissues. We further annotated TCGA (The Cancer Genome Atlas) mRNA expression profiles and found that the expression of Dnd1 and Bim is positively correlated (p = 0.04). Patients with higher Dnd1 expression had longer overall survival (p = 0.0014) according to the KM Plotter tool. Dnd1 knockdown in MCF-7 cells decreased Bim expression levels and inhibited apoptosis. While knockdown of Dnd1 promoted decay of the Bim mRNA via its 3'UTR, the stability of the Bim 5'UTR was not affected. In addition, mutation of the miR-221-binding site in the Bim 3'UTR abolished the effect of Dnd1 on Bim mRNA. Knockdown of Dnd1 in MCF-7 cells confirmed that Dnd1 antagonizes the inhibitory effect of miR-221 on Bim expression. Overall, our findings indicate that Dnd1 facilitates apoptosis by increasing the expression of Bim through competing with miR-221 for binding in the Bim 3'UTR. This new function of Dnd1 may play a vital role in breast cancer development.

  15. Cataclysmic variables to be monitored for HST observations

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2012-09-01

    Drs. Boris Gaensicke (Warwick University), Joseph Patterson (Columbia University, Center for Backyard Astrophysics), and Arne Henden (AAVSO), on behalf of a consortium of 16 astronomers, requested the help of AAVSO observers in monitoring ~40 cataclysmic variables in support of Hubble Space Telescope observations in the coming months. The HST COS (Cosmic Origins Spectrograph) will be carrying out far-ultraviolet spectroscopy of the ~40 CVs sequentially, with the aim of measuring the temperatures, atmospheric compositions, rotation rates, and eventually masses of their white dwarfs. The primary purpose of the monitoring is to know whether each target is in quiescence immediately prior to the observation window; if it is in outburst it will be too bright for the HST instrumentation. Based on the information supplied by the AAVSO, the HST scheduling team will usually make the decision the evening before the scheduled observing time as to whether to go forward with the HST observations. For CCD observers, simultaneous photometry [shortly before, during, and after the HST observations] would be ideal. A B filter would be best for a light curve, although for magnitude estimates V would be best. Finder charts may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. If the target is seen in outburst, please contact the AAVSO immediately and post a message to the Observations and Campaigns & Observations Reports forum (http://www.aavso.org/forum). This campaign will run the better part of a year or longer. See full Alert Notice for more details and the list of objects.

  16. Politicians, Patriots and Plotters: Unlikely Debates Occasioned by Maximilian Hell's Venus Transit Expedition of 1769

    NASA Astrophysics Data System (ADS)

    Kontler, Laszlo

    2013-05-01

    This paper discusses the cultural and political contexts and reception of the most important by-product of Maximilian Hell's famous Venus transit expedition of 1768-69, the Demonstratio. Idioma Ungarorum et Lapponum idem esse (1770) by Hell's associate Janos Sajnovics. Now considered a landmark in Finno-Ugrian linguistics, the Demonstratio addressed an academic subject that was at that time almost destined to be caught up in an ideological battlefield defined by the shifting relationship between the Habsburg government, the Society of Jesus, and the Hungarian nobility. The "enlightened absolutist" policies of the former aimed at consolidating the Habsburg monarchy as an empire, at the expense of privileged groups, including religious orders as well as the noble estates. In the situation created by the 1773 suppression of the Jesuit order (a signal of declining patronage from the dynasty), the growing preoccupation on the part of ex-Jesuits like Hell and Sajnovics with "things Hungarian" could have been part of an attempt to re-situate themselves on the Central European map of learning. At the same time, the founding document of this interest, the Demonstratio, evoked violent protests from the other target of Habsburg policies, the Hungarian nobility, because its basic assumptions - the kinship of the Hungarian and the Sami (Lappian) language - potentially undermined the noble ideology of social exclusiveness, established on the alleged "Scythian" ancestry of Hungarians. By exploring the complex motives, intentions, reactions and responses of the chief agents in this story, it is possible to highlight the extra-scientific constraints and facilitators for the practice of knowledge in late eighteenth century Central Europe.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burge, S.W.

    This report describes the FORCE2 flow program input, output, and the graphical post-processor. The manual describes the steps for creating the model, executing the programs, and processing the results into graphical form. The FORCE2 post-processor was developed as an interactive program written in FORTRAN-77. It uses the Graphical Kernel System (GKS) graphics standard recently adopted by the International Organization for Standardization (ISO) and the American National Standards Institute (ANSI), and therefore can be used with many terminals. The post-processor was written with Calcomp subroutine calls and is compatible with Tektronix terminals and Calcomp and Nicolet pen plotters. B&W has been developing the FORCE2 code as a general-purpose tool for flow analysis of B&W equipment. The version of FORCE2 described in this manual was developed under the sponsorship of ASEA-Babcock as part of their participation in the joint R&D venture, "Erosion of FBC Heat Transfer Tubes," and is applicable to the analysis of bubbling fluid beds. This manual is the principal documentation for program usage and is segmented into several sections to facilitate use. In Section 2.0 the program is described, including assumptions, capabilities, limitations and uses, program status and location, related programs, and program hardware and software requirements. Section 3.0 is a quick user's reference guide for preparing input, executing FORCE2, and using the post-processor. Section 4.0 is a detailed description of the FORCE2 input. In Section 5.0, FORCE2 output is summarized. Section 6.0 contains a sample application, and Section 7.0 is a detailed reference guide.

  18. V694 Mon (MWC 560) spectroscopy requested

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-05-01

    The observing campaign from 2016 on V694 Mon (MWC 560) (AAVSO Alert Notice 538) has been continued, but with different requirements. Photometry is no longer specifically requested on a regular basis (although ongoing observations that do not interfere with other obligations are welcome). Spectroscopy on a cadence of a week or two is requested to monitor changes in the disk outflow. Investigator Adrian Lucy writes: "Adrian Lucy and Dr. Jeno Sokoloski (Columbia University) have requested spectroscopic monitoring of the broad-absorption-line symbiotic star V694 Mon (MWC 560), as a follow-up to coordinated multi-wavelength observations obtained during its recent outburst (ATel #8653, #8832, #8957, #10281). This system is a perfect place in which to study the relationship between an accretion disk and disk winds/jets, and a high-value target for which even low-resolution spectra can be extraordinarily useful...Optical brightening in MWC 560 tends to predict higher-velocity absorption, but sometimes jumps in absorption velocity also appear during optical quiescence (e.g., Iijima 2001, ASPCS, 242, 187). If such a velocity jump occurs during photometric quiescence, it may prompt radio observations to confirm and test the proposed outflow origin for recently-discovered flat-spectrum radio emission (Lucy et al. ATel #10281)...Furthermore, volunteer spectroscopic monitoring of this system has proved useful in unpredictable ways. For example, 'amateur' spectra obtained by Somogyi Péter in 2015 December demonstrated that the velocity of absorption was very low only a month before an optical outburst peak prompted absorption troughs up to 3000 km/s, which constrains very well the timing of the changes to the outflow to a degree that would not have been otherwise possible. Any resolution can be useful. A wavelength range that can accommodate a blueshift of at least 140 angstroms (6000 km/s) from the rest wavelengths of H-alpha at 6562 angstroms and/or H-beta at 4861 angstroms is ideal, though spectra with a smaller range can still be useful. Photometry could potentially still be useful, but will be supplementary to medium-cadence photometry being collected by the ANS collaboration." Spectroscopy may be uploaded to the ARAS database (http://www.astrosurf.com/aras/Aras_DataBase/DataBase.htm), or sent to Adrian and Jeno directly at . Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Photometry should be submitted to the AAVSO International Database. See full Special Notice for more details.
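    The 140-angstrom figure quoted in the request follows from the non-relativistic Doppler relation; the worked arithmetic, in LaTeX notation, is

        % Blueshift of H-alpha for a 6000 km/s outflow:
        \Delta\lambda = \lambda_0 \frac{v}{c}
          = 6562~\mathrm{\AA} \times \frac{6000~\mathrm{km\,s^{-1}}}{2.998 \times 10^{5}~\mathrm{km\,s^{-1}}}
          \approx 131~\mathrm{\AA}

    rounded up to roughly 140 angstroms to leave margin at the blue edge of the absorption trough.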

  19. Osteogenic Differentiation of Three-Dimensional Bioprinted Constructs Consisting of Human Adipose-Derived Stem Cells In Vitro and In Vivo.

    PubMed

    Wang, Xiao-Fei; Song, Yang; Liu, Yun-Song; Sun, Yu-Chun; Wang, Yu-Guang; Wang, Yong; Lyu, Pei-Jun

    2016-01-01

    Here, we aimed to investigate osteogenic differentiation of human adipose-derived stem cells (hASCs) in three-dimensional (3D) bioprinted tissue constructs in vitro and in vivo. A 3D Bio-plotter dispensing system was used for building the 3D constructs. Cell viability was determined using live/dead cell staining. After 7 and 14 days of culture, real-time quantitative polymerase chain reaction (PCR) was performed to analyze the expression of osteogenesis-related genes (RUNX2, OSX, and OCN). Western blotting for RUNX2 and immunofluorescent staining for OCN and RUNX2 were also performed. At 8 weeks after surgery, osteoids secreted by osteogenically differentiated cells were assessed by hematoxylin-eosin (H&E) staining, Masson trichrome staining, and OCN immunohistochemical staining. Results from live/dead cell staining showed that most of the cells remained alive, with a cell viability of 89%, on day 1 after printing. In vitro osteogenic induction of the 3D constructs showed that the expression levels of RUNX2, OSX, and OCN were significantly increased on days 7 and 14 after printing in cells cultured in osteogenic medium (OM) compared with those in normal proliferation medium (PM). Fluorescence microscopy and western blotting showed that the expression of osteogenesis-related proteins was significantly higher in cells cultured in OM than in cells cultured in PM. In vivo studies demonstrated obvious bone matrix formation in the 3D bioprinted constructs. These results indicated that 3D bioprinted constructs consisting of hASCs had the ability to promote mineralized matrix formation and that hASCs could be used in 3D bioprinted constructs for the repair of large bone tissue defects.

  20. The digital geologic map of Colorado in ARC/INFO format, Part A. Documentation

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

    This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently the digital version is at 1:500,000 scale using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely to add the ability to plot to an HP650c plotter. Two new ARC/INFO plot AMLs along with a lineset and shadeset for the HP650c DesignJet printer have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650c items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format Open-File Report 92-050
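    The report cites the Lambert Conformal Conic parameters of the state base map without listing them. As an illustration only, the snippet below builds an LCC projection with pyproj and projects a geographic coordinate into it; the standard parallels, origin, and datum shown are placeholder assumptions, not the Colorado base map's actual parameters.

        # Illustration: project lon/lat into a Lambert Conformal Conic
        # system of the kind a 1:500,000 state base map would use. The
        # parallels, origin, and datum below are placeholders, NOT the
        # actual parameters of the Colorado base map.
        from pyproj import CRS, Transformer

        lcc = CRS.from_proj4(
            "+proj=lcc +lat_1=38 +lat_2=40 +lat_0=37.5 +lon_0=-105.5 "
            "+x_0=0 +y_0=0 +datum=NAD27 +units=m"
        )
        to_lcc = Transformer.from_crs(CRS.from_epsg(4267), lcc, always_xy=True)

        x, y = to_lcc.transform(-105.5, 39.0)  # longitude, latitude
        print(f"x = {x:.0f} m, y = {y:.0f} m")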

  1. Raman spectroscopy for the detection of explosives and their precursors on clothing in fingerprint concentration: a reliable technique for security and counterterrorism issues

    NASA Astrophysics Data System (ADS)

    Almaviva, S.; Botti, S.; Cantarini, L.; Palucci, A.; Puiu, A.; Schnuerer, F.; Schweikert, W.; Romolo, F. S.

    2013-10-01

    In this work we report the results of RS measurements on some common military explosives and some of the most common explosive precursors deposited on clothing fabrics, both synthetic and natural, such as polyester, leather and denim cotton, at concentrations comparable to those obtained from a single fingerprint. RS spectra were obtained using an integrated portable Raman system equipped with an optical microscope, focusing the light of a solid state GaAlAs laser emitting at 785 nm. A maximum exposure time of 10 s was used, focusing the beam in a 45 μm diameter spot on the sample. The substances were deposited starting from commercial solutions with a Micropipetting Nano-Plotter, ideal for generating high-quality spots by non-contact dispensing of sub-nanoliter volumes of liquids, in order to simulate a homogeneous stain on the fabric surface. Images acquired with a Confocal Laser Scanning Microscope provided further details of the deposition process, showing single particles of micrometric volume trapped in or deposited on the underlying fabric. The spectral features of each substance were clearly identified and discriminated from those of the substrate fabric and from the surrounding fluorescence. Our results show that the application of RS using a microscope-based apparatus can provide interpretable Raman spectra in a fast, in-situ analysis, directly from explosive particles of a few μm3 such as those that could be found in a single fingerprint, despite the contribution of the substrate, leaving the sample completely unaltered for further, more specific laboratory analysis. The same approach can be envisaged for the detection of other illicit substances such as drugs.

  2. Significance of aquaporins’ expression in the prognosis of gastric cancer

    PubMed Central

    Thapa, Saroj; Chetry, Mandika; Huang, Kaiyu; Peng, Yangpei; Wang, Jinsheng; Wang, Jiaoni; Zhou, Yingying; Shen, Yigen; Xue, Yangjing; Ji, Kangting

    2018-01-01

    Gastric carcinoma is one of the most lethal malignancies at present and a leading cause of cancer-related deaths worldwide. Aquaporins (AQPs) are a family of small, integral membrane proteins that have been shown to play a crucial role in cell migration and proliferation in different cancer cells, including gastric cancer. However, the aberrant expression of specific AQPs and its predictive and prognostic significance in gastric cancer remain elusive. In the present study, we comprehensively explored the immunohistochemistry-based map of protein expression profiles in normal tissues, cancers and cell lines from the publicly available Human Protein Atlas (HPA) database. Moreover, to improve our understanding of general gastric biology and to guide the search for novel predictive and prognostic gastric cancer biomarkers, we also queried 'The Kaplan-Meier plotter' (KM plotter) online database to relate specific AQP mRNA levels to overall survival (OS) under different clinicopathological features. We found that the ubiquitous expression of AQP proteins can serve as an effective basis for gastric cancer biomarkers. Furthermore, high AQP3, AQP9, and AQP11 mRNA expression was correlated with better OS in all gastric cancer patients, whereas AQP0, AQP1, AQP4, AQP5, AQP6, AQP8, and AQP10 mRNA expression was associated with poor OS. With regard to clinicopathological features, including Lauren classification, clinical stage, human epidermal growth factor receptor 2 (HER2) status, and different treatment strategies, we could illustrate a significant role of individual AQP mRNA expression in the prognosis of gastric cancer patients. Thus, our results indicate that AQP protein and mRNA expression in gastric cancer patients is an effective predictor of prognosis and a potential guide for therapeutic strategy. PMID:29678898

  3. X-ray nova and LMXB V404 Cyg in rare outburst

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2015-06-01

    V404 Cyg, an X-ray nova and a low mass X-ray binary (LMXB) with a black hole component, is undergoing its first reported X-ray and optical outburst since 1989. Large-scale, rapid variations are being reported in wavelengths from X-ray to radio by professional and amateur astronomers worldwide. Satellite and ground-based observations have been and are continuing to be made by many members of the professional community, including S. D. Barthelmy et al. (GCN Circular 17929, 15 June 2015, Swift BAT initial detection); H. Negoro et al. (ATel #7646, 17 June 2015); E. Kuulkers et al. (ATel #7647, 17 June 2015, Swift observations); K. Gazeas et al. (ATel #7650, 17 June 2015, optical photometry); R. M. Wagner et al. (ATel #7655, 18 June, optical spectroscopy); and K. Mooley et al. (ATel #7658, 18 June, radio observations). T. Munoz-Darias et al. report P Cyg profiles were seen on 18 June 2015 (ATel #7659). They note that P Cyg profiles were also observed during the 1989 outburst (Casares et al. 1991, MNRAS, 250, 712), and that V404 Cyg is so far the only black hole X-ray transient that has shown this phenomenology. Observations in all bands are requested. Filtered observations are preferred. Please use a cadence as high as possible while obtaining a suitable S/N. If spectroscopy is possible with your equipment, it is requested. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. Precise observing instructions and other details are given in the full Alert Notice.

  4. Request to monitor 0103+59 HT Cas, 0809-76 Z Cha, 1004-69 OY Car AND Request to monitor 2147+13 LS Peg AND Request to monitor 1743-12 V378 Ser (Nova Ser 2005)

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2005-06-01

    AAVSO Alert Notice 317 has three topics. First: Drs. Christopher Mauche (Lawrence Livermore National Laboratory), Peter Wheatley (Univ. of Leicester), and Koji Mukai (NASA GSFC) have obtained time on XMM-Newton to observe HT Cas, Z Cha, or OY Car in outburst. AAVSO assistance is requested in monitoring these stars closely so that the investigators can be informed promptly when any of them goes into outburst. Very prompt notification is essential, because the satellite requires 2-4 days to move to the target after the observations are triggered, and the superoutbursts of OY Car and Z Cha last only about 10 days, while the HT Cas outbursts last only a little more than 2 days. Second: Dr. Darren Baskill (Univ. of Leicester) has requested optical observations of LS Peg (currently suspected of being a DQ Her nova-like) to coincide with upcoming observations by XMM-Newton. Observations are requested from now until July 8, with time series 12 hours before and after, and also during, the XMM observation. Use an Ic or V filter (Ic preferred), maximum time precision, S/N = 100. Third: Dr. Alon Retter (Penn State Univ.) has requested AAVSO assistance in observing V378 Ser (Nova Serpentis 2005). Please monitor V378 Ser over the coming weeks as the nova fades and report your observations to the AAVSO. Both visual and CCD observations are encouraged. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  5. The digital geologic map of Colorado in ARC/INFO format, Part B. Common files

    USGS Publications Warehouse

    Green, Gregory N.

    1992-01-01

    This geologic map was prepared as a part of a study of digital methods and techniques as applied to complex geologic maps. The geologic map was digitized from the original scribe sheets used to prepare the published Geologic Map of Colorado (Tweto 1979). Consequently the digital version is at 1:500,000 scale using the Lambert Conformal Conic map projection parameters of the state base map. Stable base contact prints of the scribe sheets were scanned on a Tektronix 4991 digital scanner. The scanner automatically converts the scanned image to an ASCII vector format. These vectors were transferred to a VAX minicomputer, where they were then loaded into ARC/INFO. Each vector and polygon was given attributes derived from the original 1979 geologic map. This database was developed on a MicroVAX computer system using VAX V 5.4 and ARC/INFO 5.0 software. UPDATE: April 1995. The update was done solely to add the ability to plot to an HP650c plotter. Two new ARC/INFO plot AMLs along with a lineset and shadeset for the HP650c DesignJet printer have been included. These new files are COLORADO.650, INDEX.650, TWETOLIN.E00 and TWETOSHD.E00. These files were created on a UNIX platform with ARC/INFO 6.1.2. Updated versions of the INDEX.E00, CONTACT.E00, LINE.E00, DECO.E00 and BORDER.E00 files that include the newly defined HP650c items are also included. * Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government. Descriptors: The Digital Geologic Map of Colorado in ARC/INFO Format Open-File Report 92-050

  6. Osteogenic Differentiation of Three-Dimensional Bioprinted Constructs Consisting of Human Adipose-Derived Stem Cells In Vitro and In Vivo

    PubMed Central

    Liu, Yun-Song; Sun, Yu-chun; Wang, Yu-guang; Wang, Yong; Lyu, Pei-Jun

    2016-01-01

    Here, we aimed to investigate osteogenic differentiation of human adipose-derived stem cells (hASCs) in three-dimensional (3D) bioprinted tissue constructs in vitro and in vivo. A 3D Bio-plotter dispensing system was used for building the 3D constructs. Cell viability was determined using live/dead cell staining. After 7 and 14 days of culture, real-time quantitative polymerase chain reaction (PCR) was performed to analyze the expression of osteogenesis-related genes (RUNX2, OSX, and OCN). Western blotting for RUNX2 and immunofluorescent staining for OCN and RUNX2 were also performed. At 8 weeks after surgery, osteoids secreted by osteogenically differentiated cells were assessed by hematoxylin-eosin (H&E) staining, Masson trichrome staining, and OCN immunohistochemical staining. Results from live/dead cell staining showed that most of the cells remained alive, with a cell viability of 89%, on day 1 after printing. In vitro osteogenic induction of the 3D constructs showed that the expression levels of RUNX2, OSX, and OCN were significantly increased on days 7 and 14 after printing in cells cultured in osteogenic medium (OM) compared with those in normal proliferation medium (PM). Fluorescence microscopy and western blotting showed that the expression of osteogenesis-related proteins was significantly higher in cells cultured in OM than in cells cultured in PM. In vivo studies demonstrated obvious bone matrix formation in the 3D bioprinted constructs. These results indicated that 3D bioprinted constructs consisting of hASCs had the ability to promote mineralized matrix formation and that hASCs could be used in 3D bioprinted constructs for the repair of large bone tissue defects. PMID:27332814

  7. Observations of TT Ari requested in support of MOST observations

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2012-08-01

    Dr. Nikolaus Vogt (Universidad de Valparaiso, Chile) requested simultaneous photometry and spectroscopy of the novalike (VY Scl subtype) cataclysmic variable TT Ari in support of upcoming observations with the Canadian Microvariability and Oscillations of Stars (MOST) satellite 2012 September 13 through October 20. The Departamento de Fisica y Astronomia of Valparaiso University will carry out photometry with small telescopes in central Chile, but the assistance of other observers, particularly at other latitudes and longitudes, is requested. The observations are being carried out to study superhump behavior, which is still not well understood despite the amount of research done on all classes of cataclysmic variables. TT Ari exhibits superhumps - both positive (the superhump period is longer than the orbital period) and negative (the superhump period is shorter than the orbital period). While positive superhumps are thought to result from an eccentric configuration of the accretion disk, the mechanism for negative superhumps is not yet understood, except that it may be related to the disk being warped out of the orbital plane, leading to complex torque phenomena. TT Ari, one of the brightest cataclysmic variables, exhibits occasional fadings of several magnitudes, from its usual high-state (maximum) magnitude of ~10.5V to a low-state magnitude as faint as 16V. These fadings occur every 20-25 years and last between 500 and 1000 days. According to observations in the AAVSO International Database, TT Ari is currently magnitude 10.5V. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details, particularly regarding goals of the campaign and observing instructions.

  8. QUIKVIS- CELESTIAL TARGET AVAILABILITY INFORMATION

    NASA Technical Reports Server (NTRS)

    Petruzzo, C.

    1994-01-01

    QUIKVIS computes the times during an Earth orbit when geometric requirements are satisfied for observing celestial objects. The observed objects may be fixed (stars, etc.) or moving (sun, moon, planets). QUIKVIS is useful for preflight analysis by those needing information on the availability of celestial objects to be observed. Two types of analyses are performed by QUIKVIS. One is used when specific objects are known, the other when targets are unknown and potentially useful regions of the sky must be identified. The results are useful in selecting candidate targets, examining the effects of observation requirements, and doing gross assessments of the effects of the orbit's right ascension of the ascending node (RAAN). The results are not appropriate when high accuracy is needed (e.g. for scheduling actual mission operations). The observation duration is calculated as a function of date, orbit node, and geometric requirements. The orbit right ascension of the ascending node can be varied to account for the effects of an uncertain launch time of day. The orbit semimajor axis and inclination are constant throughout the run. A circular orbit is assumed, but a simple program modification will allow eccentric orbits. The geometric requirements that can be processed are: 1) minimum separation angle between the line of sight to the object and the earth's horizon; 2) minimum separation angle between the line of sight to the object and the spacecraft velocity vector; 3) maximum separation angle between the line of sight to the object and the zenith direction; and 4) presence of the spacecraft in the earth's shadow. The user must supply a date or date range, the spacecraft orbit and inclination, up to 700 observation targets, and any geometric requirements to be met. The primary output is the time per orbit that conditions are satisfied, with options for sky survey maps, time since a user-specified orbit event, and bar graphs illustrating overlapping requirements. The output is printed in visually convenient lineprinter form but is also available on data files for use by postprocessors such as external XY plotters. QUIKVIS is written in FORTRAN 77 for batch or interactive execution and has been implemented on a DEC VAX 11/780 operating under VMS with a central memory requirement of approximately 500K of 8 bit bytes. QUIKVIS was developed in 1986 and revised in 1987.
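    The first geometric requirement, a minimum separation angle between the line of sight and the Earth's horizon, reduces for a fixed (stellar) target to comparing the target's angle from nadir with the Earth's angular radius at the spacecraft. The sketch below illustrates that test for a circular orbit; the orbit altitude, inclination, target coordinates, and threshold are illustrative values, and the geometry is a simplification of what QUIKVIS actually computes.

        # Sketch of the horizon-clearance test: is the line of sight to a
        # fixed celestial target at least `min_clear` above the Earth's limb
        # over a circular orbit? All numbers are illustrative.
        import numpy as np

        RE = 6378.137                # Earth equatorial radius, km
        r = RE + 500.0               # orbit radius for a 500 km altitude
        inc = np.radians(51.6)       # illustrative inclination

        # Unit vector toward the target (illustrative RA/Dec).
        ra, dec = np.radians(83.6), np.radians(22.0)
        u = np.array([np.cos(dec) * np.cos(ra),
                      np.cos(dec) * np.sin(ra),
                      np.sin(dec)])

        rho = np.arcsin(RE / r)      # Earth's angular radius seen from orbit
        min_clear = np.radians(10.0) # required clearance above the limb

        nu = np.linspace(0, 2 * np.pi, 720)  # position angle around the orbit
        pos = r * np.column_stack([np.cos(nu),
                                   np.sin(nu) * np.cos(inc),
                                   np.sin(nu) * np.sin(inc)])
        nadir = -pos / r                                  # unit nadir vectors
        theta = np.arccos(np.clip(nadir @ u, -1.0, 1.0))  # target-nadir angle
        visible = (theta - rho) >= min_clear

        print(f"requirement met on {visible.mean() * 100:.0f}% of the orbit")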

  9. Monitoring of Swift J1357.2-0933 (CRTS J135716.8-093238) requested

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-04-01

    Dr. Gregory Sivakoff (University of Alberta) has requested AAVSO observers' assistance in monitoring the black hole X-ray binary Swift J1357.2-0933 (CRTS J135716.8-093238) during its current outburst. Sivakoff writes: "...[Because it] is at high Galactic latitude...extinction is relatively small and the bright blue nature of the outburst can be observed readily by citizen astronomers as the source fades into quiescence on a timescale of a few months. AAVSO observations will be critical in complementing multiple multi-wavelength campaigns observing this outburst...In addition, this source is known to undergo recurring rapid dips. These dips can last for tens of seconds and recur on timescales of a few minutes. This is a great source for AAVSO observers, particularly CCD observers, to follow." From now until the object is no longer observable with your equipment (quiescence is r/i ~20, V ~22.3), UBV photometry is requested in the form of nightly snapshot observations (once to a few times per night). However, more frequent observations are also welcome. Sivakoff writes: "[The recurring rapid dips]...might be interesting for some observers to go after. These dips can be on the order of a minute long. Observers wishing to probe that should go for as fast observations as the telescope allows for them to get a SNR of 10-20 in their telescope. To capture the longer term evolution, an hourly cadence would be wonderful...U B V places a priority on the blue filters, which are typically harder to get at higher Galactic extinction (lower Galactic latitude). That being said, I would definitely request that some observers get V data to connect with the daily SMARTS campaign (V I J K)...if people can get down to I 17 - 18, then...Ic is good for connecting with the SMARTS campaign and LCOGT work on this source." Precision of the photometry is requested to be at least 0.05 to 0.1 magnitude if possible, with allowance for it to go down to 0.2 mag as the source becomes fainter. A S/N of 10-20, and 5 at a minimum, is requested. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.
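    The requested precision and S/N figures are mutually consistent: to first order, the magnitude uncertainty follows from the signal-to-noise ratio as (in LaTeX notation)

        \sigma_m \approx \frac{2.5}{\ln 10} \cdot \frac{1}{S/N} \approx \frac{1.0857}{S/N}

    so S/N = 10-20 corresponds to sigma_m of about 0.05-0.11 mag, matching the 0.05 to 0.1 magnitude precision requested above.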

  10. Archive of Digitized Analog Boomer Seismic Reflection Data Collected from Lake Pontchartrain, Louisiana, to Mobile Bay, Alabama, During Cruises Onboard the R/V ERDA-1, June and August 1992

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2008-01-01

    In June and August of 1992, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework from Lake Pontchartrain, Louisiana, to Mobile Bay, Alabama. This work was conducted onboard the Argonne National Laboratory's R/V ERDA-1 as part of the Mississippi/Alabama Pollution Project. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). A standardized naming convention was established to allow for better management of scanned trackline images within the MASH data rescue project. Each cruise received a unique field activity ID based on the year the data were collected, the first two digits of the survey vessel name, and the number of cruises made (to date) by that vessel that year (i.e. 92ER2 represents the second cruise made by the R/V ERDA-1 in 1992.) The new field activity IDs 92ER2 and 92ER4 presented in this report were originally referred to as ERDA 92-2 and ERDA 92-4 at the USGS in St. Petersburg, FL, and 92010 and 92037 at the USGS in Woods Hole, MA. A table showing the naming convention lineage for cruise IDs in the MASH data rescue series is included as a PDF. This report serves as an archive of high resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata for cruises 92ER2 and 92ER4. The boomer system uses an acoustic energy source called a plate, which consists of capacitors charged to a high voltage and discharged through a transducer in the water. The source is towed on a sled, at sea level, and when discharged emits a short acoustic pulse, or shot, which propagates through the water and sediment column. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the hydrophone receiver, and the amplitude of the reflected energy is recorded by an Edward P. Curley Lab (EPC) thermal plotter. This process is repeated at timed intervals (for example, 0.5 s) and recorded for specific intervals of time (for example, 100 ms). The timed intervals are also referred to as the shot interval or fire rate. On analog records, the recorded interval is referred to as the sweep, which is the amount of time the recorder stylus takes to sweep from the top of the record to the bottom of the record, thereby recording the amplitude of the reflected energy of one shot. In this way, consecutive recorded shots produce a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track. Many of the geophysical data collected by the USGS prior to the late 1990s were recorded in analog format and stored as paper copies. Scientists onboard made hand-written annotations onto these records to note latitude and longitude, time, line number, course heading, and geographic points of reference. Each paper roll typically contained numerous survey lines and could reach more than 90 ft in length. All rolls are stored at the USGS FISC-St. Petersburg, FL. To preserve the integrity of these records and improve accessibility, analog holdings were converted to digital files.
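    The sweep maps directly to imaging depth through the two-way travel time. For the 100 ms example above, and assuming a nominal sound speed of about 1500 m/s in water and shallow sediment, the worked arithmetic (in LaTeX notation) is

        d = \frac{v\,t}{2} = \frac{1500~\mathrm{m\,s^{-1}} \times 0.100~\mathrm{s}}{2} = 75~\mathrm{m}

    so each recorded sweep images roughly the uppermost 75 m below the source.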

  11. Outburst of the recurrent nova V745 Sco

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2014-02-01

    The detection by Rod Stubbings (Tetoora Road, VIC, Australia) of an outburst of the recurrent nova V745 Sco (Nova Sco 1937) at visual magnitude 9.0 on 2014 February 6.694 UT is reported. This recurrent nova is fading quickly. Follow-up observations of all types (visual, CCD, DSLR) are strongly encouraged, as is spectroscopy; fast time series of this nova may be useful to detect possible flaring activity such as was observed during the outburst of U Scorpii in 2010. Coincident time series by multiple observers would be most useful for such a study, with a V filter being preferred. Observations reported to the AAVSO International Database show V745 Sco at visual mag. 10.2 on 2014 Feb. 07.85833 UT (A. Pearce, Nedlands, W. Australia). Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. Previous outbursts occurred in 1937 and 1989. The 1937 outburst was detected in 1958 (in decline at magnitude 11.0 on 1937 May 11.1 UT; the outburst had occurred within the previous 19 days) by Lukas Plaut on plates taken by Hendrik van Gent at the Leiden Observatory; the object was announced as Nova Sco 1937 and later assigned the GCVS name V745 Sco. The 1989 outburst was detected on 1989 August 1.55 UT by Mati Morel (MMAT, Thornton, NSW, Australia) at visual magnitude 10.4 and in decline. Dr. Bradley Schaefer (Louisiana State University) reports (2010ApJS..187..275S) in his comprehensive analysis of the 10 known galactic recurrent novae (including V745 Sco) that the median interval between outbursts is 24 years. The interval since the 1989 outburst of V745 Sco is 24.10 years. See the Alert Notice for additional visual and multicolor photometry and for more details.

  12. Low-cost and facile fabrication of a paper-based capillary electrophoresis microdevice for pathogen detection.

    PubMed

    Lee, Jee Won; Lee, Dohwan; Kim, Yong Tae; Lee, Eun Yeol; Kim, Do Hyun; Seo, Tae Seok

    2017-05-15

    This paper describes the development of a novel paper-based capillary electrophoresis (pCE) microdevice using mineral paper, which is durable, oil and tear resistant, and waterproof. The pCE device is inexpensive (~$1.6 per device for materials), simple to fabricate, lightweight, and disposable, so it is better suited for point-of-care (POC) pathogen diagnostics than a conventional CE device made of glass, quartz, silicon or polymer. In addition, the entire fabrication process can be completed within 1 h without using expensive clean room facilities and cumbersome photolithography procedures. A simple cross-designed pCE device was patterned on the mineral paper by using a plotter, and assembled with an OHP film via a double-sided adhesive film. After filling the microchannel with polyacrylamide gel, the injection, backbiasing, and separation steps were sequentially operated to differentiate single-stranded DNA (ssDNA) with 4 bp resolution in a 2.9 cm long CE separation channel. Furthermore, we successfully demonstrated the identification of the PCR amplicons of two target genes of Escherichia coli O157:H7 (rrsH gene, 121 bp) and Staphylococcus aureus (glnA gene, 225 bp). For accurate assignment of the peaks in the electropherogram, two bracket ladders (80 bp for the shortest and 326 bp for the longest) were employed, so the two amplicons of the pathogens were precisely identified on a pCE chip within 3 min using the relative migration time ratio, without effect of the CE environments. Thus, we believe that the pCE microdevice could be very useful for the separation of nucleic acids, amino acids, and ions as an analytical tool for use in medical applications in resource-limited environments as well as in fundamental research fields. Copyright © 2016 Elsevier B.V. All rights reserved.
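    The bracket-ladder sizing step can be pictured as interpolation between the 80 bp and 326 bp markers. The helper below is a hypothetical illustration of that idea, not the authors' published calibration; in particular, the log-linear size/migration relation and the migration times are invented assumptions.

        # Hypothetical sketch of bracket-ladder sizing: normalize an
        # analyte's migration time between the 80 bp and 326 bp ladder
        # peaks, then interpolate a size. The log-linear relation is an
        # assumption; the paper's actual calibration may differ.
        import math

        def size_from_migration(t_analyte, t_80, t_326):
            ratio = (t_analyte - t_80) / (t_326 - t_80)  # 0 at 80 bp, 1 at 326 bp
            log_bp = math.log(80) + ratio * (math.log(326) - math.log(80))
            return math.exp(log_bp)

        # Invented times (s) for the ladders and a candidate rrsH amplicon.
        print(f"estimated size: {size_from_migration(102.0, 90.0, 130.0):.0f} bp")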

  13. A novel highly flexible, simple, rapid and low-cost fabrication tool for paper-based microfluidic devices (μPADs) using technical drawing pens and in-house formulated aqueous inks.

    PubMed

    Nuchtavorn, Nantana; Macka, Mirek

    2016-05-05

    Paper-based microfluidic devices (μPADs) are capable of achieving rapid quantitative measurements of a variety of analytes inexpensively. μPADs rely on patterning hydrophilic-hydrophobic regions on a sheet of paper in order to create capillary channels within impermeable fluidic brakes on the paper. Here, we present a novel, highly flexible and low-cost fabrication method using a desktop digital craft plotter/cutter and technical drawing pens with a tip size of 0.5 mm. The pens were used with either commercial black permanent ink for drawing fluidic brakes, or with specialty in-house formulated aqueous inks. With the permanent marker ink it was possible to create barriers on paper rapidly and in a variety of designs in a highly flexible manner. For instance, a design featuring eight reservoirs can be produced within 10 s for each μPAD with a consistent line width of the brakes (%RSD < 1.5). Further, we investigated the optimal viscosity range of the in-house formulated inks, controlled with additions of poly(ethylene glycol). The viscosity was measured by capillary electrophoresis, and the optimal viscosity was in the range of ∼3-6 mPa s. A functional test of these μPADs was conducted by screening antioxidant activity: colorimetric measurements of flavonoids, phenolic compounds and DPPH free radical scavenging activity were carried out on the μPADs. The results can be detected by the naked eye and simply quantified using a camera phone and image analysis software. The fabrication method using technical drawing pens provides flexibility in the use of in-house formulated inks, short fabrication time, simplicity and low cost. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  14. Three-dimensional force measurements with mandibular overdentures connected to implants by ball-shaped retentive anchors. A clinical study.

    PubMed

    Mericske-Stern, R

    1998-01-01

    The purpose of this in vivo study was to determine maximum and functional forces simultaneously in three dimensions on mandibular implants supporting overdentures. The anchorage system for overdenture connection was the ball-shaped retentive anchor. Five edentulous patients, each with two mandibular ITI implants, were selected as test subjects. A novel miniaturized piezo-electric force transducer was developed for specific use with ITI implants. Force magnitudes and directions were registered under various test conditions by means of electrostatic plotter records. The test modalities were maximum biting in centric occlusion, maximum biting on a bite plate, grinding, and chewing bread. Maximum forces measured in centric occlusion and on the ipsilateral implant when using a bite plate were slightly increased in vertical and backward-forward dimension (z-, y-axis) compared to the lateral-medial direction (x-axis). On the contralateral implant, equally low values were found in all three dimensions. This may be the effect of a nonsplinted anchorage device. With the use of a bite plate, force magnitudes on the ipsilateral implant were significantly higher on the z- and y-axis than mean maximum forces in centric occlusion (P < .001). Chewing and grinding resulted in lower forces compared to maximum biting, particularly in the vertical direction. The transverse force component in backward-forward direction, however, reached magnitudes that exceeded the vertical component by 100% to 300% during chewing function. This chewing pattern had not been observed in previous investigations with bars and telescopes, and therefore appears to be specific for retentive ball anchors. The prevalent or exclusive force direction registered on both implants in the vertical direction was downward under all test conditions. In the transverse direction during maximum biting the forward direction was more frequently registered, while no obvious prevalence of transverse force direction was observed during chewing and grinding.

  15. KIC 8462852 optical dipping event

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2017-05-01

    T. Boyajian (Louisiana State University) et al. reported in ATel #10405 that an optical dip is underway in KIC 8462852 (Boyajian's Star, Tabby's Star) beginning on 2017 May 18 UT. Tentative signs of small dips had been seen beginning April 24, and enhanced monitoring had begun at once at Fairborn Observatory (Tennessee State University). Photometry and spectroscopy from there on May 18 and 19 UT showed a dip underway. Cousins V photometry showed a drop of 0.02 magnitude, the largest dip (and the first clear one) seen in more than a year of monitoring. AAVSO observer Bruce Gary (GBL, Hereford, AZ) carried out V photometry which showed a fading from 11.906 V ± 0.004 to 11.9244 V ± 0.0033 between 2017 May 14 and May 19 UT, a drop of 1.7%. Swift/UVOT observations obtained May 18 15:19 UT did not show a statistically significant drop in v, but Gary's photometry is given more weight. r'-band observations from Las Cumbres Observatory obtained 2017 May 17 to May 19 showed a 2% dip. Spectra by I. Steele (Liverpool JMU) et al. taken on 2017 May 20 with the 2.0 meter Liverpool Telescope, La Palma, showed no differences in the source compared to a reference spectrum taken 2016 July 4 when the system was not undergoing a dip (ATel #10406). Dips typically last for a few days, and larger dips can last over a week. It is not clear that this dip is over. Precision time-series V photometry is urgently requested from AAVSO observers, although all photometry is welcome. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). See full Alert Notice for more details. KIC 8462852 was the subject of AAVSO Alert Notices 532 and 542. See also Boyajian et al. 2016, also available as a preprint (http://arxiv.org/abs/1509.03622). General information about KIC 8462852 may be found at http://www.wherestheflux.com/.
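    The quoted percentage dips follow from the logarithmic magnitude scale; for Gary's measured fading, the worked conversion (in LaTeX notation) is

        \frac{F}{F_0} = 10^{-0.4\,\Delta m}, \qquad
        \Delta m = 11.9244 - 11.906 = 0.0184 \;\Rightarrow\; \frac{F}{F_0} \approx 0.983

    i.e. a drop of about 1.7% in flux, while the 0.02 mag Fairborn dip likewise corresponds to about 1.8%.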

  16. Observations of the eclipsing binary b Persei

    NASA Astrophysics Data System (ADS)

    Templeton, Matthew R.

    2015-01-01

    Dr. Robert Zavala (USNO-Flagstaff) et al. request V time-series observations of the bright variable star b Persei 7-21 January 2015 UT, in hopes of catching a predicted eclipse on January 15. This is a follow-up to the February 2013 campaign announced in Alert Notice 476, and will be used as a photometric comparison for upcoming interferometric observations with the Navy Precision Optical Interferometer (NPOI) in Arizona. b Per (V=4.598, B-V=0.054) is ideal for photoelectric photometers or DSLR cameras. Telescopic CCD observers may observe by stopping down larger apertures. Comparison and check stars assigned by the PI: Comp: SAO 24412, V=4.285, B-V = -0.013; Check: SAO 24512, V=5.19, B-V = -0.05. From the PI: "[W]e wanted to try and involve AAVSO observers in a follow-up to our successful detection of the b Persei eclipse of Feb 2013, AAVSO Alert Notice 476 and Special Notice 333. Our goal now is to get good time resolution photometry as the third star passes in front of the close ellipsoidal binary. The potential for multiple eclipses exists. The close binary has a 1.5 day orbital period, and the eclipsing C component requires about 4 days to pass across the close binary pair. The primary eclipse depth is 0.15 magnitude. Photometry to 0.02 or 0.03 mag would be fine to detect this eclipse. Eclipse prediction date (JD 2457033.79 = 2015 01 11 UT, ~+/- 1 day) is based on one orbital period from the 2013 eclipse." More information is available at the PI's b Persei eclipse web page: http://inside.warren-wilson.edu/~dcollins/bPersei/. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and information on the targets.
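
    The notice's epoch conversion (JD 2457033.79 = 2015 January 11 UT) is easy to verify. The only fact used beyond the notice is that JD 2440587.5 corresponds to the Unix epoch, 1970-01-01T00:00 UTC; a sketch:

      from datetime import datetime, timedelta, timezone

      # JD of the Unix epoch (1970-01-01T00:00:00 UTC).
      JD_UNIX_EPOCH = 2440587.5

      def jd_to_ut(jd):
          """Convert a Julian Date to a UTC calendar datetime."""
          return datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(days=jd - JD_UNIX_EPOCH)

      # Prints 2015-01-11 06:57:36+00:00, i.e. 2015 Jan 11 UT as quoted.
      print(jd_to_ut(2457033.79))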

  17. Observations of V420 Aur (HD 34921) needed to support spectroscopy

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-10-01

    Marcella Wijngaarden and Kelly Gourdji (graduate students at the University of Amsterdam/Anton Pannekoek Institute for Astronomy) have requested AAVSO observers' assistance in providing optical photometry of V420 Aur in support of their high-resolution spectroscopy with the Mercator telescope + Hermes spectrograph in La Palma 2016 October 7 through 17. They write: "[V420 Aur (HD 34921) is] the optical Be star that is part of a peculiar High Mass X-ray Binary...[that exhibits highly] complex and variable spectra...it is difficult to construct a physical model of this HMXB system, though based on these observations, the system is thought to contain a B[e] star with a dense plasma region, an accretion disk around a neutron star, a shell and circumstellar regions of cold dust. It has been over a decade since the last spectra were taken, and, given the highly variable nature of this star, we expect new observations to yield new information that will contribute to a better understanding of this system." Observations in BVRI (preferred over other bands) are requested beginning immediately and continuing through October 24. In all cases, time series in a few bands (i.e., BVRI) are preferred over single or scattered observations in other bands, as it is the variability on relatively short timescales that is most important. "The target is bright so exposures should be long enough to reach good signal to noise in order to see the small variability amplitude but without saturating the target/comparison stars. We will study the variability on several timescales, so observations ranging from a few per night to high-cadence time series are useful." Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  18. Ink Jet For Business Graphic Application

    NASA Astrophysics Data System (ADS)

    Hooper, Dana H.

    1987-04-01

    This talk covers the use of computer-generated color output in the preparation of professional, memorable presentations. The focus is on this application and today's business graphics marketplace. To provide background, an overview of the factors and trends influencing the market for color hard copy output is essential. The availability of lower cost computing technology, improved graphics software and user interfaces, and the availability of color copiers are combining with the latest generation of color ink jet printers to cause strong growth in the use of color hardcopy devices in the business graphics marketplace. The market is expected to grow at a compound annual growth rate in excess of 25% and reach a level of $5 billion by 1990. Color laser and ink jet technology based products are expected to increase share significantly, primarily at the expense of pen plotters. Essential to the above mentioned growth is the latest generation of products. The Xerox 4020 Color Ink Jet Printer embodies the latest ink jet technology and is a good example of this new generation of products. The printer brings highly reliable color to a broad range of business users. The 4020 is driven by over 50 software packages, allowing users compatibility and supporting a variety of applications. The 4020 is easy to operate and maintain and capable of producing excellent hardcopy and transparencies at an attractive price point. Several specific application areas were discussed. Images were typically created on an IBM PC or compatible with a graphics application package and output to the Xerox 4020 Color Ink Jet Printer. Bar charts, line graphs, pie charts, integrated text and graphics, reports and maps were displayed with a brief description. Additionally, the use of color in brain scanning to discern and communicate information, and in computer-generated art, demonstrates the wide variety of potential applications. Images may be output to paper or to transparency for overhead presentation. The future of color in the business graphics market looks bright and will continue to be strongly influenced by future product introductions.

  19. FO Aqr time-series observations requested

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-07-01

    Dr. Colin Littlefield (University of Notre Dame) and colleagues Drs. Peter Garnavich (Notre Dame), Erin Aadland (Minnesota State), and Mark Kennedy (University College Cork) have requested AAVSO assistance in monitoring the intermediate polar cataclysmic variable FO Aqr beginning immediately. Littlefield, who with his colleagues recently published ATel #9216 and #9225, writes: "This system is in a faint state for the first time in its observational record, implying a dropoff in the mass-transfer rate. AAVSO observations contributed by Shawn Dvorak [the only observer following FO Aqr at the time] were particularly helpful in detecting this low state. Since early May, the system has recovered to V ~ 15, but it is still well below its normal brightness. In addition, our time-series photometry shows a very strong 11.26-minute photometric period. By contrast, during its bright state, FO Aqr's light curve is dominated by a 20.9-minute period, corresponding to the spin period of the white dwarf. We interpret our observations as evidence that the system's accretion processes have changed dramatically as a result of the reduced mass-transfer rate. We have...determined that...[the 11.26-min] periodicity is dependent on the orbital phase of the binary. The 11.26-min period is dominant for about half of the orbit, but for the other half, a 22.5-min period is stronger. AAVSO observers can help us study both of these periods as well as their dependence on the orbital phase. We are particularly interested in any changes in this behavior as the system continues to brighten...Time-series photometry of FO Aqr [is requested] in order to better study the evolution of the 11.26-minute period as the system rebrightens. Unfiltered photometry reduced with a V zeropoint would be the most useful to us...A cadence of less than 60 seconds per image is important, given the brevity of these periods (especially the 11.26-min period)." Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.
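
    The 22.5-min and 11.26-min signals are consistent with the spin-orbit beat (sideband) period and its first harmonic. A sketch of that arithmetic, assuming an orbital period of about 4.85 h for FO Aqr (a literature value, not stated in the notice):

      # Spin-orbit beat (sideband) frequency: nu_beat = nu_spin - nu_orb.
      P_spin_min = 20.9        # white-dwarf spin period from the notice, minutes
      P_orb_min = 4.85 * 60.0  # assumed orbital period (literature value), minutes

      nu_beat = 1.0 / P_spin_min - 1.0 / P_orb_min   # cycles per minute
      P_beat = 1.0 / nu_beat
      # Prints ~22.5 min and ~11.26 min, the two periods reported above.
      print(f"beat period: {P_beat:.1f} min, half-beat: {P_beat / 2:.2f} min")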

  20. Expression and potential mechanism of metabolism-related genes and CRLS1 in non-small cell lung cancer.

    PubMed

    Feng, Hai-Ming; Zhao, Ye; Zhang, Jian-Ping; Zhang, Jian-Hua; Jiang, Peng; Li, Bin; Wang, Cheng

    2018-02-01

    Cardiolipin (CL) is a phospholipid localized in the mitochondria, which is essential for mitochondrial structure and function. Human cardiolipin synthase 1 (CRLS1) is important in regulating phosphatidylglycerol (PG) remodeling and CL biosynthesis. However, the expression and distinct prognostic value of CRLS1 in neoplasms, including non-small cell lung cancer (NSCLC), is not well established. In the present study, the mRNA expression of CRLS1 was investigated using Oncomine analysis, and the prognostic value was assessed using the Kaplan-Meier plotter database for patients with NSCLC. The results of the analyses indicated that the expression of CRLS1 in lung cancer was lower compared with that in normal lung tissues. Notably, high expression of CRLS1 was found to be associated with improved overall survival (OS) in all patients with NSCLC and in patients with lung adenocarcinoma (Ade); however, this was not observed in patients with squamous cell carcinoma (SCC). The results also demonstrated an association between the mRNA expression of CRLS1 and the clinicopathological parameters of patients with NSCLC, including sex, smoking status, tumor grade, clinical stage, lymph node status and chemotherapy. These results indicated that CRLS1 was associated with improved prognosis in patients with NSCLC, particularly at an early stage (T1N1M0). In addition, Gene Ontology term enrichment analysis revealed that CRLS1 was co-expressed with well-known metabolism-associated genes. Kyoto Encyclopedia of Genes and Genomes pathway analysis also showed that tumor-related metabolism and the mitogen-activated protein kinase (MAPK) signaling pathways were enriched with CRLS1-co-expressed genes. The results of the present study suggested that CRLS1 may be a novel tumor suppressor involved in regulating lipid and seleno-amino acid metabolism in the tumor microenvironment, and in suppressing the MAPK signaling pathway during tumorigenesis and development. Comprehensive evaluation of the expression, prognosis and potential mechanism of CRLS1 is likely to promote an improved understanding of the complexity of the molecular biology of NSCLC.
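
    Analyses of this kind rest on Kaplan-Meier estimation with patients dichotomized by expression level. A minimal sketch of the procedure using the lifelines package and synthetic placeholder data (not the study's cohort):

      import numpy as np
      import pandas as pd
      from lifelines import KaplanMeierFitter

      # Synthetic placeholder cohort: follow-up months, death flag, and a
      # high/low expression label -- NOT data from the study above.
      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "months": rng.exponential(60, 200),
          "death": rng.integers(0, 2, 200),
          "high_expr": rng.integers(0, 2, 200).astype(bool),
      })

      # Fit one survival curve per expression group and compare medians.
      kmf = KaplanMeierFitter()
      for label, grp in df.groupby("high_expr"):
          kmf.fit(grp["months"], grp["death"], label=f"high_expr={label}")
          print(label, kmf.median_survival_time_)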

  1. Mechanical behavior and failure analysis of prosthetic retaining screws after long-term use in vivo. Part 3: Preload and tensile fracture load testing.

    PubMed

    Al Jabbari, Youssef S; Fournelle, Raymond; Ziebert, Gerald; Toth, Jeffrey; Iacopino, Anthony M

    2008-04-01

    The aim of this study was to determine the preload and tensile fracture load values of prosthetic retaining screws after long-term use in vivo, compared to unused screws (controls). Additionally, the investigation addressed whether the preload and fracture load values of prosthetic retaining screws reported by the manufacturer are altered after long-term use in vivo. Ten new screws (controls) from Nobel Biocare (NB) and 73 used retaining screws [58 from NB and 15 from Sterngold (SG)] were subjected to preload testing; eight controls from NB and 58 used retaining screws (46 from NB and 12 from SG) were subjected to tensile testing. Used screws for both tests had been in service for 18-120 months. A custom load frame, load cell, and torque wrench setup was used for preload testing. All 83 prosthetic screws were torqued once to 10 Ncm, and the produced preload value was recorded (N) using an X-Y plotter. Tensile testing was performed on a universal testing machine, and the resulting tensile fracture load value was recorded (N). Preload and tensile fracture load values were analyzed with 2-way ANOVA and Tukey post-hoc tests. There was a significant difference between preload values for screws from NB and screws from SG (p < 0.001). The preload values for gold alloy screws from NB decreased as the number of years in service increased. There was a significant difference between tensile fracture values for the three groups (gold alloy screws from NB and SG, and palladium alloy screws from NB) at p < 0.001. The tensile fracture values for gold alloy screws from NB and SG decreased as the number of years in service increased. In fixed detachable hybrid prostheses, perhaps as a result of galling, the intended preload values of prosthetic retaining screws may decrease with increased in-service time. The reduction of the fracture load value may be related to the increase of in-service time; however, the actual determination of this relationship is not possible from this study alone.
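
    For readers unfamiliar with the statistics used here, the 2-way ANOVA with Tukey post-hoc comparisons can be outlined as follows; the measurements below are synthetic placeholders, not the study's data:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      # Synthetic placeholder preload values (N) -- NOT the study's measurements.
      rng = np.random.default_rng(1)
      df = pd.DataFrame({
          "preload": rng.normal(300, 40, 120),
          "maker": rng.choice(["NB", "SG"], 120),      # manufacturer
          "status": rng.choice(["new", "used"], 120),  # control vs long-term use
      })

      # Two-way ANOVA on manufacturer and usage status, with interaction.
      model = smf.ols("preload ~ C(maker) * C(status)", data=df).fit()
      print(anova_lm(model, typ=2))

      # Tukey HSD post-hoc comparisons across the four maker/status cells.
      df["cell"] = df["maker"] + "/" + df["status"]
      print(pairwise_tukeyhsd(df["preload"], df["cell"]))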

  2. A new eye-safe UV Raman spectrometer for the remote detection of energetic materials in fingerprint concentrations: Characterization by PCA and ROC analyses.

    PubMed

    Almaviva, Salvatore; Chirico, Roberto; Nuvoli, Marcello; Palucci, Antonio; Schnürer, Frank; Schweikert, Wenka

    2015-11-01

    We report the results of proximal Raman investigations at a distance of 7 m to detect traces of explosives (from 0.1 to 0.8 mg/cm²) on common clothes with a new eye-safe apparatus. The instrument excites the target with a single laser shot of a few ns (10^-9 s) in the UV range (laser wavelength 266 nm), detecting energetic materials such as pentaerythritol tetranitrate (PETN), trinitrotoluene (TNT), urea nitrate (UN) and ammonium nitrate (AN). Samples were prepared using a piezoelectric-controlled plotter device to deposit well-calibrated amounts of explosives over several cm². Common fabrics and tissues such as polyester, polyamide and leather were used as substrates, representative of base materials used in the production of jackets or coats. Other samples were prepared by touching the substrate with a silicone finger contaminated with explosives, to simulate a spot left by contaminated hands on a jacket or bag during the preparation of an improvised explosive device (IED) by a terrorist. The observed Raman signals showed some peculiar molecular bands of the analyzed compounds, allowing us to identify and discriminate them with high sensitivity and selectivity, even in the presence of the interfering signal from the underlying fabric. A dedicated algorithm was developed to remove noise and fluorescence background from the single-laser-shot spectra, and an automatic spectral recognition procedure was also implemented, evaluating the intensity of the characteristic Raman bands of each explosive and allowing their automatic classification. Principal component analysis (PCA) was used to show the discrimination capabilities of the apparatus on different sets of explosives and to highlight possible criticalities in the detection. Receiver operating characteristic (ROC) curves were used to discuss and quantify the sensitivity and the selectivity of the proposed recognition procedure. To our knowledge, the developed device achieves the highest sensitivity attainable today in the field of eye-safe Raman devices for proximal detection. Copyright © 2015 Elsevier B.V. All rights reserved.
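
    The PCA and ROC characterization can be sketched generically; the spectra below are synthetic stand-ins with one artificial band, not data from the instrument:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.metrics import roc_curve, auc

      # Synthetic placeholder spectra: 100 substrate-only and 100 "explosive"
      # spectra of 500 channels each -- NOT the instrument's data.
      rng = np.random.default_rng(2)
      substrate = rng.normal(0.0, 1.0, (100, 500))
      explosive = rng.normal(0.0, 1.0, (100, 500))
      explosive[:, 240:260] += 2.0                 # one artificial Raman band

      X = np.vstack([substrate, explosive])
      y = np.r_[np.zeros(100), np.ones(100)]       # 1 = explosive present

      # PCA projects the spectra onto directions of largest variance; here
      # the leading component picks up the artificial band.
      scores = PCA(n_components=2).fit_transform(X)

      # Use the first PC score as a detector and summarize it with a ROC curve.
      fpr, tpr, _ = roc_curve(y, scores[:, 0])
      a = auc(fpr, tpr)
      print("AUC =", max(a, 1.0 - a))              # the sign of a PC is arbitrary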

  3. Eye-safe UV Raman spectroscopy for remote detection of explosives and their precursors in fingerprint concentration

    NASA Astrophysics Data System (ADS)

    Almaviva, S.; Angelini, F.; Chirico, R.; Palucci, A.; Nuvoli, M.; Schnuerer, F.; Schweikert, W.; Romolo, F. S.

    2014-10-01

    We report the results of a Raman investigation performed at stand-off distances of 6-10 m with a new apparatus capable of detecting traces of explosives with surface concentrations similar to those of a single fingerprint. The device was developed as part of the RADEX prototype (RAman Detection of EXplosives) and is capable of detecting the Raman signal with a single laser shot of a few ns (10^-9 s) in the UV range (wavelength 266 nm), in conditions of safety for the human eye. This is because the maximum permissible exposure (MPE) for the human eye is established to be 3 mJ/cm² in this wavelength region and pulse duration. Samples of explosives (PETN, TNT, urea nitrate, ammonium nitrate) were prepared starting from solutions deposited on samples of common fabrics or clothing materials such as blue jeans, leather, polyester or polyamide. The deposition process takes place via a piezoelectric-controlled plotter device, capable of producing drops of well-defined volume, down to nanoliters, on a surface of several cm², in order to carefully control the amount of explosive released onto the fabric and thus simulate a slight stain on a garment of a potential terrorist. Depending on the type of explosive sampled, the detected density ranges from 0.1 to 1 mg/cm² and is comparable to the density measured in a spot on a dress or a bag due to contact with hands contaminated with explosives, as could happen in the preparation of an improvised explosive device (IED) by a terrorist. To our knowledge, the developed device achieves the lowest detection limits attainable today in the field of eye-safe, stand-off Raman instruments. The signals obtained show some vibrational bands of the Raman spectra of our samples with high signal-to-noise ratio (SNR), allowing us to identify the explosives with high sensitivity (a high number of true positives) and selectivity (a low number of false positives), so that the instrument could represent the basis for an automated and remote monitoring device.

  4. Overexpression of Pofut1 and activated Notch1 may be associated with poor prognosis in breast cancer.

    PubMed

    Wan, Guoxing; Tian, Lin; Yu, Yuandong; Li, Fang; Wang, Xuanbin; Li, Chen; Deng, Shouheng; Yu, Xiongjie; Cai, Xiaojun; Zuo, Zhigang; Cao, Fengjun

    2017-09-09

    The aim of the present study was to evaluate the prognostic value of protein expression of Pofut1 and Notch1 signaling in breast cancer. A total of 314 formalin-fixed, paraffin-embedded breast specimens, including 174 infiltrating ductal carcinomas (IDC), 50 ductal carcinomas in situ (DCIS) and 90 adjacent normal tissues (ANT), were immunohistochemically examined to evaluate the protein expression of Pofut1, activated Notch1 (N1IC) and Slug. Survival analysis was performed by the Kaplan-Meier method and Cox's proportional-hazards model. An online database (Kaplan-Meier Plotter) was used to further explore the prognostic role of Pofut1 and Notch1 mRNA expression. Pofut1, Slug and N1IC expression were significantly increased in IDC compared to ANT (all p < 0.05). High expression of Pofut1, Slug and N1IC was associated with tumor aggressiveness, including lymph node metastasis (LNM: p = 0.005 for Pofut1, p < 0.001 for N1IC, p = 0.017 for Slug), advanced stage (p = 0.039 for Pofut1, p = 0.025 for N1IC) and higher histological grade (p = 0.001 for N1IC). Additionally, high expression of Pofut1 was found to be significantly associated with high expression of N1IC and Slug in IDC (r = 0.244, p = 0.001; r = 0.374, p < 0.001, respectively); a similar correlation was also observed between high N1IC and Slug expression (r = 0.496, p < 0.001). Moreover, Kaplan-Meier and Cox's regression analyses indicated the significant prognostic value of elevated Pofut1, N1IC and Slug expression, positive LNM and advanced tumor stage for the prediction of shorter disease-free survival (DFS) and overall survival (OS). The web-based analysis also suggested a significant association of high Pofut1 and Notch1 mRNA expression with worse survival outcome. Our findings suggested that overexpression of Pofut1 and activated Notch1 signaling may be associated with a poor prognosis in breast cancer. Copyright © 2017 Elsevier Inc. All rights reserved.
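
    The Cox proportional-hazards step of such an analysis can be outlined with the lifelines package; the cohort below is synthetic, and the covariate names merely echo those in the abstract:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      # Synthetic placeholder cohort -- NOT the study's data.
      rng = np.random.default_rng(3)
      n = 300
      df = pd.DataFrame({
          "dfs_months": rng.exponential(50, n),  # disease-free survival time
          "event": rng.integers(0, 2, n),        # 1 = recurrence/death observed
          "pofut1_high": rng.integers(0, 2, n),
          "n1ic_high": rng.integers(0, 2, n),
          "lnm": rng.integers(0, 2, n),          # lymph node metastasis
          "advanced_stage": rng.integers(0, 2, n),
      })

      # Fit the model; the summary lists hazard ratios with 95% CIs.
      cph = CoxPHFitter()
      cph.fit(df, duration_col="dfs_months", event_col="event")
      cph.print_summary()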

  5. Monitoring of V2487 Oph requested

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-10-01

    Dr. Ashley Pagnotta (Louisiana State University) has requested AAVSO assistance in monitoring the recurrent nova V2487 Oph in order to catch and observe its next outburst. Pagnotta writes: "V2487 Oph is a recurrent nova that was first seen to erupt in 1998. During a search of the Harvard College Observatory plate archives for previous eruptions, we found one that was recorded in 1900. Based on the speed and magnitude of the eruption, and the coverage of the archival plates and other detection sources, we calculated how often V2487 Oph would have to erupt for us to have actually detected one random outburst on the plates, which is about once every 18-20 years (for more, Pagnotta et al. 2009AJ....138.1230P). As we are now 18 years from the previous (1998) eruption, we request regular AAVSO observations to help us detect the next eruption of V2487 Oph. Because V2487 Oph is a very fast nova, we are requesting a high cadence [when the outburst occurs. Previous outbursts have been as bright as V=9.5.]...Once the eruption has been confirmed (likely by other AAVSO observers, thanks to the flexibility of your observing programs), we will notify collaborators and invoke ToO observations to observe the eruption as comprehensively as possible." Observers are requested to make nightly observations in V or Clear. If V2487 Oph is brighter than V=17.5, please report the observation(s) to the AAVSO immediately and switch to multi-color (UBVRI or Sloan equivalents; Clear if other filters are not available) and high (fast) cadence time-series - exposures of a few minutes, with a S/N of at least 40-50. Continue at high cadence until the decline is underway. Time-series observations during the decline are not absolutely essential, but they would be useful to continue to look for flares and the late-time dips that were seen in U Sco around days 41-61. Nightly observations as before should be continued until the star has faded to V=17.5, and then for two weeks more. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  6. R Aqr observing campaign

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2016-01-01

    Dr. George Wallerstein (University of Washington) has requested AAVSO coverage of the long period/symbiotic variable R Aquarii beginning immediately in support of high resolution spectroscopic observations planned for 2016 January 19 and 21. Several other astronomers, including Drs. Lee Anne Willson (Iowa State University), Ulisse Munari (INAF, Astronomical Observatory of Padua, Italy), and Fred Walter (Stony Brook University) are studying R Aqr closely, and additional spectroscopic and other observations are planned for the near future. R Aqr is both a Mira (M) and a symbiotic (ZAND) - it is a close binary system consisting of a hot star and a late-type star (the Mira), both enveloped in nebulosity. As a result, the very interesting light curve shows not only the Mira pulsation but also complex eclipse behavior as the two stars interact. The period of Mira variation is 387.0 days; the eclipse period is 43.6-44 years. The cause of the eclipse is unknown; several theories have been proposed, including a focused accretion stream, a disk or cloud around the secondary, and a triggered mass loss that produces an opaque cloud. Careful investigation of this upcoming event should help to resolve this question. The last eclipse of R Aqr was in 1978. The next eclipse is predicted for 2022, but may be early. The current behavior of R Aqr suggests that the eclipse, which lasts for several years, may either be beginning or its beginning may be imminent. R Aqr was at minimum in early December 2015 at magnitude V=11.4, and is currently at visual magnitude 11.0. During this phase of the approximately 44-year eclipse cycle, at maximum it may be as bright as V ~ 6.0-6.5 but is not expected to become brighter. Beginning immediately, nightly BVRI CCD and DSLR photometry and visual observations are requested. As R Aqr brightens towards maximum and is in range, PEP observations are also requested. Ongoing spectroscopy over the next several years will be interesting to see as the system evolves throughout the eclipse. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (https://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details.

  7. Seismic refraction profile, Kingdom of Saudi Arabia: field operations, instrumentation, and initial results

    USGS Publications Warehouse

    Blank, H. Richard; Healy, J.H.; Roller, John; Lamson, Ralph; Fisher, Fred; McClearn, Robert; Allen, Steve

    1979-01-01

    In February 1978 a seismic deep-refraction profile was recorded by the USGS along a 1000-km line across the Arabian Shield in western Saudi Arabia. The line begins in Paleozoic and Mesozoic cover rocks near Riyadh on the Arabian Platform, leads southwesterly across three major Precambrian tectonic provinces, traverses Cenozoic rocks of the coastal plain near Jizan (Tihamat Asir), and terminates at the outer edge of the Farasan Bank in the southern Red Sea. More than 500 surveyed recording sites were occupied, including 19 in the Farasan Islands. Six shot points were used--five on land, with charges placed mostly below the water table in drill holes, and one at sea, with charges placed on the sea floor and fired from a ship. The total charge consumed was slightly in excess of 61 metric tons in 21 discrete firings. Seismic energy was recorded by means of a set of 100 newly developed portable seismic stations. Each station consists of a standard 2-Hz vertical geophone coupled to a self-contained analog recording instrument equipped with a magnetic-tape cassette. The stations were deployed in groups of 20 by five observer teams, each generally consisting of two scientist-technicians and a surveyor-guide. On the day prior to deployment, the instruments were calibrated and programmed for automatic operation by means of a specially designed device called a hand-held tester. At each of ten pre-selected recording time windows on a designated firing day, the instruments were programmed to turn on, stabilize, record internal calibration signals, record the seismic signals at three levels of amplification, and then deactivate. After the final window in the firing sequence, all instruments were retrieved and their data tapes removed for processing. A specially designed field tape-dubbing system was utilized at shot point camps to organize and edit data recorded on the cassette tapes. The main functions of this system are to concatenate all data from each shot on any given day onto a single shot tape, and to provide hard copy for monitoring recorder performance so that any problems can be corrected prior to the next deployment. Composite digital record sections were produced from the dubbed tapes for each shot point by a portable processing and plotting system. The heart of this system is a DEC PDP 11V03 computer, which controls a cassette playback unit identical to those used in the recorders and dubbers, a set of discriminators, a time-code translator, a digitizer, and a digital plotter. The system was used to maintain various informational data sets and to produce tabulations and listings of various sorts during the field operations, in addition to its main task of producing digital record sections. Two master clocks, both set to time signals broadcast by the British Broadcasting Corporation, provided absolute time for the recording operations. One was located on the ship and the other was stationed at a base camp on the mainland. The land-based master clock was used to set three additional master clocks located at the other active shot points a few days in advance of each firing, and these clocks were then used to set the internal clocks in the portable seismic stations via the hand-held tester. A master clock signal was also linked to the firing system at each shot point for determination of the absolute shot instant. It is possible to construct a generalized crustal model from examination of the six shot point composite record sections obtained in the field.
Such a model rests upon a number of simplifying assumptions and will almost certainly be modified at a later stage of interpretation. The main assumptions are that the crust consists of two homogeneous isotropic layers having no velocity inversion, that the Mohorovicic discontinuity is sharp, and that effects of surface inhomogeneities and elevation changes can be ignored. The main characteristics of the tentative model are the following: (1) The thickness of th
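
    As an illustration of the two-layer, sharp-Moho model assumed above, the standard direct-wave and head-wave travel-time formulas can be evaluated as below; the velocities and thickness are illustrative values only, not results of the survey:

      import numpy as np

      # Two-layer crustal model: direct wave in the upper layer and a head
      # wave refracted along the Moho. All numbers are made up for illustration.
      v1, v2 = 6000.0, 8000.0   # layer velocities (m/s), illustrative
      h = 40_000.0              # crustal thickness (m), illustrative

      x = np.linspace(1.0, 500_000.0, 5)  # source-receiver offsets (m)
      t_direct = x / v1
      t_head = x / v2 + 2 * h * np.sqrt(v2**2 - v1**2) / (v1 * v2)

      # Crossover offset beyond which the head wave arrives first.
      x_cross = 2 * h * np.sqrt((v2 + v1) / (v2 - v1))
      print(f"crossover offset: {x_cross / 1000:.0f} km")   # ~212 km here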

  8. Prognostic value of alcohol dehydrogenase mRNA expression in gastric cancer.

    PubMed

    Guo, Erna; Wei, Haotang; Liao, Xiwen; Xu, Yang; Li, Shu; Zeng, Xiaoyun

    2018-04-01

    Previous studies have reported that alcohol dehydrogenase (ADH) isoenzymes possess diagnostic value in gastric cancer (GC). However, the prognostic value of ADH isoenzymes in GC remains unclear. The aim of the present study was to identify the prognostic value of ADH genes in patients with GC. The prognostic value of ADH genes was investigated in patients with GC using the Kaplan-Meier plotter tool. Kaplan-Meier plots were used to assess the difference between groups of patients with GC with different prognoses. Hazard ratios (HR) and 95% confidence intervals (CI) were used to assess the relative risk of GC survival. Overall, 593 patients with GC and 7 ADH genes were included in the survival analysis. High expression of ADH 1A (class I), α polypeptide (ADH1A; log-rank P=0.043; HR=0.79; 95% CI: 0.64-0.99), ADH 1B (class I), β polypeptide (ADH1B; log-rank P=1.9×10^-5; HR=0.65; 95% CI: 0.53-0.79) and ADH 5 (class III), χ polypeptide (ADH5; log-rank P=0.0011; HR=0.73; 95% CI: 0.6-0.88) resulted in a significantly decreased risk of mortality in all patients with GC compared with patients with low expression of those genes. Furthermore, protective effects may additionally be observed in patients with intestinal-type GC with high expression of ADH1B (log-rank P=0.031; HR=0.64; 95% CI: 0.43-0.96) and in patients with diffuse-type GC with high expression of ADH1A (log-rank P=0.014; HR=0.51; 95% CI: 0.3-0.88), ADH1B (log-rank P=0.04; HR=0.53; 95% CI: 0.29-0.98), ADH 4 (class II), π polypeptide (log-rank P=0.033; HR=0.58; 95% CI: 0.35-0.96) and ADH 6 (class V) (log-rank P=0.037; HR=0.59; 95% CI: 0.35-0.97), all resulting in a significantly decreased risk of mortality compared with patients with low expression of those genes. In contrast, high expression of ADH5 in patients with diffuse-type GC (log-rank P=0.044; HR=1.66; 95% CI: 1.01-2.74) was significantly correlated with a poor prognosis. The results of the present study suggest that ADH1A and ADH1B may be potential prognostic biomarkers of GC, whereas the prognostic value of other ADH genes requires further investigation.
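
    The group comparisons above rest on the log-rank test. In outline, with synthetic placeholder data rather than the Kaplan-Meier plotter cohort:

      import numpy as np
      from lifelines.statistics import logrank_test

      # Synthetic placeholder survival data -- NOT the 593-patient cohort.
      rng = np.random.default_rng(4)
      t_high = rng.exponential(70, 150)   # follow-up months, high expression
      t_low = rng.exponential(50, 150)    # follow-up months, low expression
      e_high = rng.integers(0, 2, 150)    # 1 = death observed, 0 = censored
      e_low = rng.integers(0, 2, 150)

      # Compare the two survival curves.
      res = logrank_test(t_high, t_low,
                         event_observed_A=e_high, event_observed_B=e_low)
      print(f"log-rank p = {res.p_value:.4f}")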

  9. The voltage gated Ca(2+)-channel Cav3.2 and therapeutic responses in breast cancer.

    PubMed

    Pera, Elena; Kaemmerer, Elke; Milevskiy, Michael J G; Yapa, Kunsala T D S; O'Donnell, Jake S; Brown, Melissa A; Simpson, Fiona; Peters, Amelia A; Roberts-Thomson, Sarah J; Monteith, Gregory R

    2016-01-01

    Understanding the cause of therapeutic resistance and identifying new biomarkers in breast cancer to predict therapeutic responses will help optimise patient care. Calcium (Ca(2+))-signalling is important in a variety of processes associated with tumour progression, including breast cancer cell migration and proliferation. Ca(2+)-signalling is also linked to the acquisition of multidrug resistance. This study aimed to assess the expression level of proteins involved in Ca(2+)-signalling in an in vitro model of trastuzumab-resistance and to assess the ability of identified targets to reverse resistance and/or act as potential biomarkers for prognosis or therapy outcome. Expression levels of a panel of Ca(2+)-pumps, channels and channel regulators were assessed using RT-qPCR in resistant and sensitive age-matched SKBR3 breast cancer cells, established through continuous culture in the absence or presence of trastuzumab. The role of Cav3.2 in the acquisition of trastuzumab-resistance was assessed through pharmacological inhibition and induced overexpression. Levels of Cav3.2 were assessed in a panel of non-malignant and malignant breast cell lines using RT-qPCR and in patient samples representing different molecular subtypes (PAM50 cohort). Patient survival was also assessed in samples stratified by Cav3.2 expression (METABRIC and KM-Plotter cohorts). Increased Cav3.2 mRNA was a feature of both acquired and intrinsic trastuzumab-resistant SKBR3 cells. However, pharmacological inhibition of Cav3.2 did not restore trastuzumab-sensitivity, nor did Cav3.2 overexpression induce the expression of markers associated with resistance, suggesting that Cav3.2 is not a driver of trastuzumab-resistance. Cav3.2 levels were significantly higher in luminal A, luminal B and HER2-enriched subtypes compared to the basal subtype. High levels of Cav3.2 were associated with poor outcome in patients with oestrogen receptor positive (ER+) breast cancers, whereas Cav3.2 levels were correlated positively with patient survival after chemotherapy in patients with HER2-positive breast cancers. Our study identified elevated levels of Cav3.2 in trastuzumab-resistant SKBR3 cell lines. Although not a regulator of trastuzumab-resistance in HER2-positive breast cancer cells, Cav3.2 may be a potential differential biomarker for survival and treatment response in specific breast cancer subtypes. These studies add to the complex and diverse role of Ca(2+)-signalling in breast cancer progression and treatment.

  10. Prognostic value of alcohol dehydrogenase mRNA expression in gastric cancer

    PubMed Central

    Guo, Erna; Wei, Haotang; Liao, Xiwen; Xu, Yang; Li, Shu; Zeng, Xiaoyun

    2018-01-01

    Previous studies have reported that alcohol dehydrogenase (ADH) isoenzymes possess diagnostic value in gastric cancer (GC). However, the prognostic value of ADH isoenzymes in GC remains unclear. The aim of the present study was to identify the prognostic value of ADH genes in patients with GC. The prognostic value of ADH genes was investigated in patients with GC using the Kaplan-Meier plotter tool. Kaplan-Meier plots were used to assess the difference between groups of patients with GC with different prognoses. Hazard ratios (HR) and 95% confidence intervals (CI) were used to assess the relative risk of GC survival. Overall, 593 patients with GC and 7 ADH genes were included in the survival analysis. High expression of ADH 1A (class 1), α polypeptide (ADH1A; log-rank P=0.043; HR=0.79; 95% CI: 0.64–0.99), ADH 1B (class 1), β polypeptide (ADH1B; log-rank P=1.9×10−05; HR=0.65; 95% CI: 0.53–0.79) and ADH 5 (class III), χ polypeptide (ADH5; log-rank P=0.0011; HR=0.73; 95% CI: 0.6–0.88) resulted in a significantly decreased risk of mortality in all patients with GC compared with patients with low expression of those genes. Furthermore, protective effects may additionally be observed in patients with intestinal-type GC with high expression of ADH1B (log-rank P=0.031; HR=0.64; 95% CI: 0.43–0.96) and patients with diffuse-type GC with high expression of ADH1A (log-rank P=0.014; HR=0.51; 95% CI: 0.3–0.88), ADH1B (log-rank P=0.04; HR=0.53; 95% CI: 0.29–0.98), ADH 4 (class II), π polypeptide (log-rank P=0.033; HR=0.58; 95% CI: 0.35–0.96) and ADH 6 (class V) (log-rank P=0.037; HR=0.59; 95% CI: 0.35–0.97) resulting in a significantly decreased risk of mortality compared with patients with low expression of those genes. In contrast, patients with diffuse-type GC with high expression of ADH5 (log-rank P=0.044; HR=1.66; 95% CI: 1.01–2.74) were significantly correlated with a poor prognosis. The results of the present study suggest that ADH1A and ADH1B may be potential prognostic biomarkers of GC, whereas the prognostic value of other ADH genes requires further investigation. PMID:29552190

  11. Can we get a better knowledge on dissolution processes in chalk by using microfluidic chips?

    NASA Astrophysics Data System (ADS)

    Neuville, Amélie; Minde, Mona; Renaud, Louis; Vinningland, Jan Ludvig; Dysthe, Dag Kristian; Hiorth, Aksel

    2017-04-01

    This work was initiated in the context of research on improving oil recovery from chalk bedrock. One of the methods to improve oil recovery is to inject "smart water" (acidic water/brines). Experiments at core scale and field tests carried out over the last decade have clearly shown that water chemistry affects the final oil recovery. However, there is generally no consensus in the scientific community on why additional oil is released, and it is still not understood what the mineralogical and structural changes are. Direct in situ observation of the structural changes that occur when chalk is flooded with brines could resolve many of the open questions that remain. One of the highlights of this work is thus the development of an innovative methodology in which fluid/rock interactions are observed in situ by microscopy. To do so, we create several types of custom-made microfluidic systems that embed reactive materials such as chalk and calcite. The methodology we develop can be applied to other reactive materials. We present an experiment in which a calcite window is dissolved by the fluid, and in which we observe in situ the topographic features of the calcite window as well as the dissolution rate [1]. The injected fluid circulates at controlled flow rates in a channel obtained by xurography: double-sided tape is cut out with a cutter plotter and placed between the reactive window and a non-reactive support. While the calcite window reacts, its topography is measured in situ every 10 s using an interference microscope, with a pixel resolution of 4.9 μm and a vertical resolution of 50 nm. These experiments are also compared with reactive flow simulations performed with lattice Boltzmann methods. We then present a dissolution experiment performed with a microfluidic system that embeds chalk. In this experiment, the main flow takes place at the chalk surface, in contact with fluid flowing in a channel above the chalk sample; thus the reaction mostly occurs at the surface of the sample. The reacting chalk surface is observed in situ by stereomicroscopy and by interferometry. The dissolution velocities are highly heterogeneous. To identify mineralogical changes at the surface, a posteriori measurements were performed using field emission scanning electron microscopy (FE-SEM) and energy-dispersive X-ray spectroscopy (EDS). [1] Neuville et al, 2016, Xurography for microfluidics on a reactive solid, Lab on a Chip, DOI: 10.1039/c6lc01253a
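
    Given two height maps taken a known interval apart, the local dissolution rate follows directly from the height difference. A sketch with synthetic maps, using the 10 s acquisition interval quoted above (the maps themselves are random placeholders):

      import numpy as np

      # Synthetic placeholder topographies (heights in metres); the real maps
      # have 4.9 um pixels and 50 nm vertical resolution, per the text.
      rng = np.random.default_rng(5)
      h_t0 = rng.normal(0.0, 50e-9, (256, 256))
      h_t1 = h_t0 - np.abs(rng.normal(2e-9, 1e-9, (256, 256)))  # surface retreats

      dt = 10.0                      # seconds between interferometric maps
      rate = (h_t0 - h_t1) / dt      # local retreat rate, m/s
      print(f"mean dissolution rate: {rate.mean() * 1e9:.2f} nm/s")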

  12. Use of microcomputer in mapping depth of stratigraphic horizons in National Petroleum Reserve in Alaska

    USGS Publications Warehouse

    Payne, Thomas G.

    1982-01-01

    REGIONAL MAPPER is a menu-driven system in the BASIC language for computing and plotting (1) time, depth, and average velocity to geologic horizons, (2) interval time, thickness, and interval velocity of stratigraphic intervals, and (3) subcropping and onlapping intervals at unconformities. The system consists of three programs: FILER, TRAVERSER, and PLOTTER. A control point is a shot point with velocity analysis, or a shot point at or near a well with a velocity check-shot survey. Reflection time to and code number of seismic horizons are filed by digitizing tablet from record sections. TRAVERSER starts at a point of geologic control and, in traversing to another, parallels seismic events, records loss of horizons by onlap and truncation, and stores reflection time for geologic horizons at traversed shot points. TRAVERSER is basically a phantoming procedure. Permafrost thickness and velocity variations, buried canyons with low-velocity fill, and error in seismically derived velocity cause velocity anomalies that complicate depth mapping. Two depths to the top of the pebble shale are computed for each control point. One depth, designated Zs, is based on seismically derived velocity. The other (Zw) is based on interval velocity interpolated linearly between wells and multiplied by interval time (isochron) to give interval thickness. Zw is computed for all geologic horizons by downward summation of interval thickness. Unknown true depth (Z) to the pebble shale may be expressed as Z = Zs + es and Z = Zw + ew, where the e terms represent error. Equating the two expressions gives the depth difference D = Zs - Zw = ew - es. A plot of D for the top of the pebble shale is readily contourable, but smoothing is required to produce a reasonably simple surface. Seismically derived velocity used in computing Zs includes the effect of velocity anomalies but is subject to some large, randomly distributed errors resulting in depth errors (es). Well-derived velocity used in computing Zw does not include the effect of velocity anomalies, but the error (ew) should reflect these anomalies and should be contourable (non-random). The D surface as contoured with smoothing is assumed to represent ew, that is, the depth effect of variations in permafrost thickness and velocity and buried canyon depth. Estimated depth (Zest) to each geologic horizon is the sum of Zw for that horizon and the ew correction as contoured for the pebble shale, which is the first highly continuous seismic horizon below the zone of anomalous velocity. Results of this 'depthing' procedure are compared with those of Tetra Tech, Inc., the subcontractor responsible for geologic and geophysical interpretation and mapping.
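
    The depthing arithmetic described above (downward summation of interval thickness for Zw, then the depth difference D = Zs - Zw) can be sketched in a few lines; all numbers below are made up for illustration:

      import numpy as np

      # Well-derived interval velocities and one-way interval times (isochrons)
      # for three stratigraphic intervals -- illustrative values only.
      interval_time = np.array([0.20, 0.35, 0.25])       # s
      interval_vel = np.array([2500.0, 3200.0, 4000.0])  # m/s

      # Zw: depth to each horizon by downward summation of interval thickness.
      Zw = np.cumsum(interval_time * interval_vel)

      # Zs: seismically derived depth to the deepest horizon (made up), and
      # the depth difference D, whose smoothed contours stand in for ew.
      Zs = 2700.0
      D = Zs - Zw[-1]
      print(f"Zw = {Zw} m, D = {D:.0f} m")   # Zest = Zw + ew per horizon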

  13. E2GPR - Edit your geometry, Execute GprMax2D and Plot the Results!

    NASA Astrophysics Data System (ADS)

    Pirrone, Daniele; Pajewski, Lara

    2015-04-01

    To correctly predict the Ground Penetrating Radar (GPR) response from a particular scenario, Maxwell's equations have to be solved, subject to the physical and geometrical properties of the considered problem and to its initial conditions. Several techniques have been developed in computational electromagnetics for the solution of Maxwell's equations. These methods can be classified into two main categories: differential and integral equation solvers, which can be implemented in the time or spectral domain. All of the different methods present compromises between computational efficiency, stability, and the ability to model complex geometries. The Finite-Difference Time-Domain (FDTD) technique has several advantages over alternative approaches: it has inherent simplicity, efficiency and conditional stability; it is well suited to treating impulsive behavior of the electromagnetic field and can provide either ultra-wideband temporal waveforms or the sinusoidal steady-state response at any frequency within the excitation spectrum; it is accurate and highly versatile; and it has become a mature and well-researched technique. Moreover, the FDTD technique is suitable for execution on parallel-processing CPU-based computers and can exploit modern computer visualisation capabilities. GprMax [1] is a very well-known and widely validated FDTD software tool, implemented by A. Giannopoulos and available for free public download on www.gprmax.com, together with examples and a detailed user guide. The tool includes two electromagnetic wave simulators, GprMax2D and GprMax3D, for the full-wave simulation of two-dimensional and three-dimensional GPR models. In GprMax, everything can be done with the aid of simple commands that are used to define the model parameters and results to be calculated. These commands need to be entered in a simple ASCII text file. GprMax output files can be stored in ASCII or binary format. The software is provided with MATLAB functions, which can be employed to import synthetic data created by GprMax using the binary-format option into MATLAB, in order to be processed and/or visualized. Further MATLAB procedures for the visualization of GprMax synthetic data have been developed within the COST Action TU1208 [2] and are available for free public download on www.GPRadar.eu. The current version of GprMax3D is compiled with OpenMP, supporting multi-platform shared-memory multiprocessing, which allows GprMax3D to take advantage of multiple cores/CPUs. GprMax2D, instead, exploits a single core when executed. E2GPR is a new software tool, available free of charge for both academic and commercial use, conceived to: 1) assist in the creation, modification and analysis of GprMax2D models, through a Computer-Aided Design (CAD) system; 2) allow parallel and/or distributed computing with GprMax2D, on a network of computers; 3) automatically plot A-scans and B-scans generated by GprMax2D. The CAD and plotter parts of the tool are implemented in Java and can run on any Java Virtual Machine (JVM) regardless of computer architecture. The part of the tool devoted to supporting parallel and/or distributed computing, instead, requires the setup of a Web Service (on a server emulator or server); in fact, it is currently configured only for Windows Server and Internet Information Services (IIS). In this work, E2GPR is presented and examples are provided which demonstrate its use. The tool can currently be obtained by contacting the authors.
It will soon be possible to download it from www.GPRadar.eu. Acknowledgement: This work is a contribution to the COST Action TU1208 'Civil Engineering Applications of Ground Penetrating Radar.' The authors thank COST for funding the Action TU1208. References: [1] A. Giannopoulos, 'Modelling ground penetrating radar by GprMax,' Construction and Building Materials, vol. 19, pp. 755-762, 2005. [2] L. Pajewski, A. Benedetto, X. Dérobert, A. Giannopoulos, A. Loizos, G. Manacorda, M. Marciniak, C. Plati, G. Schettini, I. Trinks, "Applications of Ground Penetrating Radar in Civil Engineering - COST Action TU1208," Proc. 7th International Workshop on Advanced Ground Penetrating Radar (IWAGPR), 2-5 July 2013, Nantes, France, pp. 1-6.
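
    As an example of the A-scan plotting that E2GPR automates, a minimal sketch follows. It assumes a plain two-column ASCII export (time in ns, field amplitude), a hypothetical layout chosen for brevity; GprMax's own ASCII output carries its own header conventions:

      import numpy as np
      import matplotlib.pyplot as plt

      # Load a single synthetic trace; "ascan.txt" is a hypothetical file with
      # two columns (time in ns, field amplitude), not GprMax's native format.
      t, ez = np.loadtxt("ascan.txt", unpack=True)

      plt.plot(t, ez)
      plt.xlabel("time (ns)")
      plt.ylabel("field amplitude (a.u.)")
      plt.title("Synthetic GPR A-scan")
      plt.show()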

  14. PLAID- A COMPUTER AIDED DESIGN SYSTEM

    NASA Technical Reports Server (NTRS)

    Brown, J. W.

    1994-01-01

    PLAID is a three-dimensional Computer Aided Design (CAD) system which enables the user to interactively construct, manipulate, and display sets of highly complex geometric models. PLAID was initially developed by NASA to assist in the design of Space Shuttle crewstation panels, and the detection of payload object collisions. It has evolved into a more general program for convenient use in many engineering applications. Special effort was made to incorporate CAD techniques and features which minimize the user's workload in designing and managing PLAID models. PLAID consists of three major modules: the Primitive Object Generator (BUILD), the Composite Object Generator (COG), and the DISPLAY Processor. The BUILD module provides a means of constructing simple geometric objects called primitives. The primitives are created from polygons which are defined either explicitly by vertex coordinates, or graphically by use of terminal crosshairs or a digitizer. Solid objects are constructed by combining, rotating, or translating the polygons. Corner rounding, hole punching, milling, and contouring are special features available in BUILD. The COG module hierarchically organizes and manipulates primitives and other previously defined COG objects to form complex assemblies. The composite object is constructed by applying transformations to simpler objects. The transformations which can be applied are scalings, rotations, and translations. These transformations may be defined explicitly or defined graphically using the interactive COG commands. The DISPLAY module enables the user to view COG assemblies from arbitrary viewpoints (inside or outside the object) in both wireframe and hidden line renderings. The PLAID projection of a three-dimensional object can be either orthographic or perspective. A conflict analysis option enables detection of spatial conflicts or collisions. DISPLAY provides camera functions to simulate a view of the model through different lenses. Other features include hardcopy plot generation, scaling and zoom options, distance tabulations, and descriptive text in different sizes and fonts. An object in the PLAID database is not just a collection of lines; rather, it is a true three-dimensional representation from which correct hidden line renditions can be computed for any specified eye point. The drawings produced in the various modules of PLAID can be stored in files for future use. The PLAID program product is available by license for a period of 10 years to domestic U.S. licensees. The licensed program product includes the PLAID source code, command procedures, sample applications, and one set of supporting documentation. Copies of the documentation may be purchased separately at the price indicated below. PLAID is written in FORTRAN 77 for single user interactive execution and has been implemented on a DEC VAX series computer operating under VMS with a recommended core memory of four megabytes. PLAID requires a Tektronix 4014 compatible graphics display terminal and optionally uses a Tektronix 4631 compatible graphics hardcopier. Plots of resulting PLAID displays may be produced using the Calcomp 960, HP 7221, or HP 7580 plotters. Digitizer tablets can also be supported. This program was developed in 1986.
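
    COG's scale/rotate/translate composition corresponds to multiplying homogeneous 4x4 transformation matrices. A minimal sketch of the underlying operation (not PLAID's FORTRAN internals):

      import numpy as np

      def scale(sx, sy, sz):
          return np.diag([sx, sy, sz, 1.0])

      def rot_z(theta):
          c, s = np.cos(theta), np.sin(theta)
          m = np.eye(4)
          m[:2, :2] = [[c, -s], [s, c]]
          return m

      def translate(tx, ty, tz):
          m = np.eye(4)
          m[:3, 3] = [tx, ty, tz]
          return m

      # A composite object transforms a child: scale, then rotate, then translate.
      M = translate(10, 0, 5) @ rot_z(np.pi / 4) @ scale(2, 2, 2)
      vertex = np.array([1.0, 0.0, 0.0, 1.0])   # homogeneous coordinates
      print(M @ vertex)                         # -> [11.414..., 1.414..., 5., 1.]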

  15. Monitoring of Northern dwarf novae for radio jets campaign

    NASA Astrophysics Data System (ADS)

    Waagen, Elizabeth O.

    2014-10-01

    Ms. Deanne Coppejans (PhD candidate, Radboud University Nijmegen (Netherlands) and University of Cape Town) and colleagues have requested AAVSO observer assistance in monitoring nine Northern dwarf novae in support of their campaign to observe them in outburst with the Very Large Array (VLA) to search for radio jets. They will observe 5 targets from the following list: U Gem*, EX Dra, Z Cam*, RX And*, EM Cyg, AB Dra, SY Cnc, SU UMa*, and YZ Cnc*. Stars with an asterisk (*) will be given higher priority. The campaign will begin now, starting with monitoring of RX And and EM Cyg, and run through September 2015, or until all five VLA triggers have been used. This campaign is similar to previous AAVSO campaigns, namely the 2007 campaign to monitor a sample of 10 dwarf novae (AAVSO Alert Notice 345), which resulted in the first detection of a radio jet in a dwarf nova system (Koerding et al. 2008, Science, 320, 1318), and the ones carried out at the request of Dr. James Miller-Jones and colleagues on SS Cyg in 2010-2011 (AAVSO Special Notices #204, #206, Alert Notice 445). The latter resulted in an accurate distance determination to SS Cyg, thereby reconciling its behavior with our understanding of accretion disc theory in accreting compact objects (Miller-Jones et al. 2013, Science, 340, 950). Ms. Coppejans writes: "The relation between accretion and outflow is one of the basic problems in modern astrophysics. It has long been thought that CVs are the only accreting systems that do not produce jets, and this notion has even been used to constrain jet models. However, there are now some indications that CVs do show jets, possibly allowing a universal link between accretion and ejection. Radio observations provide the best unambiguous tracer of the corresponding jet or directed outflow, but there are only two clear detections. By observing a more extensive sample of cataclysmic variables in outburst we will determine the existence of jets or other outflows in these accreting binary systems. These observations will decide if either CVs do show jets and thus support a universal link between accretion and ejection, or if they do not show jets, further constraining future jet models." The radio jet, if it exists in any of these nine systems, is expected to be seen shortly after the beginning of the outburst (as it was in SS Cyg). Catching the outburst as it is just starting and reporting that information to AAVSO HQ immediately is crucial, as the astronomers need to be alerted, make their decision whether to trigger the VLA observations, and allow enough time for the VLA to start the observations. Please observe these systems NIGHTLY (visual, CCD V) and report all observations as soon as is practical. In the event of an outburst, please report your observations as quickly as you can via WebObs, and also notify Dr. Matthew Templeton and Elizabeth Waagen at AAVSO Headquarters and Deanne Coppejans. Finder charts with sequence may be created using the AAVSO Variable Star Plotter (http://www.aavso.org/vsp). Observations should be submitted to the AAVSO International Database. See full Alert Notice for more details and information on the targets.

  16. Results of stainless steel canister corrosion studies and environmental sample investigations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryan, Charles R.; Enos, David

    2014-12-01

    This progress report describes work being done at Sandia National Laboratories (SNL) to assess the localized corrosion performance of container/cask materials used in the interim storage of used nuclear fuel. The work involves both characterization of the potential physical and chemical environment on the surface of the storage canisters and how it might evolve through time, and testing to evaluate performance of the canister materials under anticipated storage conditions. To evaluate the potential environment on the surface of the canisters, SNL is working with the Electric Power Research Institute (EPRI) to collect and analyze dust samples from the surface of in-service SNF storage canisters. In FY 13, SNL analyzed samples from the Calvert Cliffs Independent Spent Fuel Storage Installation (ISFSI); here, results are presented for samples collected from two additional near-marine ISFSI sites, Hope Creek NJ, and Diablo Canyon CA. The Hope Creek site is located on the shores of the Delaware River within the tidal zone; the water is brackish and wave action is normally minor. The Diablo Canyon site is located on a rocky Pacific Ocean shoreline with breaking waves. Two types of samples were collected: SaltSmart™ samples, which leach the soluble salts from a known surface area of the canister, and dry pad samples, which collected surface salt and dust using a swipe method with a mildly abrasive ScotchBrite™ pad. The dry samples were used to characterize the mineralogy and texture of the soluble and insoluble components in the dust via microanalytical techniques, including mapping X-ray fluorescence spectroscopy and scanning electron microscopy. For both Hope Creek and Diablo Canyon canisters, dust loadings were much higher on the flat upper surfaces of the canisters than on the vertical sides. Maximum dust sizes collected at both sites were slightly larger than 20 μm, but Phragmites grass seeds, ~1 mm in size, were observed on the tops of the Hope Creek canisters. At both sites, the surface dust could be divided into fractions generated by manufacturing processes and by natural processes. The fraction from manufacturing processes consisted of variably oxidized angular and spherical particles of stainless steel and iron, generated by machining and welding/cutting processes, respectively. Dust from natural sources consisted largely of detrital quartz and aluminosilicates (feldspars and clays) at both sites. At Hope Creek, soluble salts were dominated by sulfates and nitrates, mostly of calcium. Chloride was a trace component, and the only chloride mineral observed by SEM was NaCl. Chloride surface loads measured by the SaltSmart™ sensors were very low, less than 60 mg/m² on the canister top and less than 10 mg/m² on the canister sides. At Diablo Canyon, sea-salt aggregates of NaCl and Mg-sulfate, with minor K and Ca, were abundant in the dust, in some cases dominating the observed dust assemblage. Measured SaltSmart™ chloride surface loads were very low (<5 mg/m²); however, high canister surface temperatures damaged the SaltSmart™ sensors, and, in view of the SEM observations of abundant sea salts on the package surfaces, the measured surface loads may not be valid. Moreover, the more heavily loaded canister tops at Diablo Canyon were not sampled with the SaltSmart™ sensors.
The observed low surface loads do not preclude chloride-induced stress corrosion cracking (CISCC) at either site, because (1) the measured data may not be valid for the Diablo Canyon canisters; (2) the surface coverage was not complete (for instance, the 45° offset between the outlet and inlet vents means that near-inlet areas, likely to have heavier dust and salt loads, were not sampled); and (3) CISCC has been observed experimentally at salt loads as low as 5-8 mg/m². Experimental efforts at SNL to assess corrosion of interim storage canister materials include three tasks in FY14. First, a full-diameter canister mockup, made using materials and techniques identical to those used to make interim storage canisters, was designed and ordered from Ranor Inc., a cask vendor for Areva/TN. The mockup will be delivered prior to the end of FY14 and will be used for evaluating weld residual stresses and degrees of sensitization for typical interim storage canister welds. Following weld characterization, the mockup will be sectioned and provided to participating organizations for corrosion testing purposes. A test plan is being developed for these efforts. In a second task, experimental work was carried out to evaluate crevice corrosion of 304SS in the presence of limited reactants, as would be present on a dust-covered storage canister. This work tests the hypothesis that limited salt loads will limit corrosion penetration over time, and is a continuation of work carried out in FY13. Laser confocal microscopy was used to assess the volume and depth of corrosion pits formed during the crevice corrosion tests. Results indicate that for the duration of the current experiments (100 days), no stifling of corrosion occurred due to limitations in the amount of reactants present, at three different salt loadings. Finally, work was carried out this year to perfect an instrument for depositing sea-salts onto metal surfaces for atmospheric corrosion testing purposes. The system uses an X-Y plotter with a commercial airbrush, and deposition is monitored with a quartz crystal microbalance. The system is capable of depositing very even salt loadings, even at very low total deposition rates.
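
    The report does not give the conversion used by the deposition monitor, but the standard way to turn a quartz crystal microbalance reading into an areal salt load is the Sauerbrey relation. A minimal Python sketch, assuming a typical 5 MHz AT-cut crystal (mass sensitivity ~17.7 ng cm⁻² Hz⁻¹) and a rigid thin-film deposit; the crystal parameters are an assumption, not taken from the report:

        # Sauerbrey conversion: frequency decrease -> areal mass load.
        # The 17.7 ng/cm^2/Hz constant assumes a 5 MHz AT-cut crystal
        # (an assumption; the report does not specify the crystal used).
        SAUERBREY_NG_PER_CM2_HZ = 17.7

        def salt_load_mg_per_m2(delta_f_hz: float) -> float:
            """Areal mass load (mg/m^2) for a measured frequency shift (Hz)."""
            ng_per_cm2 = SAUERBREY_NG_PER_CM2_HZ * abs(delta_f_hz)
            return ng_per_cm2 * 0.01  # 1 ng/cm^2 = 0.01 mg/m^2

        # A 30 Hz drop corresponds to ~5.3 mg/m^2, i.e. the low end of the
        # salt-load range at which CISCC has been reported.
        print(salt_load_mg_per_m2(30.0))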

  17. SARAH 3.2: Dirac gauginos, UFO output, and more

    NASA Astrophysics Data System (ADS)

    Staub, Florian

    2013-07-01

    SARAH is a Mathematica package optimized for the fast, efficient and precise study of supersymmetric models beyond the MSSM: a new model can be defined in a short form and all vertices are derived. This allows SARAH to create model files for FeynArts/FormCalc, CalcHep/CompHep and WHIZARD/O'Mega. The newest version of SARAH now provides the possibility to create model files in the UFO format, which is supported by MadGraph 5, MadAnalysis 5, GoSam, and soon by Herwig++. Furthermore, SARAH also calculates the mass matrices, RGEs and 1-loop corrections to the mass spectrum. This information is used to write source code for SPheno in order to create a precision spectrum generator for the given model. This spectrum-generator-generator functionality as well as the output of WHIZARD and CalcHep model files has seen further improvement in this version. Also models including Dirac gauginos are supported with the new version of SARAH, and additional checks for the consistency of the implementation of new models have been created. Program summary: Program title: SARAH. Catalogue identifier: AEIB_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIB_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 322 411. No. of bytes in distributed program, including test data, etc.: 3 629 206. Distribution format: tar.gz. Programming language: Mathematica. Computer: All for which Mathematica is available. Operating system: All for which Mathematica is available. Classification: 11.1, 11.6. Catalogue identifier of previous version: AEIB_v1_0. Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 808. Does the new version supersede the previous version?: Yes, the new version includes all known features of the previous version but also provides the new features mentioned below. Nature of problem: To use MadGraph for new models it is necessary to provide the corresponding model files which include all information about the interactions of the model. However, the derivation of the vertices for a given model and putting those into model files which can be used with MadGraph is usually very time consuming. Dirac gauginos are not present in the minimal supersymmetric standard model (MSSM) or many extensions of it. Dirac mass terms for vector superfields lead to new structures in the supersymmetric (SUSY) Lagrangian (a bilinear mass term between the gaugino and a matter fermion, as well as new D-terms) and also modify the SUSY renormalization group equations (RGEs). The Dirac character of gauginos can change the collider phenomenology. In addition, they come with an extended Higgs sector for which a precise calculation of the 1-loop masses has not happened so far. Solution method: SARAH calculates the complete Lagrangian for a given model whose gauge sector can be any direct product of SU(N) gauge groups. The chiral superfields can transform as any irreducible representation with respect to these gauge groups and it is possible to handle an arbitrary number of symmetry breakings or particle rotations. Also the gauge fixing is automatically added. Using this information, SARAH derives all vertices for a model. These vertices can be exported to model files in the UFO format, which is supported by MadGraph and other codes like GoSam, MadAnalysis or ALOHA. The user can also study models with Dirac gauginos.
In that case SARAH includes all possible terms in the Lagrangian stemming from the new structures and can also calculate the RGEs. The entire impact of these terms is then taken into account in the output of SARAH to UFO, CalcHep, WHIZARD, FeynArts and SPheno. Reasons for new version: SARAH provides, with this version, the possibility of creating model files in the UFO format. The UFO format is supposed to become a standard format for model files which should be supported by many different tools in the future. Also, models with Dirac gauginos were not supported in earlier versions. Summary of revisions: Support of models with Dirac gauginos; output of model files in the UFO format; speed improvement in the output of WHIZARD model files; CalcHep output supports the internal diagonalization of mass matrices; output of control files for the LHPC spectrum plotter; support of the generalized PDG numbering scheme PDG.IX; improvement of the calculation of the decay widths and branching ratios with SPheno; the calculation of new low energy observables is added to the SPheno output; and the handling of gauge fixing terms has been significantly simplified. Restrictions: SARAH can only derive the Lagrangian in an automated way for N=1 SUSY models, but not for those with more SUSY generators. Furthermore, SARAH supports only renormalizable operators in the output of model files in the UFO format and also for CalcHep, FeynArts and WHIZARD. Also, color sextets are not yet included in the model files for Monte Carlo tools. Dimension 5 operators are only supported in the calculation of the RGEs and mass matrices. Unusual features: SARAH does not need the Lagrangian of a model as input to calculate the vertices. The gauge structure, particle content and superpotential, as well as rotations stemming from gauge symmetry breaking, are sufficient. All further information is derived by SARAH on its own. Therefore, the model files are very short and the implementation of new models is fast and easy. In addition, the implementation of a model can be checked for physical and formal consistency, and SARAH can generate Fortran code for a full 1-loop analysis of the mass spectrum in the context of Dirac gauginos. Running time: Measured CPU time for the evaluation of the MSSM using a Lenovo Thinkpad X220 with i7 processor (2.53 GHz). Calculating the complete Lagrangian: 9 s. Calculating all vertices: 51 s. Output of the UFO model files: 49 s.
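
    For orientation, the "new structures" referred to above take the following schematic form in the Dirac-gaugino literature (a textbook-style sketch, not taken from the SARAH manual; signs and normalizations are convention dependent). The Dirac mass m_D pairs the gaugino λᵃ with the fermion ψ_Σᵃ of an adjoint chiral superfield Σ and introduces a new D-term coupling:

        \mathcal{L}_{\text{Dirac}} \;\supset\;
            m_D\,\lambda^a\,\psi_\Sigma^a
            \;+\; \sqrt{2}\,m_D\,\Sigma^a D^a
            \;+\; \text{h.c.}

    Here D^a is the auxiliary field of the vector superfield; eliminating it generates the new D-term contributions to the scalar potential mentioned in the abstract.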

  18. WE-G-16A-01: Evolution of Radiation Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rothenberg, L; Mohan, R; Van Dyk, J

Welcome and Introduction - Lawrence N. Rothenberg. This symposium is one of a continuing series of presentations at AAPM Annual Meetings on the historical aspects of medical physics, radiology, and radiation oncology that have been organized by the AAPM History Committee. Information on previous presentations, including “Early Developments in Teletherapy” (Indianapolis 2013), “Historical Aspects of Cross-Sectional Imaging” (Charlotte 2012), “Historical Aspects of Brachytherapy” (Vancouver 2011), “50 Years of Women in Medical Physics” (Houston 2008), and “Roentgen's Early Investigations” (Minneapolis 2007), can be found in the Education Section of the AAPM Website. The Austin 2014 History Symposium will be on “Evolution of Radiation Treatment Planning.” Overview - Radhe Mohan. Treatment planning is one of the most critical components in the chain of radiation therapy of cancers. Treatment plans of today contain a wide variety of sophisticated information conveying the potential clinical effectiveness of the designed treatment to practitioners. Examples of such information include dose distributions superimposed on three- or even four-dimensional anatomic images; dose volume histograms; and dose, dose-volume and dose-response indices for anatomic structures of interest. These data are used for evaluating treatment plans and for making treatment decisions. The current state of the art has evolved from the 1940s era, when the dose to the tumor and normal tissues was estimated approximately by manual means. However, the symposium will cover the history of the field from the late 1950's, when computers were first introduced for treatment planning, to the present state involving the use of high performance computing and advanced multi-dimensional anatomic, functional and biological imaging, focusing only on external beam treatment planning. The symposium will start with a general overview of the treatment planning process, including imaging, structure delineation, assignment of dose requirements, consideration of uncertainties, selection of beam configurations and shaping of beams, and calculation, optimization and evaluation of dose distributions. This will be followed by three presentations covering the evolution of treatment planning, which parallels the evolution of computers, the availability of advanced volumetric imaging, and the development of novel technologies such as dynamic multi-leaf collimators and online image guidance. This evolution will be divided over three distinct periods: prior to the 1970's, the 2D era; from 1980 to the mid-1990's, the 3D era; and from the mid-1990's to today, the IMRT era. “When the World was Flat: The Two-Dimensional Radiation Therapy Era” - Jacob Van Dyk. In the 2D era, anatomy was defined with the aid of solder wires, special contouring devices and projection x-rays. Dose distributions were calculated manually from single field, flat surface isodoses on transparencies. Precalculated atlases of generic dose distributions were produced by the International Atomic Energy Agency. Massive time-shared main frames and mini-computers were used to compute doses at individual points or dose distributions in a single plane. Beam shapes were generally rectangular, with wedges, missing-tissue compensators and occasional blocks to shield critical structures. Dose calculations were measurement-based or they used primary and scatter calculations based on scatter-air ratio methodologies.
Dose distributions were displayed on line printers as alpha-numeric character maps or as isodose patterns made with pen plotters. “More than Pretty Pictures: 3D Treatment Planning and Conformal Therapy” - Benedick A. Fraass. The introduction of computed tomography allowed the delineation of anatomy three-dimensionally and, supported partly by contracts from the National Cancer Institute, made possible the introduction and clinical use of 3D treatment planning, leading to the development and use of 3D conformal therapy in the 1980's. 3D computer graphics and 3D anatomical structure definitions made possible Beam's Eye View (BEV) displays, making conformal beam shaping and much more sophisticated beam arrangements possible. These conformal plans significantly improved target dose coverage as well as normal tissue sparing. The use of dose volume histograms, gross/clinical/planning target volumes, MRI and PET imaging, multileaf collimators, and computer-controlled treatment delivery made sophisticated planning approaches practical. The significant improvements in dose distributions and analysis achievable with 3D conformal therapy made possible formal dose escalation and normal tissue tolerance clinical studies that set new and improved expectations for improved local control and decreasing complications in many clinical sites. “From the Art to the State of the Art: Inverse Planning and IMRT” - Thomas R. Bortfeld. While the potential of intensity modulation was recognized in the mid-1980's, intensity-modulated radiotherapy (IMRT) did not become a reality until the mid-1990's. Broad beams of photons could be sub-divided into narrow beamlets whose intensities could be determined using sophisticated optimization algorithms to appropriately balance tumor dose with normal tissue sparing. The development of dynamic multi-leaf collimators (on conventional linear accelerators as well as in helical delivery devices) enabled the efficient delivery of IMRT. The evolution of IMRT planning is continuing in the form of Volumetric Modulated Arc Therapy (VMAT) and through advanced optimization tools, such as multi-criteria optimization, automated IMRT planning, and robust optimization to protect dose distributions against uncertainties. IMRT also facilitates “dose painting,” in which different sub-volumes of the target are prescribed different doses. Clearly, these advancements are being made possible by the increasing power and lower cost of computers and developments in other fields such as imaging and operations research. Summary - Radhe Mohan. The history does not end here. The advancement of treatment planning is expected to continue, leading to further automation and improvements in the conformality and robustness of dose distributions, particularly in the area of particle therapy. Radiobiological modeling will gain emphasis as part of the planning process. Learning Objectives: (1) the scope of changes in technology and the capabilities of radiation treatment planning; (2) the impact of these changes on the quality of treatment plans and the optimality of dose distributions; (3) the impact of developments in other fields (imaging, computers, operations research, etc.) on the evolution of radiation treatment planning.
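
    The beamlet-weight optimization sketched in the IMRT discussion above can be illustrated with a deliberately tiny toy: minimize the squared deviation of tumor dose from prescription plus a penalty on normal-tissue dose, with nonnegative intensities, via projected gradient descent. All matrices, sizes, and weights below are synthetic, not clinical data or any planning system's actual algorithm:

        import numpy as np

        # Toy dose-influence matrices: dose per unit beamlet intensity.
        rng = np.random.default_rng(0)
        D_tumor = rng.uniform(0.5, 1.0, size=(50, 20))  # 50 tumor voxels, 20 beamlets
        D_organ = rng.uniform(0.0, 0.3, size=(80, 20))  # 80 normal-tissue voxels
        d_presc = np.full(50, 60.0)                     # prescribed tumor dose (Gy)
        penalty = 5.0                                   # weight on normal-tissue dose

        # Minimize ||D_tumor x - d_presc||^2 + penalty * ||D_organ x||^2, x >= 0.
        x = np.zeros(20)
        step = 1e-4
        for _ in range(5000):
            grad = (D_tumor.T @ (D_tumor @ x - d_presc)
                    + penalty * (D_organ.T @ (D_organ @ x)))
            x = np.maximum(x - step * grad, 0.0)  # project onto nonnegative intensities

        print("tumor dose range:", (D_tumor @ x).min(), (D_tumor @ x).max())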

  19. CFSv2 Seasonal Climate Forecasts

    Science.gov Websites

    Interactive product page for CFSv2 seasonal climate forecasts, providing ensemble member data links (E1, E2, E3) for the Nino3.4 and Nino4 indices and for sea surface height.

  20. DataHub: Knowledge-based data management for data discovery

    NASA Astrophysics Data System (ADS)

    Handley, Thomas H.; Li, Y. Philip

    1993-08-01

    Currently available database technology is largely designed for business data-processing applications and seems inadequate for scientific applications. The research described in this paper, the DataHub, will address the issues associated with this shortfall in technology utilization and development. The DataHub development is addressing the key issues in scientific data management of scientific database models and resource sharing in a geographically distributed, multi-disciplinary, science research environment. Thus, the DataHub will be a server between the data suppliers and data consumers to facilitate data exchanges, to assist science data analysis, and to provide a systematic approach for science data management. More specifically, the DataHub's objectives are to provide support for (1) exploratory data analysis (i.e., data-driven analysis); (2) data transformations; (3) data semantics capture and usage; (4) analysis-related knowledge capture and usage; and (5) data discovery, ingestion, and extraction. Applying technologies ranging from deductive databases, semantic data models, data discovery, knowledge representation and inferencing, and exploratory data analysis techniques to modern man-machine interfaces, the DataHub will provide a prototype, integrated environment to support research scientists' needs in multiple disciplines (i.e., oceanography, geology, and atmospheric science) while addressing the more general science data management issues. Additionally, the DataHub will provide data management services to exploratory data analysis applications such as LinkWinds and NCSA's XIMAGE.

  1. Automated Data Submission for the Data Center

    NASA Astrophysics Data System (ADS)

    Wright, D.; Beaty, T.; Wei, Y.; Shanafield, H.; Santhana Vannan, S. K.

    2014-12-01

    Data centers struggle with difficulties related to data submission. Data are acquired through many avenues by many people. Many data submission activities involve intensive manual processes. During the submission process, data end up on varied storage devices. The situation can easily become chaotic. Collecting information on the status of pending data sets is arduous. For data providers, the submission process can be inconsistent and confusing. Scientists generally provide data from previous projects, and archival can be a low priority. Incomplete or poor documentation accompanies many data sets. However, complicated questionnaires deter busy data providers. At the ORNL DAAC, we have semi-automated the data set submission process to create a uniform data product and provide a consistent data provider experience. The formalized workflow makes archival faster for the data center and data set submission easier for data providers. Software modules create a flexible, reusable submission package. Formalized data set submission provides several benefits to the data center. A single data upload area provides one point of entry and ensures data are stored in a consistent location. A central dashboard records pending data set submissions in a single table and simplifies reporting. Flexible role management allows team members to readily coordinate and increases efficiency. Data products and metadata become uniform and easily maintained. As data and metadata standards change, modules can be modified or re-written without affecting workflow. While each data center has unique challenges, the data ingestion process is generally the same: get data from the provider, scientist, or project and capture metadata pertinent to that data. The ORNL DAAC data set submission workflow and software modules can be reused entirely or in part by other data centers looking for a data set submission solution. These data set submission modules will be available on NASA's Earthdata Code Collaborative and by request.
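
    As a toy illustration of the "central dashboard" idea described above, a pending-submissions table can be as simple as the following sketch. All names, states, and fields are hypothetical; the ORNL DAAC's actual modules are not described at this level in the abstract:

        from dataclasses import dataclass, field
        from datetime import date
        from enum import Enum

        class Status(Enum):                 # hypothetical workflow states
            RECEIVED = "received"
            IN_REVIEW = "in review"
            ARCHIVED = "archived"

        @dataclass
        class Submission:                   # one row in the dashboard table
            dataset_id: str
            provider: str
            received: date
            status: Status = Status.RECEIVED
            notes: list = field(default_factory=list)

        submissions = [
            Submission("DAAC-0001", "J. Smith", date(2014, 6, 2)),
            Submission("DAAC-0002", "L. Chen", date(2014, 7, 9), Status.IN_REVIEW),
        ]
        pending = [s for s in submissions if s.status is not Status.ARCHIVED]
        print(f"{len(pending)} pending submissions")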

  2. KNMI DataLab experiences in serving data-driven innovations

    NASA Astrophysics Data System (ADS)

    Noteboom, Jan Willem; Sluiter, Raymond

    2016-04-01

    Climate change research and innovations in weather forecasting rely more and more on (Big) data. Besides increasing data volumes from traditional sources (such as observation networks, radars and satellites), the use of open data, crowd-sourced data and the Internet of Things (IoT) is emerging. To deploy these sources of data optimally in our services and products, KNMI has established a DataLab to serve data-driven innovations in collaboration with public and private sector partners. Big data management, data integration, data analytics including machine learning, and data visualization techniques play an important role in the DataLab. Cross-domain data-driven innovations that arise from public-private collaborative projects and research programmes can be explored, experimented with, and/or piloted by the KNMI DataLab. Furthermore, advice can be requested on (Big) data techniques and data sources. In support of collaborative (Big) data science activities, scalable environments are offered with facilities for data integration, data analysis and visualization. In addition, data science expertise is provided directly or from a pool of internal and external experts. At the EGU conference, experiences gained and best practices in operating the KNMI DataLab to serve data-driven innovations for weather and climate applications are presented.

  3. NSSDC Data listing

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A convenient reference to space science and supportive data available from the National Space Science Data Center (NSSDC) is provided. Satellite data are organized by NSSDC spacecraft common name. The launch date and NSSDC ID are given. Experiments are listed alphabetically by the principal investigator or team leader. The experiment name and NSSDC ID, data set ID, data set name, data form code, quantity of data, and the time span of the data as verified by NSSDC are shown. Ground-based data, models, computer routines, and composite spacecraft data that are available from NSSDC are listed alphabetically by discipline, source, data type, data content, and data set. The data set name, data form code, quantity of data, and the time span covered where appropriate are included.

  4. National Space Science Data Center (NSSDC) Data Listing

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Satellite and nonsatellite data available from the National Space Science Data Center are listed. The Satellite Data listing includes the spacecraft name, launch date, and an alphabetical list of experiments. The Non-Satellite Data listing contains ground based data, models, computer routines, and composite spacecraft data. The data set name, data form code, quantity of data, and the time span covered are included in the data sets of both listings where appropriate. Geodetic tracking data sets are also included.

  5. NSSDC data listing

    NASA Technical Reports Server (NTRS)

    Horowitz, Richard

    1991-01-01

    The purpose here is to identify, in a highly summarized way, data available from the National Space Science Data Center (NSSDC). Most data are maintained as offline data sets gathered from individual instruments carried on spacecraft; these comprise the Satellite Data Listing. Descriptive names, time spans, data form, and quality of these data sets are identified in the listing, which is sorted alphabetically, first by spacecraft name and then by the principal investigator's or team leader's last name. Several data sets not associated with individual spaceflight instruments are identified in separate listings following the Satellite Data Listing. These include composite spacecraft data sets, ground based data, models, and computer routines. NSSDC also offers data via special services and systems in a number of areas, including the Astronomical Data Center, Coordinated Data Analysis Workshops, NASA Climate Data System, Pilot Land Data System, and Crustal Dynamics Data Information System.

  6. US GeoData Available Through the Internet

    USGS Publications Warehouse

    ,

    2000-01-01

    The U.S. Geological Survey (USGS) offers certain US GeoData data sets through the Internet. They can be retrieved using the World Wide Web or anonymous File Transfer Protocol (FTP). The data bases and their directory paths are as follows: * 1:24,000-scale digital line graph data in SDTS format (/pub/data/DLG/24K) * 1:2,000,000-scale digital line graph data in SDTS format (/pub/data/DLG/2M) * 1:100,000-scale digital line graph data (/pub/data/DLG/100K) * 1:100,000-scale land use and land cover data (/pub/data/LULC/100K) * 1:250,000-scale land use and land cover data (/pub/data/LULC/250K) * 1:24,000-scale digital elevation data (/pub/data/DEM/7.5min) * 1-degree digital elevation model data (/pub/data/DEM/250)
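
    A minimal Python sketch of the retrieval described above, using anonymous FTP and one of the listed directory paths. The host name below is a placeholder; the article gives only the directory paths, not the server address:

        from ftplib import FTP

        HOST = "ftp.example.usgs.gov"  # placeholder host; not given in the article

        with FTP(HOST) as ftp:
            ftp.login()                      # anonymous FTP, as described above
            ftp.cwd("/pub/data/DEM/7.5min")  # 1:24,000-scale digital elevation data
            names = ftp.nlst()               # list the available files
            with open(names[0], "wb") as fh:
                ftp.retrbinary(f"RETR {names[0]}", fh.write)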

  7. Environmental Data Store: A Web-Based System Providing Management and Exploitation for Multi-Data-Type Environmental Data

    NASA Astrophysics Data System (ADS)

    Ji, P.; Piasecki, M.

    2012-12-01

    With the rapid growth in data volumes, data diversity, and data demands from multi-disciplinary research efforts, data management and exploitation increasingly face significant challenges for the environmental science community. We describe the Environmental Data Store (EDS), a web-based, open source system we are developing to manage and exploit multi-data-type environmental data. EDS provides repository services for six fundamental data types, which meet the demands of multi-disciplinary environmental research: a) time series data, b) geospatial data, c) digital data, d) ex-situ sampling data, e) modeling data, and f) raster data. Through its data portal, EDS allows efficient consumption of these six types of data placed in the data pool, which is made up of different data nodes corresponding to the different data types, including iRODS, ODM, THREDDS, ESSDB, and GeoServer. The EDS data portal offers a unified submission interface for the above data types; provides fully integrated, scalable search across content from the above data systems; and also features mapping, analysis, exporting, and visualization through integration with other software. EDS builds on a number of existing systems, follows widely used data standards, and highlights thematic, semantic, and syntactic support for submission and search, in order to advance multi-disciplinary environmental research. The system will be installed and developed at the CrossRoads initiative at the City College of New York.
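
    A sketch of the data-pool routing implied above. The abstract names both the six data types and the backend systems but not the exact pairing, so the mapping below is illustrative only:

        # Illustrative mapping of EDS data types to backend data nodes.
        DATA_POOL = {
            "time_series": "ODM",          # Observations Data Model
            "geospatial": "GeoServer",
            "digital": "iRODS",
            "ex_situ_sampling": "ESSDB",
            "modeling": "THREDDS",
            "raster": "THREDDS",
        }

        def node_for(data_type: str) -> str:
            """Return the data node responsible for storing a given data type."""
            return DATA_POOL[data_type]

        print(node_for("geospatial"))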

  8. Alternative Fuels Data Center: Data Downloads

    Science.gov Websites


  9. US GeoData Available Through the Internet

    USGS Publications Warehouse

    ,

    2000-01-01

    The U.S. Geological Survey (USGS) offers certain US GeoData data sets through the Internet. They can be retrieved using the World Wide Web or anonymous File Transfer Protocol (FTP). The data bases and their directory paths are as follows: * 1:24,000-scale digital line graph data in SDTS format (/pub/data/DLG/24K) * 1:2,000,000-scale digital line graph data in SDTS format (/pub/data/DLG/2M) * 1:100,000-scale digital line graph data (/pub/data/DLG/100K) * 1:100,000-scale land use and land cover data (/pub/data/LULC/100K) * 1:250,000-scale land use and land cover data (/pub/data/LULC/250K) * 1-degree digital elevation model data (/pub/data/DEM/250)

  10. Datalist: A Value Added Service to Enable Easy Data Selection

    NASA Technical Reports Server (NTRS)

    Li, Angela; Hegde, Mahabaleshwa; Bryant, Keith; Seiler, Edward; Shie, Chung-Lin; Teng, William; Liu, Zhong; Hearty, Thomas; Shen, Suhung; Kempler, Steven

    2016-01-01

    Imagine a user wanting to study hurricane events. This could involve searching and downloading multiple data variables from multiple data sets. The currently available services from the Goddard Earth Sciences Data and Information Services Center (GES DISC) only allow the user to select one data set at a time. The GES DISC started a Data List initiative, in order to enable users to easily select multiple data variables. A Data List is a collection of predefined or user-defined data variables from one or more archived data sets. Target users of Data Lists include science teams, individual science researchers, application users, and educational users. Data Lists are more than just data. Data Lists effectively provide users with a sophisticated integrated data and services package, including metadata, citation, documentation, visualization, and data-specific services, all available from one-stop shopping. Data Lists are created based on the software architecture of the GES DISC Unified User Interface (UUI). The Data List service is completely data-driven, and a Data List is treated just as any other data set. The predefined Data Lists, created by the experienced GES DISC science support team, should save a significant amount of time that users would otherwise have to spend.

  11. DataONE: A Distributed Environmental and Earth Science Data Network Supporting the Full Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Cook, R.; Michener, W.; Vieglais, D.; Budden, A.; Koskela, R.

    2012-04-01

    Addressing grand environmental science challenges requires unprecedented access to easily understood data that cross the breadth of temporal, spatial, and thematic scales. Tools are needed to plan management of the data, discover the relevant data, integrate heterogeneous and diverse data, and convert the data to information and knowledge. Addressing these challenges requires new approaches for the full data life cycle of managing, preserving, sharing, and analyzing data. DataONE (Observation Network for Earth) represents a virtual organization that enables new science and knowledge creation through preservation and access to data about life on Earth and the environment that sustains it. The DataONE approach is to improve data collection and management techniques; facilitate easy, secure, and persistent storage of data; continue to increase access to data and tools that improve data interoperability; disseminate integrated and user-friendly tools for data discovery and novel analyses; work with researchers to build intuitive data exploration and visualization tools; and support communities of practice via education, outreach, and stakeholder engagement.

  12. Challenges in sharing of geospatial data by data custodians in South Africa

    NASA Astrophysics Data System (ADS)

    Kay, Sissiel E.

    2018-05-01

    As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, which has as a key objective the sharing of geospatial data. The collection and maintenance of geospatial data is expensive and time consuming, and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa, the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently, data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons for this are. Through an international comparative assessment, it then determines whether compliance with the SDI Act is too onerous for the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.

  13. What Does it Mean to Publish Data in Earth System Science Data Journal?

    NASA Astrophysics Data System (ADS)

    Carlson, D.; Pfeiffenberger, H.

    2015-12-01

    The availability of more than 120 data sets in ESSD represents an unprecedented effort by providers, data centers and ESSD. ESSD data sets and their accompanying data descriptions undergo rigorous review. The data sets reside at any of more than 20 cooperating data centers. The ESSD publication process depends on but challenges the concepts of digital object identification and exacerbates the varied interpretations of the phrase 'data publication'. ESSD adopts the digital object identifier (doi). Key questions apply to doi's and other identifiers. How will persistent identifiers point accurately to distributed or replicated data? How should data centers and data publishers use identifier technologies to ensure authenticity and integrity? Should metadata associated with identifiers distinguish among raw, quality controlled and derived data processing levels, or indicate license or copyright status? Data centers publish data sets according to internal metadata standards but without indicators of quality control. Publication in this sense indicates availability. National data portals compile, serve and publish data products as a service to national researchers and, often, to meet national requirements. Publication in this second case indicates availability in a national context; the data themselves may still reside at separate data centers. Data journals such as ESSD or Scientific Data publish peer-reviewed, quality controlled data sets. These data sets almost always reside at a separate data center - the journal and the center maintain explicit identifier linkages. Data journals add quality to the feature of availability. A single data set processed through these layers will generate three independent doi's, but the doi's will provide little information about availability or quality. Could the data world learn from the URL world to consider additions? Suffixes? Could we use our experience with processing levels or data maturity to propose and agree on such extensions?

  14. Web-based data acquisition and management system for GOSAT validation Lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra N.; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2012-11-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis is developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). The DAS, written in Perl, acquires AMeDAS ground-level meteorological data, rawinsonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data, and GOSAT validation lidar data. The DMS, written in PHP, displays satellite-pass dates and all acquired data.

  15. Understanding the Data Complexity continuum to reduce data management costs and increase data usability through partnerships with the National Centers for Environmental Information

    NASA Astrophysics Data System (ADS)

    Mesick, S.; Weathers, K. W.

    2017-12-01

    Data complexity can be seen as a continuum from complex to simple. The term data complexity refers to data collections that are disorganized, poorly documented, and generally do not follow best data management practices. Complex data collections are challenging and expensive to manage. Simplified collections readily support automated archival processes, enhanced discovery and data access, as well as production of services that make data easier to reuse. In this session, NOAA NCEI scientific data stewards will discuss the data complexity continuum. This talk will explore data simplification concepts, methods, and tools that data managers can employ which may offer more control over data management costs and processes, while achieving policy goals for open data access and ready reuse. Topics will include guidance for data managers on best allocation of limited data management resources; models for partnering with NCEI to accomplish shared data management goals; and will demonstrate through case studies the benefits of investing in documentation, accessibility, and services to increase data value and return on investment.

  16. Definitions of components of the master water data index maintained by the National Water Data Exchange

    USGS Publications Warehouse

    Perry, R.A.; Williams, O.O.

    1982-01-01

    The Master Water Data Index is a computerized data base developed and maintained by the National Water Data Exchange (NAWDEX). The Index contains information about water-data collection sites. This information includes: the identification of new sites for which water data are available, the locations of these sites, the type of site, the data-collection organization, the types of data available, the major water-data parameters for which data are available, the frequency at which these parameters are measured, the period of time for which data are available, and the media in which the data are stored. This document, commonly referred to as the MWDI data dictionary, contains a definition and description of each component of the Master Water Data Index data base. (USGS)

  17. Environmental System Science Data Infrastructure for a Virtual Ecosystem (ESS-DIVE) - A New U.S. DOE Data Archive

    NASA Astrophysics Data System (ADS)

    Agarwal, D.; Varadharajan, C.; Cholia, S.; Snavely, C.; Hendrix, V.; Gunter, D.; Riley, W. J.; Jones, M.; Budden, A. E.; Vieglais, D.

    2017-12-01

    The ESS-DIVE archive is a new U.S. Department of Energy (DOE) data archive designed to provide long-term stewardship and use of data from observational, experimental, and modeling activities in the earth and environmental sciences. The ESS-DIVE infrastructure is constructed with the long-term vision of enabling broad access to and usage of the DOE-sponsored data stored in the archive. It is designed as a scalable framework that incentivizes data providers to contribute well-structured, high-quality data to the archive and that enables the user community to easily build data processing, synthesis, and analysis capabilities using those data. The key innovations in our design include: (1) application of user-experience research methods to understand the needs of users and data contributors; (2) support for early data archiving during project data QA/QC and before public release; (3) focus on implementation of data standards in collaboration with the community; (4) support for community-built tools for data search, interpretation, analysis, and visualization; (5) a data fusion database to support search of the data extracted from submitted packages and of data available in partner data systems such as the Earth System Grid Federation (ESGF) and DataONE; and (6) support for archiving of data packages that are not to be released to the public. ESS-DIVE data contributors will be able to archive and version their data and metadata, obtain data DOIs, search for and access ESS data and metadata via web and programmatic portals, and provide data and metadata in standardized forms. The ESS-DIVE archive and catalog will be federated with other existing catalogs, allowing cross-catalog metadata search and data exchange with existing systems, including DataONE's Metacat search. ESS-DIVE is operated by a multidisciplinary team from Berkeley Lab, the National Center for Ecological Analysis and Synthesis (NCEAS), and DataONE. The primary data copies are hosted at DOE's NERSC supercomputing facility, with replicas at DataONE nodes.

  18. Liquid cooled data center design selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chainer, Timothy J.; Iyengar, Madhusudan K.; Parida, Pritish R.

    Input data, specifying aspects of a thermal design of a liquid cooled data center, is obtained. The input data includes data indicative of ambient outdoor temperature for a location of the data center; and/or data representing workload power dissipation for the data center. The input data is evaluated to obtain performance of the data center thermal design. The performance includes cooling energy usage; and/or one pertinent temperature associated with the data center. The performance of the data center thermal design is output.
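
    The patent abstract describes only the interface of the evaluation step (inputs: ambient outdoor temperature and workload power dissipation; outputs: cooling energy usage and a pertinent temperature). The sketch below is a made-up free-cooling toy model with illustrative coefficients, not the patented method:

        def evaluate_design(ambient_c: float, workload_kw: float) -> dict:
            """Toy thermal-design evaluation; all coefficients are illustrative."""
            approach_c = 4.0                          # heat-exchanger approach temp
            coolant_supply_c = ambient_c + approach_c
            pump_fan_kw = 0.03 * workload_kw          # scales with rejected heat
            # Assume a chiller is needed once free cooling cannot hold 30 C.
            chiller_kw = 0.25 * workload_kw if coolant_supply_c > 30.0 else 0.0
            return {
                "cooling_energy_kw": pump_fan_kw + chiller_kw,
                "coolant_supply_c": coolant_supply_c,
            }

        print(evaluate_design(ambient_c=18.0, workload_kw=500.0))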

  19. Geodynamics branch data base for main magnetic field analysis

    NASA Technical Reports Server (NTRS)

    Langel, Robert A.; Baldwin, R. T.

    1991-01-01

    The data sets used in geomagnetic field modeling at GSFC are described. Data are measured and obtained from a variety of sources. For clarity, data sets from different sources are categorized and processed separately. The data base is composed of magnetic observatory data, surface data, high-quality aeromagnetic data, high-quality total-intensity marine data, satellite data, and repeat data. These individual data categories are described in detail in a series of notebooks in the Geodynamics Branch, GSFC. This catalog reviews the original data sets, the processing history, and the final data sets available for each individual category of the data base, and is to be used as a reference manual for the notebooks. Each data type used in geomagnetic field modeling has varying levels of complexity, requiring specialized processing routines for satellite and observatory data and two general routines for processing aeromagnetic, marine, land survey, and repeat data.

  20. Trends in Planetary Data Analysis. Executive summary of the Planetary Data Workshop

    NASA Technical Reports Server (NTRS)

    Evans, N.

    1984-01-01

    Planetary data include non-imaging remote sensing data: spectrometric, radiometric, and polarimetric remote sensing observations. Also included are in-situ data, radio/radar data, and Earth-based observations. The development of a planetary data system is also discussed. A catalog to identify observations will be the initial entry point for all levels of users into the data system. There are seven distinct data support services: encyclopedia, data index, data inventory, browse, search, sample, and acquire. Data systems for planetary science users must provide access to data and must process, store, and display data. Two standards will be incorporated into the planetary data system: a standard communications protocol and a standard format data unit. The data system configuration must combine the features of a distributed system with those of a centralized system. Fiscal constraints have made prioritization important. Activities include saving previous mission data, planning/cost analysis, and publishing of proceedings.

  1. Data sharing platforms for de-identified data from human clinical trials.

    PubMed

    Huser, Vojtech; Shmueli-Blumberg, Dikla

    2018-04-01

    Data sharing of de-identified individual participant data is being adopted by an increasing number of sponsors of human clinical trials. In addition to standardizing data syntax for shared trial data, semantic integration of various data elements is the focus of several initiatives that define research common data elements. This perspective article, in the first part, compares several data sharing platforms for de-identified clinical research data in terms of their size, policies and supported features. In the second part, we use a case study approach to describe in greater detail one data sharing platform (Data Share, from the National Institute on Drug Abuse). We present data on the past use of the platform, data formats offered, data de-identification approaches and its use of research common data elements. We conclude with a summary of current and expected future trends that facilitate secondary research use of data from completed human clinical trials.

  2. Development and Implementation of Production Area of Agricultural Product Data Collection System Based on Embedded System

    NASA Astrophysics Data System (ADS)

    Xi, Lei; Guo, Wei; Che, Yinchao; Zhang, Hao; Wang, Qiang; Ma, Xinming

    To solve problems in detecting the origin of agricultural products, this paper presents an embedded data-collection terminal, applies middleware thinking, and provides a reusable long-range two-way data exchange module between business equipment and data acquisition systems. The system is constructed from data collection nodes and data center nodes. A data collection node takes the embedded data terminal NetBoxII as its core and consists of a data acquisition interface layer, a controlling information layer, and a data exchange layer; it reads data from different front-end acquisition devices and packs the data into TCP packets to exchange data with data center nodes over the physical link (GPRS/CDMA/Ethernet). A data center node consists of a data exchange layer, a data persistence layer, and a business interface layer, which make the collected data persistent and provide standardized data for business systems based on the mapping relationship between collected data and business data. Relying on public communications networks, the system establishes a flow of information between the scene of origin certification and the management center, realizes real-time collection, storage, and processing between origin-certification scene data and the certification organization's databases, and meets the needs of long-range detection of agricultural origin.
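
    The paper's wire format is not specified; a minimal sketch of packing one batch of readings into a TCP exchange is shown below. The length-prefixed JSON framing and all field names are assumptions for illustration, not the system's actual protocol:

        import json
        import socket
        import struct

        def send_readings(host: str, port: int, node_id: str, readings: dict) -> None:
            """Frame one batch of readings as a 4-byte length prefix + JSON body."""
            body = json.dumps({"node": node_id, "data": readings}).encode("utf-8")
            frame = struct.pack(">I", len(body)) + body
            with socket.create_connection((host, port), timeout=10) as sock:
                sock.sendall(frame)

        # e.g. a collection node forwarding a batch over GPRS/CDMA/Ethernet:
        # send_readings("datacenter.example.org", 9000, "NetBoxII-17",
        #               {"product": "wheat", "plot": "A3", "moisture_pct": 11.2})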

  3. The creation, management, and use of data quality information for life cycle assessment.

    PubMed

    Edelen, Ashley; Ingwersen, Wesley W

    2018-04-01

    Despite growing access to data, questions of "best fit" data and the appropriate use of results in supporting decision making still plague the life cycle assessment (LCA) community. This discussion paper addresses revisions to assessing data quality captured in a new US Environmental Protection Agency guidance document as well as additional recommendations on data quality creation, management, and use in LCA databases and studies. Existing data quality systems and approaches in LCA were reviewed and tested. The evaluations resulted in a revision to a commonly used pedigree matrix, for which flow and process level data quality indicators are described, more clarity for scoring criteria, and further guidance on interpretation are given. Increased training for practitioners on data quality application and its limits are recommended. A multi-faceted approach to data quality assessment utilizing the pedigree method alongside uncertainty analysis in result interpretation is recommended. A method of data quality score aggregation is proposed and recommendations for usage of data quality scores in existing data are made to enable improved use of data quality scores in LCA results interpretation. Roles for data generators, data repositories, and data users are described in LCA data quality management. Guidance is provided on using data with data quality scores from other systems alongside data with scores from the new system. The new pedigree matrix and recommended data quality aggregation procedure can now be implemented in openLCA software. Additional ways in which data quality assessment might be improved and expanded are described. Interoperability efforts in LCA data should focus on descriptors to enable user scoring of data quality rather than translation of existing scores. Developing and using data quality indicators for additional dimensions of LCA data, and automation of data quality scoring through metadata extraction and comparison to goal and scope are needed.
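
    The aggregation method itself is not spelled out in the abstract; one plausible reading (a sketch, not the paper's algorithm) is a contribution-weighted average of flow-level pedigree scores rolled up to the process level:

        def aggregate_scores(flows: list) -> float:
            """Contribution-weighted average of pedigree scores (1 = best, 5 = worst).

            flows: [{"score": int, "contribution": float}, ...] where contribution
            is the flow's share of the result being scored.
            """
            total = sum(f["contribution"] for f in flows)
            return sum(f["score"] * f["contribution"] for f in flows) / total

        flows = [
            {"score": 2, "contribution": 0.70},  # dominant input, good pedigree
            {"score": 4, "contribution": 0.30},  # minor input, weaker pedigree
        ]
        print(round(aggregate_scores(flows), 2))  # -> 2.6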

  4. Early Citability of Data vs Peer-Review like Data Publishing Procedures

    NASA Astrophysics Data System (ADS)

    Stockhause, Martina; Höck, Heinke; Toussaint, Frank; Lautenschlager, Michael

    2014-05-01

    The World Data Center for Climate (WDCC), hosted at the German Climate Computing Center (DKRZ), was one of the first data centers to establish a peer-review-like data publication procedure resulting in DataCite DOIs. Data in the long-term archive (LTA) is diligently reviewed by data managers and data authors to ensure high quality and wide reusability of the published data. This traditional data publication procedure for LTA data bearing DOIs is very time consuming, especially for WDCC's high volumes of climate model data, on the order of multiple TBytes. Data is shared with project members and selected scientists months before the data is long-term archived. The scientific community analyses and thus reviews the data, leading to data quality improvements. Scientists wish to cite these unstable data in scientific publications before the long-term archiving and the thorough data review process are finalized. A concept for early preprint DOIs for shared but not yet long-term archived data is presented. Requirements on data documentation, persistence and quality, and use cases for preprint DOIs within the data life-cycle, are discussed, as well as questions of how to document the differences between the two DOI types and how to relate them to each other, with the recommendation to use LTA DOIs in citations. WDCC wants to offer an additional user service for early citation of data of basic quality without compromising the LTA DOIs, i.e. WDCC's standard DOIs, as a trustworthy indicator of high-quality data. Referencing Links: World Data Center for Climate (WDCC): http://www.wdc-climate.de German Climate Computing Center (DKRZ): http://www.dkrz.de DataCite: http://datacite.org

  5. Data Citation Concept for CMIP6

    NASA Astrophysics Data System (ADS)

    Stockhause, M.; Toussaint, F.; Lautenschlager, M.; Lawrence, B.

    2015-12-01

    There is a broad consensus among data centers and scientific publishers on Force 11's 'Joint Declaration of Data Citation Principles'. Putting these principles into operation, however, is not always straightforward. The focus for CMIP6 data citations lies on the citation of data created by others and used in an analysis underlying an article, and for this source data usually no article by the data creators is available ('stand-alone data publication'). The planned data citation granularities are model data (data collections containing all datasets provided for the project by a single model) and experiment data (data collections containing all datasets for a scientific experiment run by a single model). In the case of large international projects or activities like CMIP, the data is commonly stored and disseminated by multiple repositories in a federated data infrastructure such as the Earth System Grid Federation (ESGF). The individual repositories are subject to different institutional and national policies. A Data Management Plan (DMP) will define a certain standard for the repositories, including data handling procedures. Another aspect of CMIP data relevant for data citations is its dynamic nature. For such large data collections, datasets are added, revised and retracted for years before the data collection becomes stable for a data citation entity including all model or simulation data. Thus, a critical issue for ESGF is data consistency, requiring thorough dataset versioning to enable the identification of the data collection in the cited version. Currently, the ESGF is designed for accessing the latest dataset versions. Data citation introduces the necessity to support older and retracted dataset versions by storing metadata even beyond data availability (data unpublished in ESGF). Apart from ESGF, other infrastructure components exist for CMIP which provide information that has to be connected to the CMIP6 data, e.g. ES-DOC, providing information on models and simulations, and the IPCC Data Distribution Centre (DDC), storing a subset of data together with available metadata (ES-DOC) for the long-term reuse of the interdisciplinary community. Other connections exist to standard project vocabularies, to personal identifiers (e.g. ORCID), or to data products (including provenance information).

  6. Critique and Contribute: A Practice-Based Framework for Improving Critical Data Studies and Data Science.

    PubMed

    Neff, Gina; Tanweer, Anissa; Fiore-Gartland, Brittany; Osburn, Laura

    2017-06-01

    What would data science look like if its key critics were engaged to help improve it, and how might critiques of data science improve with an approach that considers the day-to-day practices of data science? This article argues for scholars to bridge the conversations that seek to critique data science and those that seek to advance data science practice to identify and create the social and organizational arrangements necessary for a more ethical data science. We summarize four critiques that are commonly made in critical data studies: data are inherently interpretive, data are inextricable from context, data are mediated through the sociomaterial arrangements that produce them, and data serve as a medium for the negotiation and communication of values. We present qualitative research with academic data scientists, "data for good" projects, and specialized cross-disciplinary engineering teams to show evidence of these critiques in the day-to-day experience of data scientists as they acknowledge and grapple with the complexities of their work. Using ethnographic vignettes from two large multiresearcher field sites, we develop a set of concepts for analyzing and advancing the practice of data science and improving critical data studies, including (1) communication is central to the data science endeavor; (2) making sense of data is a collective process; (3) data are starting, not end points, and (4) data are sets of stories. We conclude with two calls to action for researchers and practitioners in data science and critical data studies alike. First, creating opportunities for bringing social scientific and humanistic expertise into data science practice simultaneously will advance both data science and critical data studies. Second, practitioners should leverage the insights from critical data studies to build new kinds of organizational arrangements, which we argue will help advance a more ethical data science. Engaging the insights of critical data studies will improve data science. Careful attention to the practices of data science will improve scholarly critiques. Genuine collaborative conversations between these different communities will help push for more ethical, and better, ways of knowing in increasingly datum-saturated societies.

  7. Critique and Contribute: A Practice-Based Framework for Improving Critical Data Studies and Data Science

    PubMed Central

    Neff, Gina; Tanweer, Anissa; Fiore-Gartland, Brittany; Osburn, Laura

    2017-01-01

    What would data science look like if its key critics were engaged to help improve it, and how might critiques of data science improve with an approach that considers the day-to-day practices of data science? This article argues for scholars to bridge the conversations that seek to critique data science and those that seek to advance data science practice to identify and create the social and organizational arrangements necessary for a more ethical data science. We summarize four critiques that are commonly made in critical data studies: data are inherently interpretive, data are inextricable from context, data are mediated through the sociomaterial arrangements that produce them, and data serve as a medium for the negotiation and communication of values. We present qualitative research with academic data scientists, “data for good” projects, and specialized cross-disciplinary engineering teams to show evidence of these critiques in the day-to-day experience of data scientists as they acknowledge and grapple with the complexities of their work. Using ethnographic vignettes from two large multiresearcher field sites, we develop a set of concepts for analyzing and advancing the practice of data science and improving critical data studies, including (1) communication is central to the data science endeavor; (2) making sense of data is a collective process; (3) data are starting, not end points, and (4) data are sets of stories. We conclude with two calls to action for researchers and practitioners in data science and critical data studies alike. First, creating opportunities for bringing social scientific and humanistic expertise into data science practice simultaneously will advance both data science and critical data studies. Second, practitioners should leverage the insights from critical data studies to build new kinds of organizational arrangements, which we argue will help advance a more ethical data science. Engaging the insights of critical data studies will improve data science. Careful attention to the practices of data science will improve scholarly critiques. Genuine collaborative conversations between these different communities will help push for more ethical, and better, ways of knowing in increasingly datum-saturated societies. PMID:28632445

  8. ACTS data center

    NASA Technical Reports Server (NTRS)

    Syed, Ali; Vogel, Wolfhard J.

    1993-01-01

    Viewgraphs on ACTS Data Center status report are included. Topics covered include: ACTS Data Center Functions; data flow overview; PPD flow; RAW data flow; data compression; PPD distribution; RAW Data Archival; PPD Audit; and data analysis.

  9. Digital Archive Issues from the Perspective of an Earth Science Data Producer

    NASA Technical Reports Server (NTRS)

    Barkstrom, Bruce R.

    2004-01-01

    Contents include the following: Introduction. A Producer Perspective on Earth Science Data. Data Producers as Members of a Scientific Community. Some Unique Characteristics of Scientific Data. Spatial and Temporal Sampling for Earth (or Space) Science Data. The Influence of the Data Production System Architecture. The Spatial and Temporal Structures Underlying Earth Science Data. Earth Science Data File (or Relation) Schemas. Data Producer Configuration Management Complexities. The Topology of Earth Science Data Inventories. Some Thoughts on the User Perspective. Science Data User Communities. Spatial and Temporal Structure Needs of Different Users. User Spatial Objects. Data Search Services. Inventory Search. Parameter (Keyword) Search. Metadata Searches. Documentation Search. Secondary Index Search. Print Technology and Hypertext. Inter-Data Collection Configuration Management Issues. An Archive View. Producer Data Ingest and Production. User Data Searching and Distribution. Subsetting and Supersetting. Semantic Requirements for Data Interchange. Tentative Conclusions. An Object Oriented View of Archive Information Evolution. Scientific Data Archival Issues. A Perspective on the Future of Digital Archives for Scientific Data. References Index for this paper.

  10. Data citation in climate sciences: Improvements in CMIP6 compared to CMIP5

    NASA Astrophysics Data System (ADS)

    Stockhause, M.; Lautenschlager, M.

    2017-12-01

Within CMIP5 (Coupled Model Intercomparison Project Phase 5), citation of the data was not possible prior to its long-term archival in the IPCC Data Distribution Centre (DDC). The Reference Data Archive for AR5 (Assessment Report 5) was built up after the submission deadline for part 1 of the AR5, which was too late for many scientific articles. But even the AR5 data in the IPCC DDC is rarely cited in the literature, in spite of annual download volumes between one and three PBytes. On the other hand, the request for a citation possibility for the evolving CMIP6 data prior to long-term archival came from the CMIP6 data providers. The additional provision of data citations for the project input4MIPs (input data for CMIP6) could raise scientists' awareness of the discrepancy between the readiness to cite data and the desire to be cited and get credit. The CMIP6 Citation Service is a pragmatic approach built on existing services and services under development, such as ESGF (Earth System Grid Federation) as the data infrastructure component, DataCite as the DOI registration agency, and Scholix services for tracking data usage information. Other principles followed to overcome barriers to data citation are: collecting data and literature references in the data citation metadata to enable data-data and data-literature interlinking; making data citation information visible in the ESGF data portals (a low barrier to accessing data citation information); and providing data usage information in the literature for the data providers, data node managers, and their funders (requested by some ESGF data node managers). The CMIP6 Citation Service implements only the credit part of the RDA WGDC recommendation for the citation of dynamic data. The second part, the identification of the data subset underlying an article, is planned for CMIP7 as a data cart approach comprising multiple pre-defined CMIP6 DataCite DOIs. Additional policies on long-term data availability are required. Reference: M. Stockhause and M. Lautenschlager (2017). CMIP6 Data Citation of Evolving Data. Data Science Journal, 16, p. 30. https://doi.org/10.5334/dsj-2017-030. http://cmip6cite.wdc-climate.de

  11. IEDA: Making Small Data BIG Through Interdisciplinary Partnerships Among Long-tail Domains

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V. L.; Hsu, L.; Song, L.; Ghiorso, M. S.; Walker, D. J.

    2014-12-01

    The Big Data world in the Earth Sciences so far exists primarily for disciplines that generate massive volumes of observational or computed data using large-scale, shared instrumentation such as global sensor networks, satellites, or high-performance computing facilities. These data are typically managed and curated by well-supported community data facilities that also provide the tools for exploring the data through visualization or statistical analysis. In many other domains, especially those where data are primarily acquired by individual investigators or small teams (known as 'Long-tail data'), data are poorly shared and integrated, lacking a community-based data infrastructure that ensures persistent access, quality control, standardization, and integration of data, as well as appropriate tools to fully explore and mine the data within the context of broader Earth Science datasets. IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded by the US NSF to develop and operate data services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds on a strong foundation of mature disciplinary data systems for marine geology and geophysics, geochemistry, and geochronology. These systems have dramatically advanced data resources in those long-tail Earth science domains. IEDA has strengthened these resources by establishing a consolidated, enterprise-grade infrastructure that is shared by the domain-specific data systems, and implementing joint data curation and data publication services that follow community standards. In recent years, other domain-specific data efforts have partnered with IEDA to take advantage of this infrastructure and improve data services to their respective communities with formal data publication, long-term preservation of data holdings, and better sustainability. IEDA hopes to foster such partnerships with streamlined data services, including user-friendly, single-point interfaces for data submission, discovery, and access across the partner systems to support interdisciplinary science.

  12. Data governance in predictive toxicology: A review.

    PubMed

    Fu, Xin; Wojak, Anna; Neagu, Daniel; Ridley, Mick; Travis, Kim

    2011-07-13

Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes to guarantee high data quality and better data management. In this paper, data quality mainly refers to quality in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) of predictive toxicology data source development. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. In this paper, data governance is identified as the new challenge in predictive toxicology, and good use of it may provide a promising framework for developing high quality and easily accessible toxicity data repositories. This paper also identifies important research directions that require further investigation in this area.

  13. Data governance in predictive toxicology: A review

    PubMed Central

    2011-01-01

Background Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes to guarantee high data quality and better data management. In this paper, data quality mainly refers to quality in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). Results This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) of predictive toxicology data source development. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. Conclusions While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. In this paper, data governance is identified as the new challenge in predictive toxicology, and good use of it may provide a promising framework for developing high quality and easily accessible toxicity data repositories. This paper also identifies important research directions that require further investigation in this area. PMID:21752279

  14. Data Overview: Overview of an Epidemic

    MedlinePlus

  15. Data Resources | Geospatial Data Science | NREL

    Science.gov Websites

Datasets for a variety of renewable energy technologies, designed to be used in GIS software applications: biomass data, geothermal data, hydrogen data, marine and hydrokinetic data, solar data, and wind data.

  16. Analysis Resistant Cipher Method and Apparatus

    NASA Technical Reports Server (NTRS)

    Oakley, Ernest C. (Inventor)

    2009-01-01

    A system for encoding and decoding data words including an anti-analysis encoder unit for receiving an original plaintext and producing a recoded data, a data compression unit for receiving the recoded data and producing a compressed recoded data, and an encryption unit for receiving the compressed recoded data and producing an encrypted data. The recoded data has an increased non-correlatable data redundancy compared with the original plaintext in order to mask the statistical distribution of characters in the plaintext data. The system of the present invention further includes a decryption unit for receiving the encrypted data and producing a decrypted data, a data decompression unit for receiving the decrypted data and producing an uncompressed recoded data, and an anti-analysis decoder unit for receiving the uncompressed recoded data and producing a recovered plaintext that corresponds with the original plaintext.
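
    A minimal Python sketch of the pipeline this patent describes (recode, then compress, then encrypt, and the reverse path). The homophonic recoding table and the one-time-pad XOR below are illustrative stand-ins chosen for brevity, not the patented encoder or encryption unit:

        import os
        import random
        import zlib

        random.seed(7)
        # Anti-analysis recoding: give every byte value four random 2-byte codes
        # (homophones), so single-byte frequencies no longer show in the output.
        all_codes = random.sample(range(65536), 256 * 4)
        ENCODE = {b: [all_codes[4 * b + k].to_bytes(2, "big") for k in range(4)]
                  for b in range(256)}
        DECODE = {code: b for b, codes in ENCODE.items() for code in codes}

        def recode(plaintext: bytes) -> bytes:
            return b"".join(random.choice(ENCODE[b]) for b in plaintext)

        def decode(recoded: bytes) -> bytes:
            return bytes(DECODE[recoded[i:i + 2]] for i in range(0, len(recoded), 2))

        msg = b"attack at dawn"
        recoded = recode(msg)                 # redundancy up, correlation down
        compressed = zlib.compress(recoded)   # squeeze the added bytes back out
        key = os.urandom(len(compressed))     # stand-in for the encryption unit
        encrypted = bytes(a ^ b for a, b in zip(compressed, key))

        # Reverse path: decrypt, decompress, decode.
        decrypted = bytes(a ^ b for a, b in zip(encrypted, key))
        assert decode(zlib.decompress(decrypted)) == msg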

  17. Data Model Performance in Data Warehousing

    NASA Astrophysics Data System (ADS)

    Rorimpandey, G. C.; Sangkop, F. I.; Rantung, V. P.; Zwart, J. P.; Liando, O. E. S.; Mewengkang, A.

    2018-02-01

Data warehouses have become increasingly important in organizations that have large amounts of data. A data warehouse is not a product but part of a solution for the decision support system in those organizations. The data model is the starting point for designing and developing data warehouse architectures; thus, the data model needs stable and consistent interfaces for a longer period of time. The aim of this research is to determine which data model in data warehousing has the best performance. The research method is descriptive analysis, with three main tasks: data collection and organization, analysis of data, and interpretation of data. The results, examined with a statistical analysis method, show that there is no statistical difference among the data models used in data warehousing. Organizations can therefore utilize any of the four data models proposed when designing and developing a data warehouse.

  18. Pricing and disseminating customer data with privacy awareness.

    PubMed

    Li, Xiao-Bai; Raghunathan, Srinivasan

    2014-03-01

Organizations today regularly share their customer data with their partners to gain competitive advantages. They are also often requested or even required by a third party to provide customer data that are deemed sensitive. In these circumstances, organizations are obligated to protect the privacy of the individuals involved while still benefiting from sharing data or meeting the requirement for releasing data. In this study, we analyze the tradeoff between privacy and data utility from the perspective of the data owner. We develop an incentive-compatible mechanism for the data owner to price and disseminate private data. With this mechanism, a data user is motivated to reveal his true purpose of data usage and acquire the data that suits that purpose. Existing economic studies of information privacy primarily consider the interplay between the data owner and the individuals, focusing on problems that occur in the collection of private data. This study, however, examines the privacy issue facing a data owner organization in the distribution of private data to a third party data user when the real purpose of data usage is unclear and the released data could be misused.

  19. Pricing and disseminating customer data with privacy awareness

    PubMed Central

    Raghunathan, Srinivasan

    2014-01-01

Organizations today regularly share their customer data with their partners to gain competitive advantages. They are also often requested or even required by a third party to provide customer data that are deemed sensitive. In these circumstances, organizations are obligated to protect the privacy of the individuals involved while still benefiting from sharing data or meeting the requirement for releasing data. In this study, we analyze the tradeoff between privacy and data utility from the perspective of the data owner. We develop an incentive-compatible mechanism for the data owner to price and disseminate private data. With this mechanism, a data user is motivated to reveal his true purpose of data usage and acquire the data that suits that purpose. Existing economic studies of information privacy primarily consider the interplay between the data owner and the individuals, focusing on problems that occur in the collection of private data. This study, however, examines the privacy issue facing a data owner organization in the distribution of private data to a third party data user when the real purpose of data usage is unclear and the released data could be misused. PMID:24839337

  20. Data, Analysis, and Visualization | Computational Science | NREL

    Science.gov Websites

Data management, data analysis, and scientific visualization. At NREL, our data management, data analysis, and scientific visualization capabilities help move research forward, from approaches to image analysis and computer vision to systems, software, and tools for data management and big data.

  1. Sharing Responsibility for Data Stewardship Between Scientists and Curators

    NASA Astrophysics Data System (ADS)

    Hedstrom, M. L.

    2012-12-01

    Data stewardship is becoming increasingly important to support accurate conclusions from new forms of data, integration of and computation across heterogeneous data types, interactions between models and data, replication of results, data governance and long-term archiving. In addition to increasing recognition of the importance of data management, data science, and data curation by US and international scientific agencies, the National Academies of Science Board on Research Data and Information is sponsoring a study on Data Curation Education and Workforce Issues. Effective data stewardship requires a distributed effort among scientists who produce data, IT staff and/or vendors who provide data storage and computational facilities and services, and curators who enhance data quality, manage data governance, provide access to third parties, and assume responsibility for long-term archiving of data. The expertise necessary for scientific data management includes a mix of knowledge of the scientific domain; an understanding of domain data requirements, standards, ontologies and analytical methods; facility with leading edge information technology; and knowledge of data governance, standards, and best practices for long-term preservation and access that rarely are found in a single individual. Rather than developing data science and data curation as new and distinct occupations, this paper examines the set of tasks required for data stewardship. The paper proposes an alternative model that embeds data stewardship in scientific workflows and coordinates hand-offs between instruments, repositories, analytical processing, publishers, distributors, and archives. This model forms the basis for defining knowledge and skill requirements for specific actors in the processes required for data stewardship and the corresponding educational and training needs.

  2. NSSDC data listing

    NASA Technical Reports Server (NTRS)

    Horowitz, Richard; King, Joseph H.

    1990-01-01

In a highly summarized way, data available from the National Space Science Data Center (NSSDC) is identified. Most data are offline data sets (on magnetic tape or as film/print products of various sizes) from individual instruments carried on spacecraft; these compose the Satellite Data Listing. Descriptive names, time spans, data form, and quantity of these data sets are identified in the listing, which is sorted alphabetically, first by spacecraft name and then by the principal investigator's or team leader's last name. Several data sets held at NSSDC, not associated with individual spaceflight instruments, are identified in separate listings following the Satellite Data Listing. These data sets make up the Supplementary Data Listings and include composite spacecraft data sets, ground-based data, models, and computer routines. The identifiers used in the Supplementary Data Listings were created by NSSDC and are explained in the pages preceding the listings. Data set form codes are listed. NSSDC offers primarily archival, retrieval, replication, and dissemination services associated with the data sets discussed in the two major listings identified above. NSSDC also provides documentation which enables the data recipient to use the data received. NSSDC is working toward expanding presently limited capabilities for data subsetting and for promotion of data files to online residence for user downloading. NSSDC data holdings span the range of scientific disciplines in which NASA is involved, and include astrophysics, lunar and planetary science, solar physics, space plasma physics, and Earth science. In addition to the functions mentioned above, NSSDC offers data via special services and systems in a number of areas, including the Astronomical Data Center (ADC), Coordinated Data Analysis Workshops (CDAWs), the NASA Climate Data System (NCDS), the Pilot Land Data System (PLDS), and the Crustal Dynamics Data Information System (CDDIS). Furthermore, NSSDC has a no-password account on its SPAN/Telenet-accessible VAX through which the NASA Master Directory and selected online data bases are accessible and through which any data described here may be ordered. Astrophysics data support by NSSDC is not limited to the ADC. Each of these special services/systems is described briefly.

  3. Big Data Analytics in Medicine and Healthcare.

    PubMed

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics of value, volume, velocity, variety, veracity and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  4. CIRSS vertical data integration, San Bernardino County study phases 1-A, 1-B

    NASA Technical Reports Server (NTRS)

    Christenson, J.; Michel, R. (Principal Investigator)

    1981-01-01

User needs, data types, data automation, and preliminary applications are described for an effort to assemble a single data base for San Bernardino County from data bases which exist at several administrative levels. Each of the data bases used was registered and converted to a grid-based data file at a resolution of 4 acres and used to create a multivariable data base for the entire study area. To this data base were added classified LANDSAT data from 1976 and 1979. The resulting data base thus integrated in a uniform format all of the separately automated data within the study area. Several possible interactions between existing geocoded data bases and LANDSAT data were tested. The use of LANDSAT data to update existing data bases is to be tested.

  5. Minimally buffered data transfers between nodes in a data communications network

    DOEpatents

    Miller, Douglas R.

    2015-06-23

Methods, apparatus, and products for minimally buffered data transfers between nodes in a data communications network are disclosed that include: receiving, by a messaging module on an origin node, a storage identifier, an origin data type, and a target data type, the storage identifier specifying application storage containing data, the origin data type describing a data subset contained in the origin application storage, the target data type describing an arrangement of the data subset in application storage on a target node; creating, by the messaging module, origin metadata describing the origin data type; selecting, by the messaging module from the origin application storage in dependence upon the origin metadata and the storage identifier, the data subset; and transmitting, by the messaging module to the target node, the selected data subset for storing in the target application storage in dependence upon the target data type without temporarily buffering the data subset.
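
    A rough Python/NumPy analogue of the transfer described above, with plain slice tuples standing in for the origin and target data types; this illustrates the minimally buffered idea only and is not the patented messaging module:

        import numpy as np

        origin_storage = np.arange(24).reshape(4, 6)      # application storage
        origin_dtype = (slice(0, 4), slice(0, 6, 2))      # "every other column"
        target_storage = np.zeros((4, 3), dtype=origin_storage.dtype)
        target_dtype = (slice(0, 4), slice(0, 3))         # contiguous target layout

        # Messaging module: select per the origin type, store per the target
        # type. The selection is a view, so the subset is never copied into an
        # intermediate buffer before the "transmit" step.
        subset = origin_storage[origin_dtype]
        target_storage[target_dtype] = subset             # "transmit" + store
        print(target_storage)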

  6. A Metadata Action Language

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Clancy, Dan (Technical Monitor)

    2001-01-01

The data management problem comprises data processing and data tracking. Data processing is the creation of new data based on existing data sources. Data tracking consists of storing metadata descriptions of available data. This paper addresses the data management problem by casting it as an AI planning problem. Actions are data-processing commands, plans are dataflow programs and goals are metadata descriptions of desired data products. Data manipulation is simply plan generation and execution, and a key component of data tracking is inferring the effects of an observed plan. We introduce a new action language for data management domains, called ADILM. We discuss the connection between data processing and information integration and show how a language for the latter must be modified to support the former. The paper also discusses information gathering within a data-processing framework, and shows how ADILM metadata expressions are a generalization of Local Completeness.

  7. Processing data base information having nonwhite noise

    DOEpatents

    Gross, Kenneth C.; Morreale, Patricia

    1995-01-01

A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can include processing data from either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
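
    The pipeline lends itself to a short sketch. The NumPy-only Python illustration below follows the steps named in the abstract (difference function, Fourier composite, residual, sequential probability ratio test); the AR(1) surrogate, the number of retained modes, and the SPRT parameters are all hypothetical choices, not values from the patent:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1024
        t = np.arange(n)

        # Two data sets for one physical variable: a "real" sensor signal and an
        # artificial signal generated by an autoregressive (here AR(1)) process.
        sensor = np.sin(2 * np.pi * t / 128) + 0.1 * rng.standard_normal(n)
        arma = np.empty(n)
        arma[0] = 0.0
        for i in range(1, n):                      # x_t = 0.9 x_{t-1} + noise
            arma[i] = 0.9 * arma[i - 1] + 0.1 * rng.standard_normal()
        arma += np.sin(2 * np.pi * t / 128)        # same underlying variable

        # Difference function data set: pointwise difference over time.
        diff = sensor - arma

        # Frequency-domain transform; keep the dominant Fourier modes as the
        # "composite function" carrying the serially correlated (nonwhite) part.
        spectrum = np.fft.rfft(diff)
        keep = np.argsort(np.abs(spectrum))[-8:]   # 8 largest modes (arbitrary)
        composite_spec = np.zeros_like(spectrum)
        composite_spec[keep] = spectrum[keep]
        composite = np.fft.irfft(composite_spec, n)

        # Residual function data set: difference minus composite, ideally white.
        residual = diff - composite

        # Sequential probability ratio test on the residual mean (H0: mean 0
        # vs. H1: mean m1), with illustrative error rates alpha and beta.
        def sprt(x, m1=0.05, alpha=0.01, beta=0.01):
            sigma = np.std(x)
            llr = np.cumsum((m1 / sigma**2) * (x - m1 / 2.0))
            upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
            if (llr > upper).any():
                return "reject H0 (degraded)"
            if (llr < lower).any():
                return "accept H0 (validated)"
            return "continue sampling"

        print(sprt(residual))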

  8. A Clinical Data Warehouse Based on OMOP and i2b2 for Austrian Health Claims Data.

    PubMed

    Rinner, Christoph; Gezgin, Deniz; Wendl, Christopher; Gall, Walter

    2018-01-01

To develop simulation models for healthcare-related questions, clinical data can be reused. The goal is to develop a clinical data warehouse that harmonizes different data sources in a standardized manner and provides a reproducible interface for clinical data reuse. The Kimball life cycle for data warehouse development was used; the development is split into the technical, data, and business intelligence pathways. Sample data were persisted in the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM). The i2b2 clinical data warehouse tools were used to query the OMOP CDM by applying the new i2b2 multi-fact table feature. A clinical data warehouse was set up, and sample data, data dimensions and ontologies for Austrian health claims data were created. The ability of the standardized data access layer to create and apply simulation models will be evaluated next.

  9. Data Recipes: Toward Creating How-To Knowledge Base for Earth Science Data

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Lynnes, Chris; Acker, James G.; Beaty, Tammy

    2015-01-01

Both the diversity and volume of Earth science data from satellites and numerical models are growing dramatically, due to an increasing population of measured physical parameters and an increasing variety of spatial and temporal resolutions for many data products. To further complicate matters, Earth science data delivered to data archive centers are commonly found in different formats and structures. NASA data centers, managed by the Earth Observing System Data and Information System (EOSDIS), have developed a rich and diverse set of data services and tools with features intended to simplify finding, downloading, and working with these data. Although most data services and tools have user guides, many users still experience difficulties with accessing or reading data due to varying levels of familiarity with data services, tools, and/or formats. The data recipe project at the Goddard Earth Science Data and Information Services Center (GES DISC) was initiated in late 2012 to enhance user support. A data recipe is an online how-to explanatory document, with step-by-step instructions and examples of accessing and working with real data (http://disc.sci.gsfc.nasa.gov/recipes). The current suite of recipes has been found to be very helpful, especially to first-time users of particular data services, tools, or data products. Online traffic to the data recipe pages is significant, even though the data recipe topics are still limited. An Earth Science Data System Working Group (ESDSWG) for data recipes was established in the spring of 2014, aimed at initiating an EOSDIS-wide campaign for leveraging the distributed knowledge within EOSDIS and its user communities regarding their respective services and tools. The ESDSWG data recipe group is working on an inventory and analysis of existing data recipes and tutorials, and will provide guidelines and recommendations for writing and grouping data recipes and for cross-linking recipes to data products. This presentation gives an overview of the data recipe activities at GES DISC and ESDSWG. We are seeking requirements and input from a broader data user community to establish a strong knowledge base for Earth science data research and application implementations.

  10. Semi-automated Data Set Submission Work Flow for Archival with the ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Wright, D.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Eby, P.; Heinz, S. L.; Hook, L. A.; McMurry, B. F.; Shanafield, H. A.; Sill, D.; Santhana Vannan, S.; Wei, Y.

    2013-12-01

The ORNL DAAC archives and publishes, free of charge, data and information relevant to biogeochemical, ecological, and environmental processes. The ORNL DAAC primarily archives data produced by NASA's Terrestrial Ecology Program; however, any data that are pertinent to the biogeochemical and ecological community are of interest. The data set submission process at the ORNL DAAC has recently been updated and semi-automated to provide a consistent data provider experience and to create a uniform data product. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. If the ORNL DAAC is the appropriate archive for a data set, the data provider will be sent an email with several URL links to guide them through the submission process. The data provider will be asked to fill out a short online form to help the ORNL DAAC staff better understand the data set. These questions cover information about the data set, a description of the data set, temporal and spatial characteristics of the data set, and how the data were prepared and delivered. The questionnaire is generic and has been designed to gather input on the diverse data sets the ORNL DAAC archives. A data upload module and metadata editor further guide the data provider through the submission process. For submission purposes, a complete data set includes data files, document(s) describing the data, supplemental files, metadata record(s), and an online form. There are five major functions the ORNL DAAC performs during the process of archiving data: 1) Ingestion is the ORNL DAAC side of submission; data are checked, metadata records are compiled, and files are converted to archival formats. 2) Metadata records and data set documentation are made searchable, and the data set is given a permanent URL. 3) The data set is published, assigned a DOI, and advertised. 4) The data set is provided long-term post-project support. 5) Stewardship of the data ensures the data are stored on state-of-the-art computer systems with reliable backups.

  11. Replacing missing values using trustworthy data values from web data sources

    NASA Astrophysics Data System (ADS)

    Izham Jaya, M.; Sidi, Fatimah; Mat Yusof, Sharmila; Suriani Affendey, Lilly; Ishak, Iskandar; Jabar, Marzanah A.

    2017-09-01

In practice, collected data usually are incomplete and contain missing values. Existing approaches to managing missing values overlook the importance of trustworthy data values in replacing them. Given that trusted complete data are very important in data analysis, we proposed a framework for missing value replacement using trustworthy data values from web data sources. The proposed framework adopts ontology to map data values from web data sources to the incomplete dataset. As data from the web conflict with each other, we proposed a trust score measurement based on data accuracy and data reliability. The trust score is then used to select trustworthy data values from web data sources for missing value replacement. We successfully implemented the proposed framework using a financial dataset and present the findings in this paper. From our experiment, we show that replacing missing values with trustworthy data values is important for solving the missing value problem, especially in cases of conflicting data.
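
    A minimal Python sketch of the trust-score idea described above; the weighting of accuracy versus reliability and the candidate web values are hypothetical, as the abstract does not give a concrete formula:

        from dataclasses import dataclass

        @dataclass
        class WebCandidate:
            source: str
            value: float
            accuracy: float     # agreement with known reference values, in [0, 1]
            reliability: float  # historical dependability of the source, in [0, 1]

        def trust_score(c: WebCandidate, w_acc: float = 0.6, w_rel: float = 0.4) -> float:
            """Weighted combination of data accuracy and source reliability."""
            return w_acc * c.accuracy + w_rel * c.reliability

        def replace_missing(record: dict, field: str, candidates: list) -> dict:
            """Fill record[field] with the most trustworthy candidate value."""
            if record.get(field) is None and candidates:
                best = max(candidates, key=trust_score)
                record[field] = best.value
                record[field + "_provenance"] = best.source
            return record

        row = {"ticker": "ABC", "revenue": None}
        candidates = [
            WebCandidate("site-a.example", 102.0, accuracy=0.95, reliability=0.70),
            WebCandidate("site-b.example", 98.5, accuracy=0.80, reliability=0.90),
        ]
        print(replace_missing(row, "revenue", candidates))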

  12. Earth Observation Data Quality Monitoring and Control: A Case Study of STAR Central Data Repository

    NASA Astrophysics Data System (ADS)

    Han, W.; Jochum, M.

    2017-12-01

Earth observation data quality is very important for researchers and decision makers involved in weather forecasting, severe weather warning, disaster and emergency response, environmental monitoring, etc. Monitoring and controlling earth observation data quality, especially accuracy, completeness, and timeliness, is very useful in data management and governance to optimize data flow, discover potential transmission issues, and better connect data providers and users. Taking a centralized near real-time satellite data repository, the STAR (Center for Satellite Applications and Research of NOAA) Central Data Repository (SCDR), as an example, this paper describes how to develop a new mechanism to verify data integrity, check data completeness, and monitor data latency in an operational data management system. Such quality monitoring and control of large-volume satellite data helps data providers and managers improve data transmission of near real-time satellite data, enhance its acquisition and management, and overcome performance and management issues to better serve research and development activities.

  13. Improvement of web-based data acquisition and management system for GOSAT validation lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra Nugraha; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2013-01-01

A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). DAS, written in Perl, acquires AMeDAS (Automated Meteorological Data Acquisition System) ground-level local meteorological data, GPS radiosonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data, and GOSAT validation lidar data. DMS, written in PHP, displays satellite-pass dates and all acquired data. In this article, we briefly describe some improvements for higher performance and higher data usability. DAS now automatically calculates molecular number density profiles from the GPS radiosonde upper-air meteorological data and the U.S. standard atmosphere model. Predicted ozone density profile images above Saga city are also calculated using the Meteorological Research Institute (MRI) chemistry-climate model version 2 for comparison with actual ozone DIAL data.
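
    One concrete piece of the described improvement, computing a molecular number density profile from radiosonde pressure and temperature levels, follows directly from the ideal gas law n = p / (k_B T). A small Python illustration with made-up profile values:

        import numpy as np

        K_B = 1.380649e-23  # Boltzmann constant, J/K

        def number_density(p_hpa, t_kelvin):
            """Molecular number density (molecules / m^3) per pressure level."""
            return (np.asarray(p_hpa) * 100.0) / (K_B * np.asarray(t_kelvin))  # hPa -> Pa

        p = [1013.25, 500.0, 100.0, 10.0]   # pressure levels, hPa (illustrative)
        t = [288.15, 252.0, 220.0, 230.0]   # temperatures, K (illustrative)
        print(number_density(p, t))         # ~2.5e25 per m^3 at the surface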

  14. Data Preparation Process for the Buildings Performance Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, Travis; Dunn, Laurel; Mercado, Andrea

    2014-06-30

The Buildings Performance Database (BPD) includes empirically measured data from a variety of data sources with varying degrees of data quality and data availability. The purpose of the data preparation process is to maintain data quality within the database and to ensure that all database entries have sufficient data for meaningful analysis and for the database API. Data preparation is a systematic process of mapping data into the Building Energy Data Exchange Specification (BEDES), cleansing data using a set of criteria and rules of thumb, and deriving values such as energy totals and dominant asset types. The data preparation process takes the greatest amount of effort and time; therefore, most of the cleansing process has been automated. The process also needs to adapt as more data is contributed to the BPD and as building technologies change over time. The data preparation process is an essential step between data contributed by providers and data published to the public in the BPD.
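
    A minimal Python sketch of the three preparation steps named above (mapping, cleansing, deriving); the field names, the floor-area rule of thumb, and the therm-to-kWh factor are illustrative assumptions, not the actual BEDES terms or BPD rules:

        FIELD_MAP = {"sq_ft": "gross_floor_area", "elec_kwh": "electricity_use",
                     "gas_therms": "natural_gas_use"}

        def map_to_spec(record):
            """Rename provider columns onto the common specification."""
            return {FIELD_MAP.get(k, k): v for k, v in record.items()}

        def cleanse(record):
            """Drop entries without enough data for meaningful analysis."""
            area = record.get("gross_floor_area")
            if area is None or not (100 <= area <= 5_000_000):  # rule of thumb
                return None
            return record

        def derive(record):
            """Derive a site-energy total in kWh (1 therm ~= 29.3 kWh)."""
            record["total_energy"] = (record.get("electricity_use", 0.0)
                                      + 29.3 * record.get("natural_gas_use", 0.0))
            return record

        raw = {"sq_ft": 12000, "elec_kwh": 310000.0, "gas_therms": 4200.0}
        prepared = cleanse(map_to_spec(raw))
        print(derive(prepared) if prepared else "rejected")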

  15. Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data

    NASA Astrophysics Data System (ADS)

    Stockhause, M.; Höck, H.; Toussaint, F.; Lautenschlager, M.

    2012-08-01

The preservation of data in a high state of quality which is suitable for interdisciplinary use is one of the most pressing and challenging current issues in long-term archiving. For high-volume data such as climate model data, the data and data replicas are no longer stored centrally but distributed over several local data repositories, e.g. the data of the Coupled Model Intercomparison Project Phase 5 (CMIP5). The most important part of the data is to be archived, assigned a DOI, and published according to the World Data Center for Climate's (WDCC) application of the DataCite regulations. The integrated part of WDCC's data publication process, the data quality assessment, was adapted to the requirements of a federated data infrastructure. A concept of a distributed and federated quality assessment procedure was developed, in which the workload and responsibility for quality control are shared between the three primary CMIP5 data centers: the Program for Climate Model Diagnosis and Intercomparison (PCMDI), the British Atmospheric Data Centre (BADC), and WDCC. This distributed quality control concept, its pilot implementation for CMIP5, and first experiences are presented. The distributed quality control approach is capable of identifying data inconsistencies and of making quality results immediately available for data creators, data users and data infrastructure managers. Continuous publication of new data versions and slow data replication prevent the completion of quality control checks. This, together with ongoing developments of the data and metadata infrastructure, requires adaptations in the code and concept of the distributed quality control approach.

  16. Device Data Ingestion for Industrial Big Data Platforms with a Case Study

    PubMed Central

    Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei

    2016-01-01

Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem, we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data. PMID:26927121

  17. Geographic Data as Personal Data in Four EU Member States

    NASA Astrophysics Data System (ADS)

    de Jong, A. J.; van Loenen, B.; Zevenbergen, J. A.

    2016-06-01

The EU Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data aims at harmonising data protection legislation in the European Union. This should promote the free flow of products and services within the EU. This research found a wide variety of interpretations of the application of data protection legislation to geographic data. The variety was found among the different EU Member States, the different stakeholders and the different types of geographic data. In the Netherlands, the Data Protection Authority (DPA) states that panoramic images of streets are considered personal data, while Dutch case law judges that the data protection legislation does not apply if certain features are blurred and no link to an address is provided. The topographic datasets studied in the case studies do not contain personal data, according to the Dutch DPA, while the German and Belgian DPAs judge that topographic maps of a large scale can contain personal data, and impose conditions on the processing of topographic maps. The UK DPA considers these data outside the scope of the legal definition of personal data. The patchwork of differences in data protection legislation can be harmonised by using a traffic light model. This model focuses on the context in which the processing of the data takes place and has four categories of data: (1) sensitive personal data, (2) personal data, (3) data that can possibly lead to identification, and (4) non-personal data. Some geographic data, for example factual data that do not reveal sensitive information about a person, can be categorised in the third category, giving room to open up data under the INSPIRE Directive.
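
    A minimal Python sketch of the four-category traffic light model; the classification predicate is illustrative, since in practice the category depends on the processing context rather than on the data alone:

        from enum import Enum

        class Category(Enum):
            SENSITIVE_PERSONAL = 1    # e.g., health or religion linked to a person
            PERSONAL = 2              # identifiable individual, e.g., name + address
            POSSIBLY_IDENTIFYING = 3  # could lead to identification in context
            NON_PERSONAL = 4          # e.g., small-scale topography

        def classify(links_to_person, sensitive, identifiable_in_context):
            if links_to_person and sensitive:
                return Category.SENSITIVE_PERSONAL
            if links_to_person:
                return Category.PERSONAL
            if identifiable_in_context:
                return Category.POSSIBLY_IDENTIFYING
            return Category.NON_PERSONAL

        # Factual geodata with no sensitive content, but identifiable in context
        # (e.g., an unblurred street-level image), lands in category 3.
        print(classify(links_to_person=False, sensitive=False,
                       identifiable_in_context=True))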

  18. Transparent Reporting of Data Quality in Distributed Data Networks

    PubMed Central

    Kahn, Michael G.; Brown, Jeffrey S.; Chun, Alein T.; Davidson, Bruce N.; Meeker, Daniella; Ryan, Patrick B.; Schilling, Lisa M.; Weiskopf, Nicole G.; Williams, Andrew E.; Zozus, Meredith Nahm

    2015-01-01

Introduction: Poor data quality can be a serious threat to the validity and generalizability of clinical research findings. The growing availability of electronic administrative and clinical data is accompanied by a growing concern about the quality of these data for observational research and other analytic purposes. Currently, there are no widely accepted guidelines for reporting quality results that would enable investigators and consumers to independently determine if a data source is fit for use to support analytic inferences and reliable evidence generation. Model and Methods: We developed a conceptual model that captures the flow of data from data originator across successive data stewards and finally to the data consumer. This “data lifecycle” model illustrates how data quality issues can result in data being returned to previous data custodians. We highlight the potential risks of poor data quality on clinical practice and research results. Because of the need to ensure transparent reporting of data quality issues, we created a unifying data-quality reporting framework and a complementary set of 20 data-quality reporting recommendations for studies that use observational clinical and administrative data for secondary data analysis. We obtained stakeholder input on the perceived value of each recommendation by soliciting public comments via two face-to-face meetings of informatics and comparative-effectiveness investigators, through multiple public webinars targeted to the health services research community, and with an open access online wiki. Recommendations: Our recommendations propose reporting on both general and analysis-specific data quality features. The goals of these recommendations are to improve the reporting of data quality measures for studies that use observational clinical and administrative data, to ensure transparency and consistency in computing data quality measures, and to facilitate best practices and trust in the new clinical discoveries based on secondary use of observational data. PMID:25992385

  19. Data Publishing - View from the Front

    NASA Astrophysics Data System (ADS)

    Carlson, David; Pfeiffenberger, Hans

    2014-05-01

    As data publishing journals - Earth System Science Data (ESSD, Copernicus, since 2009), Geophysical Data Journal (GDJ, Wiley, recent) and Scientific Data (SD, Nature Publishing Group, anticipated from May 2014) - expose data sets, implement data description and data review practices, and develop partnerships with data centres and data providers, we anticipate substantial benefits for the broad earth system and environmental research communities but also substantial challenges for all parties. A primary advantage emerges from open access to convergent data: subsurface hydrographic data near Antarctica, for example, now available for combination and comparison with nearby atmospheric data (both documented in ESSD), basin-scale precipitation data (accessed through GDJ) for comparison and interpolation with long-term global precipitation records (accessed from ESSD), or, imagining not too far into the future, stomach content and abundance data for European fish (from ESSD) linked to genetic or nutritional data (from SD). In addition to increased opportunity for discovery and collaboration, we also notice parallel developments of new tools for (published) data visualization and display and increasing acceptance of data publication as a useful and anticipated dissemination step included in project- and institution-based data management plans. All parties - providers, publishers and users - will benefit as various indexing services (SCI, SCOPUS, DCI etc.) acknowledge the creative, intellectual and meritorious efforts of data preparation and data provision. The challenges facing data publication, in most cases very familiar to the data community but made more acute by the advances in data publishing, include diverging metadata standards (among biomedical, green ocean modeling and meteorological communities, for example), adhering to standards and practices for permanent identification while also accommodating 'living' data, and maintaining prompt but rigorous review and evaluation processes in the face of unfamiliarity and overwhelming workloads.

  20. Optimizing Data Center Services to Foster Stewardship and Use of Geospatial Data by Heterogeneous Populations of Users

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.; de Sherbinin, A. M.

    2017-12-01

    Growing recognition of the importance of sharing scientific data more widely and openly has refocused attention on the state of data repositories, including both discipline- or topic-oriented data centers and institutional repositories. Data creators often have several alternatives for depositing and disseminating their natural, social, health, or engineering science data. In selecting a repository for their data, data creators and other stakeholders such as their funding agencies may wish to consider the user community or communities served, the type and quality of data products already offered, and the degree of data stewardship and associated services provided. Some data repositories serve general communities, e.g., those in their host institution or region, whereas others tailor their services to particular scientific disciplines or topical areas. Some repositories are selective when acquiring data and conduct extensive curation and reviews to ensure that data products meet quality standards. Many repositories have secured credentials and established a track record for providing trustworthy, high quality data and services. The NASA Socioeconomic Data and Applications Center (SEDAC) serves users interested in human-environment interactions, including researchers, students, and applied users from diverse sectors. SEDAC is selective when choosing data for dissemination, conducting several reviews of data products and services prior to release. SEDAC works with data producers to continually improve the quality of its open data products and services. As a Distributed Active Archive Center (DAAC) of the NASA Earth Observing System Data and Information System, SEDAC is committed to improving the accessibility, interoperability, and usability of its data in conjunction with data available from other DAACs, as well as other relevant data sources. SEDAC is certified as a Regular Member of the International Council for Science World Data System (ICSU-WDS).

  1. "Small" data in a big data world: archiving terrestrial ecology data at ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Boyer, A.; Deb, D.; Hook, L.; Shrestha, R.; Thornton, M.; Virdi, M.; Wei, Y.; Wright, D.

    2016-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC http://daac.ornl.gov), a NASA-funded data center, archives a diverse collection of terrestrial biogeochemistry and ecological dynamics observations and models in support of NASA's Earth Science program. The ORNL DAAC has been addressing the increasing challenge of publishing diverse small data products into an online archive while dealing with the enhanced need for integration and availability of these data to address big science questions. This paper will show examples of "small" diverse data holdings - ranging from the Daymet model output data to site-based soil moisture observation data. We define "small" by the data volume of these data products compared to petabyte scale observations. We will highlight the use of tools and services for visualizing diverse data holdings and subsetting services such as the MODIS land products subsets tool (at ORNL DAAC) that provides big MODIS data in small chunks. Digital Object Identifiers (DOI) and data citations have enhanced the availability of data. The challenge faced by data publishers now is to deal with the increased number of publishable data products and most importantly the difficulties of publishing small diverse data products into an online archive. This paper will also present our experiences designing a data curation system for these types of data. The characteristics of these data will be examined and their scientific value will be demonstrated via data citation metrics. We will present case studies of leveraging specialized tools and services that have enabled small data sets to realize their "big" scientific potential. Overall, we will provide a holistic view of the challenges and potential of small diverse terrestrial ecology data sets from data curation to distribution.

  2. Use of Schema on Read in Earth Science Data Archives

    NASA Technical Reports Server (NTRS)

    Hegde, Mahabaleshwara; Smit, Christine; Pilone, Paul; Petrenko, Maksym; Pham, Long

    2017-01-01

Traditionally, NASA Earth Science data archives have used file-based storage with proprietary data file formats, such as HDF and HDF-EOS, which are optimized to support fast and efficient storage of spaceborne and model data as they are generated. The use of file-based storage essentially imposes an indexing strategy based on data dimensions. In most cases, NASA Earth Science data uses time as the primary index, leading to poor performance in accessing data in spatial dimensions. For example, producing a time series for a single spatial grid cell involves accessing a large number of data files. With exponential growth in data volume due to the ever-increasing spatial and temporal resolution of the data, using file-based archives poses significant performance and cost barriers to data discovery and access. Storing and disseminating data in proprietary data formats imposes an additional access barrier for users outside the mainstream research community. At the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), we have evaluated applying the schema-on-read principle to data access and distribution. We used Apache Parquet to store geospatial data, and have exposed data through Amazon Web Services (AWS) Athena, AWS Simple Storage Service (S3), and Apache Spark. Using the schema-on-read approach allows customization of indexing, spatially or temporally, to suit the data access pattern. The storage of data in open formats such as Apache Parquet has widespread support in popular programming languages. A wide range of solutions for handling big data lowers the access barrier for all users. This presentation will discuss formats used for data storage, frameworks with support for schema-on-read used for data access, and common use cases covering data usage patterns seen in a geospatial data archive.
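
    A minimal Python sketch of the schema-on-read pattern described above, using the pyarrow library: gridded values are written to Parquet once, and the effective index (spatial here) is chosen at read time via a filtered scan. The file and column names are hypothetical, not the GES DISC layout:

        import pyarrow as pa
        import pyarrow.parquet as pq
        import pyarrow.dataset as ds

        # Write: one row per (time, lat, lon) cell, as a producer might emit it.
        table = pa.table({
            "time": ["2017-01-01", "2017-01-01", "2017-01-02", "2017-01-02"],
            "lat":  [10.5, 10.5, 10.5, 10.5],
            "lon":  [20.0, 20.5, 20.0, 20.5],
            "precip_mm": [1.2, 0.0, 3.4, 0.7],
        })
        pq.write_table(table, "grid.parquet")

        # Read: a time series for a single grid cell -- the access pattern that
        # is expensive against time-indexed files -- becomes a filtered scan.
        dataset = ds.dataset("grid.parquet", format="parquet")
        cell_series = dataset.to_table(
            filter=(ds.field("lat") == 10.5) & (ds.field("lon") == 20.0),
            columns=["time", "precip_mm"],
        )
        print(cell_series.to_pydict())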

  3. Querying Semi-Structured Data

    NASA Technical Reports Server (NTRS)

    Abiteboul, Serge

    1997-01-01

The amount of data of all kinds available electronically has increased dramatically in recent years. The data resides in different forms, ranging from unstructured data in file systems to highly structured data in relational database systems. Data is accessible through a variety of interfaces including Web browsers, database query languages, application-specific interfaces, or data exchange formats. Some of this data is raw data, e.g., images or sound. Some of it has structure, even if the structure is often implicit and not as rigid or regular as that found in standard database systems. Sometimes the structure exists but has to be extracted from the data. Sometimes also it exists but we prefer to ignore it for certain purposes such as browsing. We here call semi-structured data the data that is (from a particular viewpoint) neither raw data nor strictly typed, i.e., not table-oriented as in a relational model or sorted-graph as in object databases. As will be seen later when the notion of semi-structured data is more precisely defined, the need for semi-structured data arises naturally in the context of data integration, even when the data sources are themselves well-structured. Although data integration is an old topic, the need to integrate a wider variety of data formats (e.g., SGML or ASN.1 data) and data found on the Web has brought the topic of semi-structured data to the forefront of research. The main purpose of the paper is to isolate the essential aspects of semi-structured data. We also survey some proposals of models and query languages for semi-structured data. In particular, we consider recent works at Stanford U. and U. Penn on semi-structured data. In both cases, the motivation is found in the integration of heterogeneous data.

  4. Writing through Big Data: New Challenges and Possibilities for Data-Driven Arguments

    ERIC Educational Resources Information Center

    Beveridge, Aaron

    2017-01-01

    As multimodal writing continues to shift and expand in the era of Big Data, writing studies must confront the new challenges and possibilities emerging from data mining, data visualization, and data-driven arguments. Often collected under the broad banner of "data literacy," students' experiences of data visualization and data-driven…

  5. Multi-registration of software library resources

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-04-05

    Data communications, including issuing, by an application program to a high level data communications library, a request for initialization of a data communications service; issuing to a low level data communications library a request for registration of data communications functions; registering the data communications functions, including instantiating a factory object for each of the one or more data communications functions; issuing by the application program an instruction to execute a designated data communications function; issuing, to the low level data communications library, an instruction to execute the designated data communications function, including passing to the low level data communications library a call parameter that identifies a factory object; creating with the identified factory object the data communications object that implements the data communications function according to the protocol; and executing by the low level data communications library the designated data communications function.
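
    A rough Python analogue of the registration flow this patent describes: the low level library stores a factory object per data communications function, and a later execute call identifies the factory, creates the function object for a protocol, and runs it. All names are illustrative:

        class BroadcastFactory:
            def create(self, protocol):
                # The factory builds the object implementing the function for
                # the requested protocol (e.g., a tree vs. a ring broadcast).
                return lambda data: print("broadcast(%s): %s" % (protocol, data))

        class LowLevelLib:
            def __init__(self):
                self.factories = {}

            def register(self, name, factory):
                """Registration: store an instantiated factory per function."""
                self.factories[name] = factory

            def execute(self, name, protocol, data):
                """Create the function object via its factory, then run it."""
                fn = self.factories[name].create(protocol)
                fn(data)

        lib = LowLevelLib()
        lib.register("broadcast", BroadcastFactory())       # at initialization
        lib.execute("broadcast", "binary-tree", [1, 2, 3])  # at call time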

  6. Solutions for research data from a publisher's perspective

    NASA Astrophysics Data System (ADS)

    Cotroneo, P.

    2015-12-01

    Sharing research data has the potential to make research more efficient and reproducible. Elsevier has developed several initiatives to address the different needs of research data users. These include PANGAEA linked data, which provides geo-referenced, citable datasets from the earth and life sciences, archived as supplementary data from publications by the PANGAEA data repository; Mendeley Data, which allows users to freely upload and share their data; a database linking program that creates links between articles on ScienceDirect and datasets held in external data repositories such as EarthRef and EarthChem; a pilot for searching for research data through a map interface; an open data pilot that allows authors publishing in Elsevier journals to store and share research data and make this publicly available as a supplementary file alongside their article; and data journals, including Data in Brief, which allow researchers to share their data open access. Through these initiatives, researchers are not only encouraged to share their research data, but also supported in optimizing their research data management. By making data more readily citable and visible, and hence generating citations for authors, these initiatives also aim to ensure that researchers get the recognition they deserve for publishing their data.

  7. BrainLiner: A Neuroinformatics Platform for Sharing Time-Aligned Brain-Behavior Data

    PubMed Central

    Takemiya, Makoto; Majima, Kei; Tsukamoto, Mitsuaki; Kamitani, Yukiyasu

    2016-01-01

    Data-driven neuroscience aims to find statistical relationships between brain activity and task behavior from large-scale datasets. To facilitate high-throughput data processing and modeling, we created BrainLiner as a web platform for sharing time-aligned, brain-behavior data. Using an HDF5-based data format, BrainLiner treats brain activity and data related to behavior with the same salience, aligning both behavioral and brain activity data on a common time axis. This facilitates learning the relationship between behavior and brain activity. Using a common data file format also simplifies data processing and analyses. Properties describing data are unambiguously defined using a schema, allowing machine-readable definition of data. The BrainLiner platform allows users to upload and download data, as well as to explore and search for data from the web platform. A WebGL-based data explorer can visualize highly detailed neurophysiological data from within the web browser, and a data-driven search feature allows users to search for similar time windows of data. This increases transparency, and allows for visual inspection of neural coding. BrainLiner thus provides an essential set of tools for data sharing and data-driven modeling. PMID:26858636
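
    As an illustration of the time-aligned layout described above, the sketch below stores a behavioral signal and a multi-channel neural signal against a common time axis in an HDF5 file using h5py. The group and dataset names are invented for this sketch and are not BrainLiner's actual schema.

      import numpy as np
      import h5py

      # Illustrative only: align behavior and brain activity on one time axis.
      t = np.arange(0.0, 10.0, 0.001)            # common time axis, 1 kHz
      behavior = np.sin(2 * np.pi * 0.5 * t)     # stand-in behavioral signal
      brain = np.random.randn(t.size, 64)        # stand-in 64-channel recording

      with h5py.File("session.h5", "w") as f:
          f.create_dataset("time", data=t)
          f.create_dataset("behavior/position", data=behavior)
          f.create_dataset("brain/ecog", data=brain)
          f["brain/ecog"].attrs["sampling_rate_hz"] = 1000.0  # machine-readable property

      # Because both signals share the time axis, any window of behavior can be
      # paired directly with the corresponding window of brain activity.
      with h5py.File("session.h5", "r") as f:
          time = f["time"][...]
          ecog = f["brain/ecog"][...]
      window = ecog[(time >= 2.0) & (time < 3.0)]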

  8. Expediting Scientific Data Analysis with Reorganization of Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byna, Surendra; Wu, Kesheng

    2013-08-19

    Data producers typically optimize the layout of data files to minimize the write time. In most cases, data analysis tasks read these files in access patterns different from the write patterns, causing poor read performance. In this paper, we introduce Scientific Data Services (SDS), a framework for bridging the performance gap between writing and reading scientific data. SDS reorganizes data to match the read patterns of analysis tasks and enables transparent data reads from the reorganized data. We implemented an HDF5 Virtual Object Layer (VOL) plugin to redirect the HDF5 dataset read calls to the reorganized data. To demonstrate the effectiveness of SDS, we applied two parallel data organization techniques: a sort-based organization on plasma physics data and a transpose-based organization on mass spectrometry imaging data. We also extended the HDF5 data access API to allow selection of data based on their values through a query interface, called SDS Query. We evaluated the execution time in accessing various subsets of data through the existing HDF5 Read API and SDS Query. We showed that reading the reorganized data using SDS is up to 55X faster than reading the original data.
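
    The benefit of the sort-based reorganization can be seen in a small sketch (illustrative only, not the SDS implementation): once records are sorted on the field that queries select on, a value-range selection becomes two binary searches plus one contiguous read instead of a full scan.

      import numpy as np

      # Write-optimized order: records appear in acquisition order.
      energy = np.random.uniform(0.0, 100.0, size=1_000_000)

      # Reorganize once: sort on the field that analysis queries most often.
      order = np.argsort(energy)
      energy_sorted = energy[order]   # a real system would permute other columns too

      # Value-based selection (e.g., 40 <= energy < 60): on sorted data this is
      # two binary searches plus one contiguous read, not a scan of every record.
      lo = np.searchsorted(energy_sorted, 40.0, side="left")
      hi = np.searchsorted(energy_sorted, 60.0, side="left")
      subset = energy_sorted[lo:hi]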

  9. The Principles for Successful Scientific Data Management Revisited

    NASA Astrophysics Data System (ADS)

    Walker, R. J.; King, T. A.; Joy, S. P.

    2005-12-01

    It has been 23 years since the National Research Council's Committee on Data Management and Computation (CODMAC) published its famous list of principles for successful scientific data management, which has provided the framework for modern space science data management. CODMAC outlined seven principles: 1. Scientific Involvement in all aspects of space science missions. 2. Scientific Oversight of all scientific data-management activities. 3. Data Availability - Validated data should be made available to the scientific community in a timely manner. They should include appropriate ancillary data and complete documentation. 4. Facilities - A proper balance between cost and scientific productivity should be maintained. 5. Software - Transportable, well-documented software should be available to process and analyze the data. 6. Scientific Data Storage - The data should be preserved in retrievable form. 7. Data System Funding - Adequate data funding should be made available at the outset of missions and protected from overruns. In this paper we will review the lessons learned in trying to apply these principles to space-derived data. The Planetary Data System created the concept of data curation to carry out the CODMAC principles. Data curators are scientists and technologists who work directly with the mission scientists to create data products. The efficient application of the CODMAC principles requires that data curators and the mission team start early in a mission to plan for data access and archiving. To build the data products the planetary discipline adopted data access and documentation standards and has adhered to them. The data curators and mission team work together to produce data products and make them available. However, even with early planning and agreement on standards, the needs of the science community frequently far exceed the available resources. This is especially true for smaller principal-investigator-run missions. We will argue that one way to make data systems for small missions more effective is for the data curators to provide software tools to help develop the mission data system.

  10. HRP Data Accessibility Current Status

    NASA Technical Reports Server (NTRS)

    Sams, Clarence

    2009-01-01

    Overview of talk: a) Content of Human Life Science data; b) Data archive structure; c) Applicable legal documents and policies; and d) Methods for data access. Life Science Data Archive (LSDA) contains research data from NASA-funded experiments, primarily data from flight experiments and ground analog data collected at NASA facilities. Longitudinal Study of Astronaut Health (LSAH) contains electronic health records (medical data) of all astronauts, including mission data. Data are collected for clinical purposes. Clinical data are analyzed by LSAH epidemiologists to identify trends in crew health and implement changes in pre-, in-, or post-flight medical care.

  11. A review of data quality assessment methods for public health information systems.

    PubMed

    Chen, Hong; Hailey, David; Wang, Ning; Yu, Ping

    2014-05-14

    High quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and data collection process, as the three dimensions of data quality, all need to be assessed for overall data quality assessment. We reviewed current data quality assessment methods. Relevant studies were identified in major databases and well-known institutional websites. We found the dimension of data was most frequently assessed. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interviews and documentation review. The limitations of the reviewed studies included inattentiveness to data use and data collection process, inconsistency in the definition of attributes of data quality, failure to address data users' concerns and a lack of systematic procedures in data quality assessment. This review study is limited by the coverage of the databases and the breadth of public health information systems. Further research could develop consistent data quality definitions and attributes. More research efforts should be given to assess the quality of data use and the quality of data collection process.

  12. Data-centric Science: New challenges for long-term archives and data publishers

    NASA Astrophysics Data System (ADS)

    Stockhause, Martina; Lautenschlager, Michael

    2016-04-01

    In recent years the publication of data has become more and more common. Data and metadata for a single project are often disseminated by multiple data centers in federated data infrastructures. At the same time, data is shared earlier to enable collaboration within research projects. The research data environment has become more heterogeneous and the data more dynamic. Only a few data or metadata repositories are long-term archives (LTAs) with WDS/DSA certificates complying with Force 11's 'Joint Declaration of Data Citation Principles'. Therefore, for the long-term usage of these data and information, a small number of LTAs have the task of preserving these pieces of information. They replicate, connect, quality-assure, harmonize, archive, and curate these different types of data from multiple data centers with different operation procedures and data standards. Consortia or federations of certified LTAs are needed to meet the challenges of big data storage and citation. Data publishers play a central role in storing, preserving, and disseminating scientific information. Portals of these federations of LTAs, or data registration agencies like DataCite, might even become the portals of the future for scientific knowledge discovery. The example of CMIP6 is used to illustrate this future perspective on the role of LTAs/data publishers.

  13. A Review of Data Quality Assessment Methods for Public Health Information Systems

    PubMed Central

    Chen, Hong; Hailey, David; Wang, Ning; Yu, Ping

    2014-01-01

    High quality data and effective data quality assessment are required for accurately evaluating the impact of public health interventions and measuring public health outcomes. Data, data use, and data collection process, as the three dimensions of data quality, all need to be assessed for overall data quality assessment. We reviewed current data quality assessment methods. Relevant studies were identified in major databases and well-known institutional websites. We found the dimension of data was most frequently assessed. Completeness, accuracy, and timeliness were the three most-used attributes among a total of 49 attributes of data quality. The major quantitative assessment methods were descriptive surveys and data audits, whereas the common qualitative assessment methods were interviews and documentation review. The limitations of the reviewed studies included inattentiveness to data use and data collection process, inconsistency in the definition of attributes of data quality, failure to address data users’ concerns and a lack of systematic procedures in data quality assessment. This review study is limited by the coverage of the databases and the breadth of public health information systems. Further research could develop consistent data quality definitions and attributes. More research efforts should be given to assess the quality of data use and the quality of data collection process. PMID:24830450

  14. Development of spatial data guidelines and standards: spatial data set documentation to support hydrologic analysis in the U.S. Geological Survey

    USGS Publications Warehouse

    Fulton, James L.

    1992-01-01

    Spatial data analysis has become an integral component in many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Therefore, guidelines and standards are required for the development of spatial data in order to allow for data sharing and reuse; this eliminates costly redevelopment. In order to attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and the compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data, made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system. The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support hydrologic analysis, hydrologic data processing, and publication of hydrologic thematic maps. There is a need for the GIS vendor community to develop data set documentation tools similar to those developed by the USGS, or to incorporate USGS-developed tools in their software.

  15. Data warehouse implementation with clinical pharmacokinetic/pharmacodynamic data.

    PubMed

    Koprowski, S P; Barrett, J S

    2002-03-01

    We have created a data warehouse for human pharmacokinetic (PK) and pharmacodynamic (PD) data generated primarily within the Clinical PK Group of the Drug Metabolism and Pharmacokinetics (DM&PK) Department of DuPont Pharmaceuticals. Data that enter an Oracle-based LIMS directly from chromatography systems or through files from contract research organizations are accessed via SAS/PH.Kinetics, GLP-compliant data analysis software residing on individual users' workstations. Upon completion of the final PK or PD analysis, data are pushed to a predefined location. Data analyzed or created with other software (e.g., WinNonlin, NONMEM, Adapt) are added to this file repository as well. The warehouse creates views to these data and accumulates metadata on all data sources defined in the warehouse. The warehouse is managed via the SAS/Warehouse Administrator product, which defines the environment, creates summarized data structures, and schedules data refresh. The clinical PK/PD warehouse encompasses laboratory, biometric, PK and PD data streams. Detailed logical tables for each compound are created and updated as the clinical PK/PD data warehouse is populated. The data model defined to the warehouse is based on a star schema. Summarized data structures such as multidimensional data bases (MDDB), infomarts, and datamarts are created from detail tables. Data mining and querying of highly summarized data, as well as drill-down to detail data, is possible via the creation of exploitation tools which front-end the warehouse data. Based on periodic refreshing of the warehouse data, these applications are able to access the most current data available and do not require a manual interface to update or populate the data store. Prototype applications have been web-enabled to facilitate their usage by varied data customers across platform and location. The warehouse also contains automated mechanisms for the construction of study data listings and SAS transport files for eventual incorporation into an electronic submission. This environment permits the management of online analytical processing via a single administrator once the data model and warehouse configuration have been designed. The expansion of the current environment will eventually connect data from all phases of research and development, ensuring the return on investment and, hopefully, efficiencies in data processing unforeseen with earlier legacy systems.
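
    The star schema mentioned above can be sketched in miniature (a hypothetical design: the table and column names are invented, not those of the DuPont warehouse). A central fact table of concentration observations references compact dimension tables, and analytical queries join the fact table to whichever dimensions they need.

      import sqlite3

      con = sqlite3.connect(":memory:")
      cur = con.cursor()

      # Dimension tables: descriptive attributes, one row per entity.
      cur.executescript("""
      CREATE TABLE dim_subject  (subject_id INTEGER PRIMARY KEY, age INTEGER, sex TEXT);
      CREATE TABLE dim_compound (compound_id INTEGER PRIMARY KEY, name TEXT);
      CREATE TABLE dim_study    (study_id INTEGER PRIMARY KEY, phase TEXT);

      -- Fact table: one row per PK observation, keyed to all dimensions.
      CREATE TABLE fact_pk (
          subject_id  INTEGER REFERENCES dim_subject(subject_id),
          compound_id INTEGER REFERENCES dim_compound(compound_id),
          study_id    INTEGER REFERENCES dim_study(study_id),
          time_h      REAL,
          conc_ng_ml  REAL
      );
      """)

      cur.execute("INSERT INTO dim_subject VALUES (1, 34, 'F')")
      cur.execute("INSERT INTO dim_compound VALUES (1, 'compound-X')")
      cur.execute("INSERT INTO dim_study VALUES (1, 'I')")
      cur.execute("INSERT INTO fact_pk VALUES (1, 1, 1, 0.5, 12.3)")

      # A typical drill-down query joins the fact table to its dimensions.
      rows = cur.execute("""
          SELECT s.sex, c.name, f.time_h, f.conc_ng_ml
          FROM fact_pk f
          JOIN dim_subject s USING (subject_id)
          JOIN dim_compound c USING (compound_id)
      """).fetchall()
      print(rows)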

  16. Legal assessment tool (LAT): an interactive tool to address privacy and data protection issues for data sharing.

    PubMed

    Kuchinke, Wolfgang; Krauth, Christian; Bergmann, René; Karakoyun, Töresin; Woollard, Astrid; Schluender, Irene; Braasch, Benjamin; Eckert, Martin; Ohmann, Christian

    2016-07-07

    At an unprecedented rate, data in the life sciences are generated and stored in many different databases. An ever-increasing part of these data is human health data and therefore falls under data protected by legal regulations. As part of the BioMedBridges project, which created infrastructures that connect more than 10 ESFRI research infrastructures (RIs), the legal and ethical prerequisites of data sharing were examined employing a novel and pragmatic approach. We employed concepts from computer science to create legal requirement clusters that enable legal interoperability between databases for the areas of data protection, data security, Intellectual Property (IP) and security of biosample data. We analysed and extracted access rules and constraints from all data providers (databases) involved in the building of data bridges, covering many of Europe's most important databases. These requirement clusters were applied to five usage scenarios representing the data flow in different data bridges: Image bridge, Phenotype data bridge, Personalised medicine data bridge, Structural data bridge, and Biosample data bridge. A matrix was built to relate the important concepts from data protection regulations (e.g. pseudonymisation, identifiability, access control, consent management) with the results of the requirement clusters. An interactive user interface for querying the matrix for requirements necessary for compliant data sharing was created. To guide researchers without legal expert knowledge through the legal requirements, an interactive tool, the Legal Assessment Tool (LAT), was developed. LAT interactively provides researchers with a selection process to characterise the types of data and databases involved, and provides suitable requirements and recommendations for concrete data access and sharing situations. The results provided by LAT are based on an analysis of the data access and sharing conditions for different kinds of data of major databases in Europe. Data sharing for research purposes must be opened up for human health data, and LAT is one of the means to achieve this aim. In summary, LAT provides requirements in an interactive way for compliant data access and sharing with appropriate safeguards, restrictions and responsibilities, introducing a culture of responsibility and data governance when dealing with human data.

  17. The Materials Data Facility: Data Services to Advance Materials Science Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaiszik, B.; Chard, K.; Pruyne, J.

    2016-07-06

    With increasingly strict data management requirements from funding agencies and institutions, expanding focus on the challenges of research replicability, and growing data sizes and heterogeneity, new data needs are emerging in the materials community. The materials data facility (MDF) operates two cloud-hosted services, data publication and data discovery, with features to promote open data sharing, self-service data publication and curation, and encourage data reuse, layered with powerful data discovery tools. The data publication service simplifies the process of copying data to a secure storage location, assigning data a citable persistent identifier, and recording custom (e.g., material, technique, or instrument specific) and automatically-extracted metadata in a registry while the data discovery service will provide advanced search capabilities (e.g., faceting, free text range querying, and full text search) against the registered data and metadata. The MDF services empower individual researchers, research projects, and institutions to (I) publish research datasets, regardless of size, from local storage, institutional data stores, or cloud storage, without involvement of third-party publishers; (II) build, share, and enforce extensible domain-specific custom metadata schemas; (III) interact with published data and metadata via representational state transfer (REST) application program interfaces (APIs) to facilitate automation, analysis, and feedback; and (IV) access a data discovery model that allows researchers to search, interrogate, and eventually build on existing published data. We describe MDF’s design, current status, and future plans.

  18. The Materials Data Facility: Data Services to Advance Materials Science Research

    NASA Astrophysics Data System (ADS)

    Blaiszik, B.; Chard, K.; Pruyne, J.; Ananthakrishnan, R.; Tuecke, S.; Foster, I.

    2016-08-01

    With increasingly strict data management requirements from funding agencies and institutions, expanding focus on the challenges of research replicability, and growing data sizes and heterogeneity, new data needs are emerging in the materials community. The materials data facility (MDF) operates two cloud-hosted services, data publication and data discovery, with features to promote open data sharing, self-service data publication and curation, and encourage data reuse, layered with powerful data discovery tools. The data publication service simplifies the process of copying data to a secure storage location, assigning data a citable persistent identifier, and recording custom (e.g., material, technique, or instrument specific) and automatically-extracted metadata in a registry while the data discovery service will provide advanced search capabilities (e.g., faceting, free text range querying, and full text search) against the registered data and metadata. The MDF services empower individual researchers, research projects, and institutions to (I) publish research datasets, regardless of size, from local storage, institutional data stores, or cloud storage, without involvement of third-party publishers; (II) build, share, and enforce extensible domain-specific custom metadata schemas; (III) interact with published data and metadata via representational state transfer (REST) application program interfaces (APIs) to facilitate automation, analysis, and feedback; and (IV) access a data discovery model that allows researchers to search, interrogate, and eventually build on existing published data. We describe MDF's design, current status, and future plans.
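
    Point (III) above, REST-based interaction with published data and metadata, can be sketched generically. The endpoint URL, query parameters, and response fields below are hypothetical placeholders, not the actual MDF API.

      import requests

      # Hypothetical discovery query: search registered metadata for datasets
      # matching a material and a measurement technique.
      BASE = "https://example.org/mdf/api"   # placeholder, not a real endpoint

      resp = requests.get(
          f"{BASE}/search",
          params={"material": "GaAs", "technique": "XRD", "limit": 10},
          timeout=30,
      )
      resp.raise_for_status()

      for hit in resp.json().get("results", []):
          # Each record would carry a citable persistent identifier and enough
          # metadata to decide whether to retrieve the full dataset.
          print(hit.get("identifier"), hit.get("title"))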

  19. Data Integration for Heterogeneous Datasets

    PubMed Central

    2014-01-01

    Abstract: More and more, the needs of data analysts require the use of data outside the control of their own organizations. The increasing amount of data available on the Web, the new technologies for linking data across datasets, and the increasing need to integrate structured and unstructured data are all driving this trend. In this article, we provide a technical overview of the emerging “broad data” area, in which the variety of heterogeneous data being used, rather than the scale of the data being analyzed, is the limiting factor in data analysis efforts. The article explores some of the emerging themes in data discovery, data integration, linked data, and the combination of structured and unstructured data. PMID:25553272

  20. Methods and apparatus of analyzing electrical power grid data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Critchlow, Terence J.; Gibson, Tara D.

    Apparatus and methods of processing large-scale data regarding an electrical power grid are described. According to one aspect, a method of processing large-scale data regarding an electrical power grid includes accessing a large-scale data set comprising information regarding an electrical power grid; processing data of the large-scale data set to identify a filter which is configured to remove erroneous data from the large-scale data set; using the filter, removing erroneous data from the large-scale data set; and after the removing, processing data of the large-scale data set to identify an event detector which is configured to identify events of interest in the large-scale data set.
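
    A toy version of the filter-then-detect pipeline described in this record (the signal, thresholds, and notion of an event are invented for illustration):

      import numpy as np

      # Stand-in for large-scale grid data: frequency samples around 60 Hz.
      rng = np.random.default_rng(0)
      freq = 60.0 + 0.01 * rng.standard_normal(100_000)
      freq[500] = 0.0          # an erroneous reading
      freq[40_000] -= 0.5      # a genuine frequency excursion (an "event")

      # Step 1: filter, removing physically impossible values.
      valid = (freq > 55.0) & (freq < 65.0)
      clean = freq[valid]

      # Step 2: event detector, flagging large deviations from nominal frequency.
      events = np.flatnonzero(np.abs(clean - 60.0) > 0.2)
      print(f"kept {clean.size} of {freq.size} samples; {events.size} event samples")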

  1. Efficient data management in a large-scale epidemiology research project.

    PubMed

    Meyer, Jens; Ostrzinski, Stefan; Fredrich, Daniel; Havemann, Christoph; Krafczyk, Janina; Hoffmann, Wolfgang

    2012-09-01

    This article describes the concept of a "Central Data Management" (CDM) and its implementation within the large-scale population-based medical research project "Personalized Medicine". The CDM can be summarized as a conjunction of data capturing, data integration, data storage, data refinement, and data transfer. A wide spectrum of reliable "Extract Transform Load" (ETL) software for automatic integration of data, as well as "electronic Case Report Forms" (eCRFs), was developed in order to integrate decentralized and heterogeneously captured data. Due to the high sensitivity of the captured data, high system resource availability, data privacy, data security and quality assurance are of utmost importance. A complex data model was developed and implemented using an Oracle database in high availability cluster mode in order to integrate different types of participant-related data. Intelligent data capturing and storage mechanisms are improving the quality of data. Data privacy is ensured by a multi-layered role/right system for access control and de-identification of identifying data. A well-defined backup process prevents data loss. Over a period of one and a half years, the CDM has captured a wide variety of data, amounting to approximately 5 terabytes, without experiencing any critical incidents of system breakdown or loss of data. The aim of this article is to demonstrate one possible way of establishing a Central Data Management in large-scale medical and epidemiological studies. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
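
    One element of the privacy layer described above, the de-identification of identifying data, can be sketched with keyed hashing (a minimal illustration; the project's actual multi-layered role/right system is not specified in the record):

      import hashlib
      import hmac

      # Secret key known only to the trusted data-management layer.
      # In practice this would live in a protected key store, never in code.
      PEPPER = b"replace-with-secret-key-material"

      def pseudonymize(identifier: str) -> str:
          """Map an identifying value to a stable pseudonym (keyed hash)."""
          return hmac.new(PEPPER, identifier.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

      record = {"name": "Jane Doe", "dob": "1970-01-01", "hba1c": 5.9}

      # Separate identifying data from the medical payload before analysis.
      pseudonym = pseudonymize(record["name"] + record["dob"])
      research_record = {"pid": pseudonym, "hba1c": record["hba1c"]}
      print(research_record)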

  2. Reporting Data with "Over-the-Counter" Data Analysis Supports Improves Educators' Data Analyses

    ERIC Educational Resources Information Center

    Rankin, Jenny Grant

    2014-01-01

    The benefits of making data-informed decisions to improve learning rely on educators correctly interpreting given data. Many educators routinely misinterpret data, even at districts with proactive support for data use. The tool most educators use for data analyses, which is an information technology data system or its reports, typically reports…

  3. ASTM Data Banks and Chemical Information Sources

    ERIC Educational Resources Information Center

    Batik, Albert; Hale, Eleanor

    1972-01-01

    Among the data described are infrared indexes, mass spectral data, chromatographic data, X-ray emission data, odor and taste threshold data, and thermodynamics data. This paper provides the chemical documentarian a complete reference source to a wide variety of analytical data. (Author/NH)

  4. DataFed: A Federated Data System for Visualization and Analysis of Spatio-Temporal Air Quality Data

    NASA Astrophysics Data System (ADS)

    Husar, R. B.; Hoijarvi, K.

    2017-12-01

    DataFed is a distributed web-services-based computing environment for accessing, processing, and visualizing atmospheric data in support of air quality science and management. The flexible, adaptive environment facilitates the access and flow of atmospheric data from provider to users by enabling the creation of user-driven data processing/visualization applications. DataFed 'wrapper' components non-intrusively wrap heterogeneous, distributed datasets for access by standards-based GIS web services. The mediator components (also web services) map the heterogeneous data into a spatio-temporal data model. Chained web services provide homogeneous data views (e.g., geospatial, time views) using a global multi-dimensional data model. In addition to data access and rendering, the data processing component services can be programmed for filtering, aggregation, and fusion of multidimensional data. Complete applications are written in a custom-made data-flow language. Currently, the federated data pool consists of over 50 datasets originating from globally distributed data providers delivering surface-based air quality measurements, satellite observations, emissions data as well as regional and global-scale air quality models. The web browser-based user interface allows point-and-click navigation and browsing of the XYZT multi-dimensional data space. The key applications of DataFed are exploring spatial patterns of pollutants; seasonal, weekly, and diurnal cycles; and frequency distributions for exploratory air quality research. Since 2008, DataFed has been used to support EPA in the implementation of the Exceptional Event Rule. The data system is also used at universities in the US, Europe and Asia.
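
    The wrapper/mediator/chained-view pattern described above can be imitated in miniature (all function names and the toy data model are invented placeholders, not DataFed's actual services):

      # Hypothetical sketch of chained data services: wrap -> mediate -> view.

      def wrap_csv_source(rows):
          """Wrapper: expose a heterogeneous source as plain records."""
          return [dict(zip(("station", "time", "pm25"), r)) for r in rows]

      def mediate(records):
          """Mediator: map records into a common spatio-temporal model."""
          return [{"where": r["station"], "when": r["time"], "value": r["pm25"]}
                  for r in records]

      def time_view(data, where):
          """View service: a time series for one location."""
          return [(d["when"], d["value"]) for d in data if d["where"] == where]

      # Chain the services, as a workflow engine would.
      raw = [("STL", "2017-07-01", 12.0), ("STL", "2017-07-02", 35.5)]
      print(time_view(mediate(wrap_csv_source(raw)), where="STL"))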

  5. Legacy data center integration into distributed data federations: The World Data Center for Climate (WDCC) experience

    NASA Astrophysics Data System (ADS)

    Kindermann, Stephan; Berger, Katharina; Toussaint, Frank

    2014-05-01

    The integration of well-established legacy data centers into newly developed data federation infrastructures is a key requirement to enhance climate data access based on widely agreed interfaces. We present the approach taken to integrate the ICSU World Data Center for Climate (WDCC), located in Hamburg, Germany, into the European ENES climate data federation, which is part of the international ESGF data federation. The ENES / ESGF data federation hosts petabytes of climate model data and provides scalable data search and access services across the worldwide distributed data centers. Part of the data provided by the ENES / ESGF data federation is also long-term archived and curated at the WDCC data archive, allowing, for example, DOI-based data citation. An integration of the WDCC into the ENES / ESGF federation allows end users to search and access WDCC data using consistent interfaces worldwide. We will summarize the approach we have taken to integrate the WDCC legacy system with the ESGF infrastructure. On the technical side, we describe the provisioning of ESGF-consistent metadata and data interfaces as well as the adoption of the security infrastructure. On the non-technical side, we describe our experiences in integrating a long-term archival center, with its costly quality assurance procedures, into a distributed data federation, while putting emphasis on providing early and consistent data search and access services to scientists. These experiences were gained in the process of curating ESGF-hosted CMIP5 data at the WDCC. Approximately one petabyte of CMIP5 data, which was used for the IPCC climate report, is being replicated and archived at the WDCC.

  6. Development of Data Acquisition Set-up for Steady-state Experiments

    NASA Astrophysics Data System (ADS)

    Srivastava, Amit K.; Gupta, Arnab D.; Sunil, S.; Khan, Ziauddin

    2017-04-01

    For short-duration experiments, digitized data is generally transferred for processing and storage after the experiment, whereas in a steady-state experiment the data is acquired, processed, displayed and stored continuously, in a pipelined manner. This requires acquiring data through special techniques for storage, and viewing data on the go to display the current trends of various physical parameters. A small data acquisition set-up was developed for continuously acquiring signals from various physical parameters at different sampling rates for long-duration experiments. This includes the hardware set-up for signal digitization, a Field Programmable Gate Array (FPGA) based timing system for clock synchronization and event/trigger distribution, time slicing of data streams for storage of data chunks to enable viewing of data during acquisition, and channel profile display through down-sampling. In order to store a long data stream of indefinite/long time duration, the data stream is divided into data slices/chunks of user-defined time duration. Data chunks avoid the problem of server data being inaccessible until the channel data file is closed at the end of a long-duration experiment. A graphical user interface has been developed in the LabVIEW application development environment for configuring the data acquisition hardware and storing data chunks on the local machine as well as at a remote data server, through Python, for further data access. The data plotting and analysis utilities have been developed with Python software, which provides tools for further data processing. This paper describes the development and implementation of data acquisition for steady-state experiments.
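
    The time-slicing idea can be sketched simply (the file naming, chunk duration, and data source below are illustrative assumptions): closing one file per fixed-duration slice makes each slice readable while acquisition continues indefinitely.

      import time
      import numpy as np

      CHUNK_SECONDS = 2          # user-defined slice duration
      SAMPLE_RATE = 1000         # samples per second (assumed)

      def acquire(n):
          """Stand-in for the digitizer: n samples of signal."""
          return np.random.randn(n)

      # Continuous acquisition loop: each chunk is written and closed
      # immediately, so servers and viewers can read completed slices
      # without waiting for the (indefinitely long) experiment to end.
      for chunk_index in range(3):                  # would run indefinitely in practice
          data = acquire(CHUNK_SECONDS * SAMPLE_RATE)
          fname = f"channel0_chunk{chunk_index:06d}_{int(time.time())}.npy"
          np.save(fname, data)
          print("closed", fname, "- available for display/transfer")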

  7. Next Generation Cloud-based Science Data Systems and Their Implications on Data and Software Stewardship, Preservation, and Provenance

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.

    2017-12-01

    NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than those of current missions. A significant increase in data processing, data rates, data volumes, and long-term data archive capabilities is needed. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next-generation science data systems are exploring the move onto cloud computing paradigms to support these increased needs. New considerations such as cost, data movement, collocation of data systems and archives, and moving processing closer to the data may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being onboarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may impact architectural and system designs. Data systems will trade the cost of keeping data in the cloud against data life-cycle approaches that move "colder" data back to traditional on-premise facilities. How will this impact data citation and processing software stewardship? What are the impacts of cloud-based on-demand processing on reproducibility and provenance? Similarly, with more science processing software being moved onto cloud platforms, virtual machines, and container-based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We will also explore emerging questions about the stewardship of the science data system software that generates the science data records, both during and after the life of a mission.

  8. Evaluating the Quality and Usability of Open Data for Public Health Research: A Systematic Review of Data Offerings on 3 Open Data Platforms.

    PubMed

    Martin, Erika G; Law, Jennie; Ran, Weijia; Helbig, Natalie; Birkhead, Guthrie S

    Government datasets are newly available on open data platforms that are publicly accessible, available in nonproprietary formats, free of charge, and with unlimited use and distribution rights. They provide opportunities for health research, but their quality and usability are unknown. Our objectives were to describe available open health data, to identify whether data are presented in a way that is aligned with best practices and usable for researchers, and to examine differences across platforms. Two reviewers systematically reviewed a random sample of data offerings on three open health data platforms at the federal, New York State, and New York City levels, using a standard coding guide: NYC OpenData (New York City, all offerings, n = 37), Health Data NY (New York State, 25% sample, n = 71), and HealthData.gov (US Department of Health and Human Services, 5% sample, n = 75). Data characteristics from the coding guide were aggregated into summary indices for intrinsic data quality, contextual data quality, adherence to the Dublin Core metadata standards, and the 5-star open data deployment scheme. One quarter of the offerings were structured datasets; other presentation styles included charts (14.7%), documents describing data (12.0%), maps (10.9%), and query tools (7.7%). Health Data NY had higher intrinsic data quality (P < .001), contextual data quality (P < .001), and Dublin Core metadata standards adherence (P < .001). All met basic "web availability" open data standards; fewer met the higher standard of being "hyperlinked to other data." Although all platforms need improvement, they already provide readily available data for health research. Sustained effort on improving open data websites and metadata is necessary for ensuring researchers use these data, thereby increasing their research value.

  9. Data hosting infrastructure for primary biodiversity data

    PubMed Central

    2011-01-01

    Background: Today, an unprecedented volume of primary biodiversity data are being generated worldwide, yet significant amounts of these data have been and will continue to be lost after the conclusion of the projects tasked with collecting them. To get the most value out of these data it is imperative to seek a solution whereby these data are rescued, archived and made available to the biodiversity community. To this end, the biodiversity informatics community requires investment in processes and infrastructure to mitigate data loss and provide solutions for long-term hosting and sharing of biodiversity data. Discussion: We review the current state of biodiversity data hosting and investigate the technological and sociological barriers to proper data management. We further explore the rescuing and re-hosting of legacy data, the state of existing toolsets and propose a future direction for the development of new discovery tools. We also explore the role of data standards and licensing in the context of data hosting and preservation. We provide five recommendations for the biodiversity community that will foster better data preservation and access: (1) encourage the community's use of data standards, (2) promote the public domain licensing of data, (3) establish a community of those involved in data hosting and archival, (4) establish hosting centers for biodiversity data, and (5) develop tools for data discovery. Conclusion: The community's adoption of standards and development of tools to enable data discovery is essential to sustainable data preservation. Furthermore, the increased adoption of open content licensing, the establishment of data hosting infrastructure and the creation of a data hosting and archiving community are all necessary steps towards the community ensuring that data archival policies become standardized. PMID:22373257

  10. Use of Schema on Read in Earth Science Data Archives

    NASA Astrophysics Data System (ADS)

    Petrenko, M.; Hegde, M.; Smit, C.; Pilone, P.; Pham, L.

    2017-12-01

    Traditionally, NASA Earth Science data archives have file-based storage using proprietary data file formats, such as HDF and HDF-EOS, which are optimized to support fast and efficient storage of spaceborne and model data as they are generated. The use of file-based storage essentially imposes an indexing strategy based on data dimensions. In most cases, NASA Earth Science data uses time as the primary index, leading to poor performance in accessing data in spatial dimensions. For example, producing a time series for a single spatial grid cell involves accessing a large number of data files. With exponential growth in data volume due to the ever-increasing spatial and temporal resolution of the data, using file-based archives poses significant performance and cost barriers to data discovery and access. Storing and disseminating data in proprietary data formats imposes an additional access barrier for users outside the mainstream research community. At the NASA Goddard Earth Sciences Data Information Services Center (GES DISC), we have evaluated applying the "schema-on-read" principle to data access and distribution. We used Apache Parquet to store geospatial data, and have exposed data through Amazon Web Services (AWS) Athena, AWS Simple Storage Service (S3), and Apache Spark. Using the "schema-on-read" approach allows customization of indexing—spatial or temporal—to suit the data access pattern. The storage of data in open formats such as Apache Parquet has widespread support in popular programming languages. A wide range of solutions for handling big data lowers the access barrier for all users. This presentation will discuss formats used for data storage, frameworks with support for "schema-on-read" used for data access, and common use cases covering data usage patterns seen in a geospatial data archive.
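
    A small illustration of the schema-on-read idea with Parquet (the file name, fields, and spatial filter are invented for this sketch): the stored file is just typed columns, and the spatial selection is applied at read time, independent of how the writer ordered the data.

      import pyarrow as pa
      import pyarrow.parquet as pq

      # Store gridded observations as columns rather than time-indexed files.
      table = pa.table({
          "time": [0, 0, 1, 1],
          "lat":  [10.0, 20.0, 10.0, 20.0],
          "lon":  [5.0, 5.0, 5.0, 5.0],
          "temp": [288.1, 287.4, 288.3, 287.9],
      })
      pq.write_table(table, "obs.parquet")

      # Schema-on-read: a spatial query (one grid cell's time series) is applied
      # at read time, with no dependence on how the writer ordered the data.
      cell = pq.read_table(
          "obs.parquet",
          filters=[("lat", "=", 10.0), ("lon", "=", 5.0)],
          columns=["time", "temp"],
      )
      print(cell.to_pydict())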

  11. Distributed Earth observation data integration and on-demand services based on a collaborative framework of geospatial data service gateway

    NASA Astrophysics Data System (ADS)

    Xie, Jibo; Li, Guoqing

    2015-04-01

    Earth observation (EO) data obtained by air-borne or space-borne sensors are heterogeneous and geographically distributed in storage. These data sources belong to different organizations or agencies whose data management and storage methods differ widely. Different data sources provide different data publishing platforms or portals. As more remote sensing sensors are used for Earth observation missions, space agencies have archived massive volumes of EO data in a distributed fashion. The distribution of EO data archives and the heterogeneity of the systems make it difficult to use geospatial data efficiently in many EO applications, such as hazard mitigation. To solve the interoperability problems of different EO data systems, an advanced architecture of distributed geospatial data infrastructure is introduced in this paper to address the complexity of integrating and processing, on demand, distributed and heterogeneous EO data. The concept and architecture of the geospatial data service gateway (GDSG) is proposed to connect heterogeneous EO data sources so that EO data can be retrieved and accessed through unified interfaces. The GDSG consists of a set of tools and services that encapsulate heterogeneous geospatial data sources into homogeneous service modules. The GDSG modules include EO metadata harvesters and translators, adaptors for different types of data system, unified data query and access interfaces, EO data cache management, and a gateway GUI. The GDSG framework is used to implement interoperability and synchronization between distributed EO data sources with heterogeneous architectures. An on-demand distributed EO data platform was developed to validate the GDSG architecture and implementation techniques. Several distributed EO data archives are used for testing. Floods and earthquakes serve as two scenarios for the use cases of distributed EO data integration and interoperability.

  12. Lunar Data Node: Apollo Data Restoration and Archiving Update

    NASA Technical Reports Server (NTRS)

    Williams, David R.; Hills, Howard K.; Guiness, Edward A.; Taylor, Patrick T.; McBride, Marie Julia

    2013-01-01

    The Lunar Data Node (LDN) of the Planetary Data System (PDS) is responsible for the restoration and archiving of Apollo data. The LDN is located at the National Space Science Data Center (NSSDC), which holds much of the extant Apollo data on microfilm, microfiche, hard-copy documents, and magnetic tapes in older formats. The goal of the restoration effort is to convert the data into user-accessible PDS formats, create a full set of explanatory supporting data (metadata), archive the full data sets through PDS, and post the data online at the PDS Geosciences Node. This will both enable easy use of the data by current researchers and ensure that the data and metadata are securely preserved for future use. We are also attempting to locate and preserve Apollo data which were never archived at NSSDC. We will give a progress report on the data sets we have been restoring and future work.

  14. Authenticated sensor interface device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Jody Rustyn; Poland, Richard W.

    A system and method for the secure storage and transmission of data is provided. A data aggregate device can be configured to receive secure data from a data source, such as a sensor, and encrypt the secure data using a suitable encryption technique, such as a shared private key technique, a public key encryption technique, a Diffie-Hellman key exchange technique, or other suitable encryption technique. The encrypted secure data can be provided from the data aggregate device to different remote devices over a plurality of segregated or isolated data paths. Each of the isolated data paths can include an optoisolator that is configured to provide one-way transmission of the encrypted secure data from the data aggregate device over the isolated data path. External data can be received through a secure data filter which, by validating the external data, allows for key exchange and other various adjustments from an external source.
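
    A minimal sketch of the shared-private-key option mentioned above, using symmetric encryption from the Python cryptography package. The sensor payload and key handling are illustrative; the patented device also covers key exchange and one-way optical isolation, which are not shown.

      from cryptography.fernet import Fernet

      # Shared private key: generated once and installed on both the data
      # aggregate device and the authorized remote device (illustrative only).
      key = Fernet.generate_key()
      cipher = Fernet(key)

      # Data aggregate device: encrypt a sensor reading before transmission.
      reading = b'{"sensor": "flow-7", "value": 3.14}'
      token = cipher.encrypt(reading)

      # The token would travel over a one-way (optically isolated) path.
      # Authorized remote device: decrypt with the shared key.
      assert cipher.decrypt(token) == reading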

  15. SDMS: A scientific data management system

    NASA Technical Reports Server (NTRS)

    Massena, W. A.

    1978-01-01

    SDMS is a data base management system developed specifically to support scientific programming applications. It consists of a data definition program to define the forms of data bases, and FORTRAN-compatible subroutine calls to create and access data within them. Each SDMS data base contains one or more data sets. A data set has the form of a relation. Each column of a data set is defined to be either a key or data element. Key elements must be scalar. Data elements may also be vectors or matrices. The data elements in each row of the relation form an element set. SDMS permits direct storage and retrieval of an element set by specifying the corresponding key element values. To support the scientific environment, SDMS allows the dynamic creation of data bases via subroutine calls. It also allows intermediate or scratch data to be stored in temporary data bases which vanish at job end.
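
    The element-set access pattern described above can be imitated in a few lines (a sketch only; SDMS itself was FORTRAN-callable, and the names below are invented): rows of a relation are stored and retrieved directly by the values of their key elements.

      # A "data set" as a relation: key elements -> element set (data elements).
      # Data elements may be scalars, vectors, or matrices, as in SDMS.
      dataset = {}

      def store(mach, alpha, pressures):
          """Store one element set under its key element values."""
          dataset[(mach, alpha)] = {"pressures": pressures}

      def retrieve(mach, alpha):
          """Direct retrieval by specifying the corresponding key values."""
          return dataset[(mach, alpha)]

      store(0.8, 2.0, [101.3, 99.8, 97.2])   # vector-valued data element
      print(retrieve(0.8, 2.0)["pressures"])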

  16. Good Data Can Be Better Data - How Data Management Maturity Can Help Repositories Improve Operations, Data Quality, And Usability, Helping Researchers

    NASA Astrophysics Data System (ADS)

    Stall, S.

    2015-12-01

    Much earth and space science data and metadata are managed and supported by an infrastructure of repositories, ranging from large agency or instrument facilities, to institutions, to smaller repositories including labs. Scientists face many challenges in this ecosystem, both in storing their data and in accessing data from others for new research. Critical for all uses is ensuring the credibility and integrity of the data, and conveying that information, along with provenance, now and in the future. Accurate information is essential for future researchers to find (or discover) the data, evaluate the data for use (content, temporal coverage, geolocation, precision) and finally select (or discard) that data as meeting a "fit-for-purpose" criterion. We also need to optimize the effort it takes to describe the data for these determinations, which means making description efficient for the researchers who collect the data. At AGU we are developing a program aimed at helping repositories, and thereby researchers, improve data quality and data usability toward these goals. AGU has partnered with the CMMI Institute to develop their Data Management Maturity (DMM) framework within the Earth and space sciences. The CMMI DMM framework guides best practices in a range of data operations, and its application, through an assessment, reveals how repositories and institutions can best optimize efforts to improve operations and functionality throughout the data lifecycle and elevate best practices across a variety of data management operations. Supporting processes like data operations, data governance, and data architecture are included. An assessment involves identifying accomplishments and weaknesses compared to leading practices for data management. Broad application of the DMM can help improve quality in data and operations, and consistency across the community that will facilitate interoperability, discovery, preservation, and reuse. Good data can be better data. Consistency results in sustainability.

  17. The Challenges of Data Quality Evaluation in a Joint Data Warehouse

    PubMed Central

    Bae, Charles J.; Griffith, Sandra; Fan, Youran; Dunphy, Cheryl; Thompson, Nicolas; Urchek, John; Parchman, Alandra; Katzan, Irene L.

    2015-01-01

    Introduction: The use of clinically derived data from electronic health records (EHRs) and other electronic clinical systems can greatly facilitate clinical research as well as operational and quality initiatives. One approach for making these data available is to incorporate data from different sources into a joint data warehouse. When using such a data warehouse, it is important to understand the quality of the data. The primary objective of this study was to determine the completeness and concordance of common types of clinical data available in the Knowledge Program (KP) joint data warehouse, which contains feeds from several electronic systems including the EHR. Methods: A manual review was performed of specific data elements for 250 patients from an EHR, and these were compared with corresponding elements in the KP data warehouse. Completeness and concordance were calculated for five categories of data including demographics, vital signs, laboratory results, diagnoses, and medications. Results: In general, data elements for demographics, vital signs, diagnoses, and laboratory results were present in more cases in the source EHR compared to the KP. When data elements were available in both sources, there was a high concordance. In contrast, the KP data warehouse documented a higher prevalence of deaths and medications compared to the EHR. Discussion: Several factors contributed to the discrepancies between data in the KP and the EHR—including the start date and frequency of data feed updates into the KP, inability to transfer data located in nonstructured formats (e.g., free text or scanned documents), as well as incomplete and missing data variables in the source EHR. Conclusion: When evaluating the quality of a data warehouse with multiple data sources, assessing completeness and concordance between data set and source data may be better than designating one to be a gold standard. This will allow the user to optimize the method and timing of data transfer in order to capture data with better accuracy. PMID:26290882

  18. Lessons learned from setting up the NOWESP research data base: Experiences in an interdisciplinary research project

    NASA Astrophysics Data System (ADS)

    Radach, Günther; Gekeler, Jens

    1996-09-01

    Research carried out within the framework of the MAST project NOWESP (North-West European Shelf Programme) was based on a multi-parameter data set of existing marine data, relevant for estimating trends, variability and fluxes on the Northwest European Shelf. The data sets were provided by the partners of the project. Additional data sets were obtained from several other institutions. During the project, the data were organized in the NOWESP Research Data Base (NRDB), for which a special data base scheme was defined that was capable of storing different types of marine data. Data products, like time series and interpolated fields, were provided to the partners for analysis (Radach et al. [1997]). After three years of project time, the feasibility of such an approach is discussed. Ways of optimizing data access and evaluation are proposed. A project-oriented Research Data Base is a useful tool because of its flexibility and proximity to the research being carried out. However, several requirements must be met to derive optimum benefits from this type of service unit. Since this task usually is carried out by a limited number of staff, an early start of project data management is recommended. To enable future projects to succeed in an analogous compilation of relevant data for their use, as performed in NOWESP, the task of organizing the data sets for any short-term project should be shared between a research data base group and a national or international data centre whose experience and software could be used. It must be ensured that only quality controlled data sets from the individual data-producing projects are delivered to the national data centres. It is recommended that data quality control should be performed by the originators and/or data centres before delivering any data sets to the research data base. Delivery of the (full) data sets should be checked and their quality should be approved by authorized data centres.

  19. The Frictionless Data Package: Data Containerization for Automated Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Shepherd, A.; Fils, D.; Kinkade, D.; Saito, M. A.

    2017-12-01

    As cross-disciplinary geoscience research increasingly relies on machines to discover and access data, one of the critical questions facing data repositories is how data and supporting materials should be packaged for consumption. Traditionally, data repositories have relied on a human's involvement throughout discovery and access workflows. This human could assess fitness for purpose by reading loosely coupled, unstructured information from web pages and documentation. In attempts to shorten the time to science and access data resources across many disciplines, the expectation that machines will mediate the process of discovery and access is challenging data repository infrastructure. The challenge is to find ways to deliver data and information that enable machines to make better decisions, by helping them understand the data and metadata of many data types. Additionally, once machines have recommended a data resource as relevant to an investigator's needs, the data resource should be easy to integrate into that investigator's toolkits for analysis and visualization. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) supports NSF-funded OCE and PLR investigators with their projects' data management needs. These needs involve a number of varying data types, some of which require multiple files with differing formats. Presently, BCO-DMO has described these data types and the important relationships between a type's data files through human-readable documentation on web pages. Machines directly accessing data files from BCO-DMO could overlook this documentation and misinterpret the data. Instead, BCO-DMO is exploring the idea of data containerization, or packaging data and related information for easier transport, interpretation, and use. In researching the landscape of data containerization, the Frictionless Data Package (http://frictionlessdata.io/) provides a number of valuable advantages over similar solutions. This presentation will focus on these advantages and how the Frictionless Data Package addresses a number of real-world use cases for data discovery, access, analysis and visualization.
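
    A Data Package is, at its core, a directory of data files plus a datapackage.json descriptor. The sketch below writes a minimal descriptor for one hypothetical CSV resource (the file and field names are invented; the full descriptor vocabulary is defined by the Frictionless Data specifications):

      import csv
      import json

      # A hypothetical data file for the package.
      with open("ctd_profiles.csv", "w", newline="") as f:
          w = csv.writer(f)
          w.writerow(["depth_m", "temperature_c"])
          w.writerow([10, 18.2])

      # Minimal datapackage.json: names the package and describes each resource,
      # including a table schema a machine can use to interpret the file.
      descriptor = {
          "name": "example-ctd-package",
          "resources": [{
              "name": "ctd_profiles",
              "path": "ctd_profiles.csv",
              "format": "csv",
              "schema": {
                  "fields": [
                      {"name": "depth_m", "type": "number"},
                      {"name": "temperature_c", "type": "number"},
                  ]
              },
          }],
      }

      with open("datapackage.json", "w") as f:
          json.dump(descriptor, f, indent=2)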

  20. New Data Services for Polar Investigators from Integrated Earth Data Applications (IEDA)

    NASA Astrophysics Data System (ADS)

    Nitsche, F. O.; Ferrini, V.; Morton, J. J.; Arko, R. A.; McLain, K.; O'hara, S. H.; Carbotte, S. M.; Lehnert, K. A.; IEDA Team, I.

    2013-12-01

    Accessibility and preservation of data are needed to support multi-disciplinary research in the key environmentally sensitive Polar Regions. IEDA (Integrated Earth Data Applications) is a community-based data facility funded by the US National Science Foundation (NSF) to support, sustain, and advance the geosciences by providing data services for observational solid earth data from the Ocean, Earth, and Polar Sciences. IEDA tools and services relevant to the Polar research community include the Antarctic and Southern Ocean Data System (ASODS), the U.S. Antarctic Program Data Coordination Center (USAP-DCC), GeoMapApp, as well as a number of services for sample-based data (SESAR and EarthChem). In addition to existing tools, which assist Polar investigators in archiving their data and creating DIF records for global searches in AMD, IEDA recently added several new tools and services that further support investigators throughout the data life cycle. These include a data management plan tool (http://www.iedadata.org/compliance/plan) and a data compliance reporting tool (http://www.iedadata.org/compliance/report) that help investigators comply with the requirements of funding agencies such as the National Science Foundation (NSF). Data, especially from challenging Polar Regions, are likely to be used by other scientists for future studies. Therefore, data acknowledgment is an important concern of many investigators. To encourage data acknowledgments by data users, we link references of publications (when known) to datasets and cruises registered within the ASODS system as part of our data curation services (http://www.marine-geo.org/portals/antarctic/references.php). In addition, IEDA offers a data publication service to register scientific data with DOIs, making data sets citable as publications with attribution to investigators as authors. IEDA is a publication agent of the DataCite consortium. Offering such services provides additional incentives for making data available through data centers. Such tools and services are important building blocks of a coherent and comprehensive (cyber) data support structure for Polar investigators.

  1. --No Title--

    Science.gov Websites

    [Fragment of script documentation: the user is responsible for controlling the quality of observational data; outputs are observational data (named data_obs) and model data (named data_model), stored in the expected locations under "data" and "figures" folders.]

  2. A Semi-Automated Workflow Solution for Data Set Publication

    DOE PAGES

    Vannan, Suresh; Beaty, Tammy W.; Cook, Robert B.; ...

    2016-03-08

    In order to address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOI) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products into an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and provides a summary of a workflow solution that ORNL DAAC research and technical staff have created to deal with publication of the diverse data products. Finally, the workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.

  3. Linked Data: Forming Partnerships at the Data Layer

    NASA Astrophysics Data System (ADS)

    Shepherd, A.; Chandler, C. L.; Arko, R. A.; Jones, M. B.; Hitzler, P.; Janowicz, K.; Krisnadhi, A.; Schildhauer, M.; Fils, D.; Narock, T.; Groman, R. C.; O'Brien, M.; Patton, E. W.; Kinkade, D.; Rauch, S.

    2015-12-01

    The challenges presented by big data are straining data management software architectures of the past. For smaller existing data facilities, the technical refactoring of software layers becomes costly to scale across the big data landscape. In response to these challenges, data facilities will need partnerships with external entities for improved solutions to tasks such as data cataloging, discovery and reuse, and data integration and processing with provenance. At its surface, the concept of linked open data suggests an uncalculated altruism. Yet, in his concept of five-star open data, Tim Berners-Lee explains the strategic costs and benefits of deploying linked open data from the perspective of its consumer and producer - a data partnership. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) addresses some of the emerging needs of its research community by partnering with groups doing complementary work and linking their respective data layers using linked open data principles. Examples will show how these links, explicit manifestations of partnerships, reduce technical debt and provide a swift flexibility for future considerations.
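
    A minimal sketch of what such a link between data layers can look like in practice, using the rdflib library; both dataset URIs are hypothetical, not real BCO-DMO or partner identifiers.

        from rdflib import Graph, URIRef, Literal
        from rdflib.namespace import OWL, DCTERMS

        g = Graph()
        # Hypothetical identifiers for two facilities' records of the same dataset.
        bco_dataset = URIRef("https://example.org/bco-dmo/dataset/12345")
        partner_dataset = URIRef("https://example.org/partner/record/abc")

        g.add((bco_dataset, DCTERMS.title, Literal("Example nutrient profile dataset")))
        # The explicit link between the two data layers - the "partnership" made concrete.
        g.add((bco_dataset, OWL.sameAs, partner_dataset))

        print(g.serialize(format="turtle"))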

  4. ClinData Express – A Metadata Driven Clinical Research Data Management System for Secondary Use of Clinical Data

    PubMed Central

    Li, Zuofeng; Wen, Jingran; Zhang, Xiaoyan; Wu, Chunxiao; Li, Zuogao; Liu, Lei

    2012-01-01

    Aiming to ease the secondary use of clinical data in clinical research, we introduce a metadata-driven, web-based clinical data management system named ClinData Express. ClinData Express is made up of two parts: 1) m-designer, a standalone application for metadata definition; and 2) a web-based data warehouse system for data management. With ClinData Express, all the researchers need to do is define the metadata and data model in m-designer. The web interface for data collection and the specific database for data storage are automatically generated. The standards used in the system and the data export module ensure data reuse. The system has been tested on seven disease data collections in Chinese and one form from dbGaP. The system's flexibility gives it great potential for use in clinical research. The system is available at http://code.google.com/p/clindataexpress. PMID:23304327
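
    The metadata-driven pattern the authors describe can be sketched in a few lines: a form definition is declared as metadata, and the storage schema is generated from it rather than hand-coded. This toy uses Python's sqlite3 with hypothetical field names; it is not ClinData Express code.

        import sqlite3

        # A form/table definition declared as metadata (hypothetical fields).
        form_metadata = {
            "table": "colorectal_followup",
            "fields": [
                {"name": "patient_id", "type": "TEXT"},
                {"name": "visit_date", "type": "TEXT"},
                {"name": "cea_level", "type": "REAL"},
            ],
        }

        # Generate the storage schema directly from the metadata definition.
        cols = ", ".join(f'{f["name"]} {f["type"]}' for f in form_metadata["fields"])
        ddl = f'CREATE TABLE IF NOT EXISTS {form_metadata["table"]} ({cols})'

        conn = sqlite3.connect(":memory:")
        conn.execute(ddl)  # the database schema now mirrors the metadata
        print(ddl)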

  5. A Semi-Automated Workflow Solution for Data Set Publication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vannan, Suresh; Beaty, Tammy W.; Cook, Robert B.

    In order to address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOI) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products into an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and provides a summary of a workflow solution that ORNL DAAC research and technical staff have created to deal with publication of the diverse data products. Finally, the workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.

  6. DataONE: Gateway to Earth and Environmental Data Repositories

    NASA Astrophysics Data System (ADS)

    Koskela, R.; Michener, W. K.; Vieglais, D.; Budden, A. E.

    2017-12-01

    DataONE (Data Observation Network for Earth) is a National Science Foundation DataNet project that enables universal access to data and helps researchers fulfill their data management needs while providing secure and permanent access to their data. DataONE offers the scientific community a suite of tools and training materials that cover all aspects of the data life cycle, from data collection to management, analysis and publication. Data repositories affiliated with DataONE are referred to as Member Nodes and represent large regional, national and international research networks, agencies, and other institutions. As part of the DataONE Federation, the repositories gain access to a range of value-added services to support their users. These services include usage tracking and reporting, content replication, and the ability to register the services created by the repository. In addition, DataONE and the California Digital Library manage ONEShare, a repository that accepts content submitted through Dash, a platform allowing researchers to easily describe, deposit and share their research data.

  7. [Contemplation on the application of big data in clinical medicine].

    PubMed

    Lian, Lei

    2015-01-01

    Medicine is another area where big data is being used. The link between clinical treatment and outcome is the key step when applying big data in medicine. In the era of big data, it is critical to collect complete outcome data. Patient follow-up, comprehensive integration of data resources, quality control and standardized data management are the predominant approaches to avoid missing data and data islands. Therefore, establishing systematic patient follow-up protocols and prospective data management strategies are important aspects of big data in medicine.

  8. Data Prospecting Framework - a new approach to explore "big data" in Earth Science

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Rushing, J.; Lin, A.; Kuo, K.

    2012-12-01

    Due to advances in sensors, computation and storage, the cost and effort required to produce large datasets have been significantly reduced. As a result, we are seeing a proliferation of large-scale data sets being assembled in almost every science field, especially in the geosciences. Opportunities to exploit the "big data" are enormous, as new hypotheses can be generated by combining and analyzing large amounts of data. However, such a data-driven approach to science discovery assumes that scientists can find and isolate relevant subsets from vast amounts of available data. Current Earth Science data systems only provide data discovery through simple metadata and keyword-based searches and are not designed to support data exploration capabilities based on the actual content. Consequently, scientists often find themselves downloading large volumes of data, struggling with large amounts of storage and learning new analysis technologies that will help them separate the wheat from the chaff. New mechanisms of data exploration are needed to help scientists discover the relevant subsets. We present data prospecting, a new content-based data analysis paradigm to support data-intensive science. Data prospecting allows researchers to explore big data to determine and isolate data subsets for further analysis. This is akin to geo-prospecting, in which mineral sites of interest are determined over the landscape through screening methods. The resulting "data prospects" only provide an interaction with and feel for the data through first-look analytics; the researchers would still have to download the relevant datasets and analyze them deeply using their favorite analytical tools to determine whether the datasets will yield new hypotheses. Data prospecting combines two traditional categories of data analysis, data exploration and data mining, within the discovery step. Data exploration utilizes manual/interactive methods for data analysis, such as standard statistical analysis and visualization, usually on small datasets. On the other hand, data mining utilizes automated algorithms to extract useful information. Humans guide these automated algorithms and specify algorithm parameters (training samples, clustering size, etc.). Data prospecting combines these two approaches using high performance computing and new techniques for efficient distributed file access.

  9. The Environmental Data Initiative data repository: Trustworthy practices that foster preservation, fitness, and reuse for environmental and ecological data

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.

    2017-12-01

    The Environmental Data Initiative (EDI) is an outgrowth of more than 30 years of information management experience and technology from LTER Network data practitioners. EDI builds upon the PASTA data repository software used by the LTER Network Information System and manages more than 42,000 data packages containing tabular data, imagery, and other formats. Development of the repository was a community process beginning in 2009 that included numerous working groups for generating use cases, system requirements, and testing of completed software, thereby creating a vested interest in its success and transparency in design. All software is available for review on GitHub, and refinements and new features are ongoing. Documentation is also available on Read the Docs, including a comprehensive description of all web-service API methods. PASTA is metadata driven and uses the Ecological Metadata Language (EML) standard for describing environmental and ecological data; a simplified Dublin Core document is also available for each data package. Data are aggregated into packages consisting of metadata and other related content described by an OAI-ORE document. Once archived, each data package becomes immutable and permanent; updates are possible through the addition of new revisions. Components of each data package are accessible through a unique identifier, while the entire data package receives a DOI that is registered in DataCite. Preservation occurs through a combination of DataONE synchronization/replication and a series of local and remote backup strategies, including daily uploads to AWS Glacier storage. Checksums are computed for all data at initial upload, with random verification occurring on a continuous basis, thus ensuring the integrity of the data. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML before they are archived; data packages that fail any test are rejected by the repository. These tests are a measure of data fitness, which ultimately increases confidence in data reuse and synthesis. The EDI data repository is recognized by multiple organizations, including EarthCube's Council of Data Facilities, the United States Geological Survey, FAIRsharing.org, and re3data.org, and is a PLOS- and Nature-recommended data repository.
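
    The fixity checking described here (checksums at upload, random re-verification later) can be sketched generically. The snippet below is a minimal stand-in, not PASTA's actual implementation, and assumes SHA-256 as the digest.

        import hashlib
        from pathlib import Path

        def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
            """Stream a file through SHA-256 so large data files never load fully into memory."""
            digest = hashlib.sha256()
            with path.open("rb") as f:
                for chunk in iter(lambda: f.read(chunk_size), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        def verify(path: Path, recorded_checksum: str) -> bool:
            """Compare a freshly computed checksum against the one stored at upload time."""
            return sha256_of(path) == recorded_checksum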

  10. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The Quarterly Environmental Data Summary (QEDS) for the fourth quarter of 1997 is prepared in support of the Weldon Spring Site Remedial Action Project Federal Facilities Agreement. The data presented constitute the QEDS. The data were received from the contract laboratories, verified by the Weldon Spring Site verification group and, except for air monitoring data and site KPA-generated data (uranium analyses), merged into the data base during the fourth quarter of 1997. Air monitoring data presented are the most recent complete sets of quarterly data. Air data are not stored in the data base, and KPA data are not merged into the regular data base. Significant data, defined as data values that have exceeded defined "above normal" level 2 values, are discussed in this letter for Environmental Monitoring Plan (EMP) generated data only. Above normal level 2 values are based, in ES and H procedures, on historical high values, DOE Derived Concentration Guides (DCGs), NPDES limits and other guidelines. The procedures also establish actions to be taken in response to such data. Data received and verified during the fourth quarter were within a permissible range of variability except for those which are detailed.

  12. 78 FR 60003 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... month, per display application per Data Set \\5\\ of Real-Time \\6\\ TRACE transaction data. The fee waiver... data available in three Data Sets--the Corporate Bond Data Set, the Agency Data Set and the ABS Data Set. A fourth Data Set, the Rule 144A Data Set, will become available in 2014. See Securities Exchange...

  13. Attaining and maintaining data integrity with configuration management

    NASA Astrophysics Data System (ADS)

    Huffman, Dorothy J.; Jeane, Shirley A.

    1993-08-01

    Managers and scientists are concerned about data integrity because they draw conclusions from data that can have far-reaching effects. Project managers use Configuration Management to ensure that hardware, software, and project information are controlled. They have not, as yet, applied it rigorously to data. However, there is ample opportunity in the data collection and production process to jeopardize data integrity. Environmental changes, tampering and production problems can all affect data integrity. There are four functions included in the Configuration Management process: configuration identification, control, auditing and status accounting. These functions provide management the means to attain data integrity and the visibility into engineering processes needed to maintain data integrity. When project managers apply Configuration Management processes to data, the data user can trace back through history to validate data integrity. The user knows that the project allowed only orderly changes to the data. He is assured that project personnel followed procedures to maintain data quality. He also has access to status information about the data. The user receives data products with a known integrity level and a means to assess the impact of past events on the conclusions derived from the data. To obtain these benefits, project managers should apply the Configuration Management discipline to data.

  14. Metadata Repository for Improved Data Sharing and Reuse Based on HL7 FHIR.

    PubMed

    Ulrich, Hannes; Kock, Ann-Kristin; Duhm-Harbeck, Petra; Habermann, Jens K; Ingenerf, Josef

    2016-01-01

    Unreconciled data structures and formats are a common obstacle to the urgently required sharing and reuse of data within healthcare and medical research. Within the North German Tumor Bank of Colorectal Cancer, clinical and sample data, based on a harmonized data set, is collected and can be pooled by using a hospital-integrated Research Data Management System supporting biobank and study management. Adding further partners who are not using the core data set requires manual adaptations and mapping of data elements. To address this manual intervention, and focusing on the reuse of heterogeneous healthcare instance data (value level) and data elements (metadata level), a metadata repository has been developed. The metadata repository is an ISO 11179-3 conformant server application built for annotating and mediating data elements. The implemented architecture includes the translation of metadata information about data elements into the FHIR standard, using the FHIR DataElement resource with the ISO 11179 Data Element Extensions. The FHIR-based processing allows exchange of data elements with clinical and research IT systems as well as with other metadata systems. With increasingly annotated and harmonized data elements, data quality and integration can be improved, successfully enabling data analytics and decision support.

  15. Pilot climate data system user's guide

    NASA Technical Reports Server (NTRS)

    Reph, M. G.; Treinish, L. A.; Bloch, L.

    1984-01-01

    Instructions for using the Pilot Climate Data System (PCDS), an interactive, scientific data management system for locating, obtaining, manipulating, and displaying climate-research data, are presented. The PCDS currently provides this support for approximately twenty data sets. Figures that illustrate the terminal displays which a user sees when he/she runs the PCDS, along with some examples of the output from this system, are included. The capabilities, which are described in detail, allow a user to perform the following: (1) obtain comprehensive descriptions of a number of climate parameter data sets and the associated sensor measurements from which they were derived; (2) obtain detailed information about the temporal coverage and data volume of data sets which are readily accessible via the PCDS; (3) extract portions of a data set using criteria such as time range and geographic location, and output the data to tape, user terminal, system printer, or online disk files in a special data-set-independent format; (4) access and manipulate the data in these data-set-independent files, performing such functions as combining the data, subsetting the data, and averaging the data; and (5) create various graphical representations of the data stored in the data-set-independent files.

  16. Open Access to Geophysical Data

    NASA Astrophysics Data System (ADS)

    Sergeyeva, Nataliya A.; Zabarinskaya, Ludmila P.

    2017-04-01

    The Russian World Data Centers for Solar-Terrestrial Physics & Solid Earth Physics, hosted by the Geophysical Center of the Russian Academy of Sciences, are Regular Members of the ICSU World Data System. Guided by the principles of the WDS Constitution and the WDS Data Sharing Principles, the WDCs provide full and open access to data, long-term data stewardship, compliance with agreed-upon data standards and conventions, and mechanisms to facilitate and improve access to data. Historical and current geophysical data on different media, in the form of digital data sets, analog records, collections of maps, and descriptions, are stored and collected in the Centers. The WDCs regularly add new data to their repositories and databases and keep them up to date. The WDCs now focus on four new projects, aimed at: increasing the data available online through retrospective data collection and digital preservation; creating a modern system for registering and publishing data with digital object identifier (DOI) assignment, and promoting a data citation culture; creating databases instead of file systems for more convenient access to data; and participating in the WDS Metadata Catalogue and Data Portal by creating metadata for the information resources of the WDCs.

  17. [Infrastructure and contents of clinical data management plan].

    PubMed

    Shen, Tong; Xu, Lie-dong; Fu, Hai-jun; Liu, Yan; He, Jia; Chen, Ping-yan; Song, Yu-fei

    2015-11-01

    Establishment of a quality management system (QMS) plays a critical role in clinical data management (CDM). The objectives of CDM are to ensure the quality and integrity of the trial data. Thus, every stage or element that may impact the quality outcomes of clinical studies should be kept under control; this applies to the full life cycle of CDM, from data collection and handling through statistical analysis of trial data. Based on the QMS, this paper provides consensus on how to develop a compliant clinical data management plan (CDMP). According to the essential requirements of CDM, the CDMP should encompass each process of data collection, data capture and cleaning, medical coding, data verification and reconciliation, database monitoring and management, external data transmission and integration, data documentation, data quality assurance, and so on. Creating and following up a data management plan at each designed data management step, and dynamically recording the systems used, actions taken, and parties involved, will build and confirm regulated data management processes, standard operating procedures and effective quality metrics across all data management activities. The CDMP is one of the most important data management documents and is the solid foundation for clinical data quality.

  18. Ascertaining severe perineal trauma and associated risk factors by comparing birth data with multiple sources.

    PubMed

    Ampt, Amanda J; Ford, Jane B

    2015-09-30

    Population data are often used to monitor severe perineal trauma trends and investigate risk factors. Within New South Wales (NSW), two different datasets can be used, the Perinatal Data Collection ('birth' data) or a linked dataset combining birth data with the Admitted Patient Data Collection ('hospital' data). Severe perineal trauma can be ascertained by birth data alone, or by hospital International Classification of Diseases Australian Modification (ICD-10-AM) diagnosis and procedure coding in the linked dataset. The aim of this study was to compare rates and risk factors for severe perineal trauma using birth data alone versus using linked data. The study population consisted of all vaginal births in NSW between 2001 and 2011. Perineal injury coding in birth data was revised in 2006, so data were analysed separately for 2001-06 and 2006-11. Rates of severe perineal injury over time were compared in birth data alone versus linked data. Kappa and agreement statistics were calculated. Risk factor distributions (maternal age, primiparity, instrumental birth, birthweight ≥4 kg, Asian country of birth and episiotomy) were compared between women with severe perineal trauma identified by birth data alone, and those identified by linked data. Multivariable logistic regression was used to calculate the adjusted odds ratios (aORs) of severe perineal trauma. Among 697 202 women with vaginal births, 2.1% were identified with severe perineal trauma by birth data alone, and 2.6% by linked data. The rate discrepancy was higher among earlier data (1.7% for birth data, 2.4% for linked data). Kappa for earlier data was 0.78 (95% CI 0.78, 0.79), and 0.89 (95% CI 0.89, 0.89) for more recent data. With the exception of episiotomy, differences in risk factor distributions were small, with similar aORs. The aOR of severe perineal trauma for episiotomy was higher using linked data (1.33, 95% CI 1.27, 1.40) compared with birth data (1.02, 95% CI 0.97, 1.08). Although discrepancies in ascertainment of severe perineal trauma improved after revision of birth data coding in 2006, higher ascertainment by linked data was still evident for recent data. There were also higher risk estimates of severe perineal trauma with episiotomy by linked data than by birth data.
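
    For readers unfamiliar with the agreement statistic used here, Cohen's kappa compares the observed agreement between the two ascertainment sources with the agreement expected by chance. A minimal implementation follows, with purely illustrative counts, not the study's data.

        def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
            """Cohen's kappa for a 2x2 agreement table:
               a = both sources positive, b = source 1 only,
               c = source 2 only,         d = both sources negative.
            """
            n = a + b + c + d
            p_observed = (a + d) / n
            # Chance agreement from the marginal proportions of each source.
            p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
            return (p_observed - p_expected) / (1 - p_expected)

        # Illustrative numbers only (not the study's counts): kappa ~ 0.855.
        print(round(cohens_kappa(a=1500, b=200, c=300, d=98000), 3))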

  19. Supporting Data Stewardship Throughout the Data Life Cycle in the Solid Earth Sciences

    NASA Astrophysics Data System (ADS)

    Ferrini, V.; Lehnert, K. A.; Carbotte, S. M.; Hsu, L.

    2013-12-01

    Stewardship of scientific data is fundamental to enabling new data-driven research and ensures preservation, accessibility, and quality of the data; yet researchers, especially in disciplines that typically generate and use small but complex, heterogeneous, and unstructured datasets, are challenged to fulfill the increasing demands of properly managing their data. The IEDA Data Facility (www.iedadata.org) provides tools and services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds upon and brings together over a decade of development and experiences of its component data systems, the Marine Geoscience Data System (MGDS, www.marine-geo.org) and EarthChem (www.earthchem.org). IEDA services include domain-focused data curation and synthesis, tools for data discovery, access, visualization and analysis, as well as investigator support services that include tools for data contribution, data publication services, and data compliance support. IEDA data synthesis efforts (e.g. PetDB and the Global Multi-Resolution Topography (GMRT) Synthesis) focus on data integration and analysis while emphasizing provenance and attribution. IEDA's domain-focused data catalogs (e.g. MGDS and the EarthChem Library) provide access to metadata-rich long-tail data complemented by extensive metadata, including attribution information and links to related publications. IEDA's visualization and analysis tools (e.g. GeoMapApp) broaden access to earth science data for domain specialists and non-specialists alike, facilitating both interdisciplinary research and education and outreach efforts. As a disciplinary data repository, a key role IEDA plays is to coordinate with its user community and to bridge the requirements and standards for data curation with both the evolving needs of its science community and emerging technologies. Development of IEDA tools and services is based first and foremost on the scientific needs of its user community. As data stewardship becomes a more integral component of the scientific workflow, IEDA investigator support services (e.g. the Data Management Plan Tool and Data Compliance Reporting Tool) continue to evolve with the goal of lessening the 'burden' of data management for individual investigators by increasing awareness and facilitating the adoption of data management practices. We will highlight a variety of IEDA system components that support investigators throughout the data life cycle, and will discuss lessons learned and future directions.

  20. The Role of Interdisciplinary GIS and Data Curation Librarians in Enhancing Authentic Scientific Research in the Classroom

    NASA Astrophysics Data System (ADS)

    Branch, B. D.; Fosmire, M.

    2012-12-01

    Data science is a recently evolved area of scientific inquiry in which data, often collected by others, is analyzed by independent investigators to draw new conclusions. As such, data literacy needs to be incorporated into authentic research activities. The earth sciences in particular have a trove of data that resides in national data centers as well as individual investigators' labs, which can be repurposed to provide the inputs for students to make their own inquiries into the data. With the amount of data available, students can draw more substantive conclusions than if they relied only on data they have collected themselves. A new scientific role is that of the data scientist or data curation specialist. This person understands best practices in data and knowledge management and can translate those skills into an environment appropriate for K-20 students and teachers. In particular, data curation specialists can transform raw data into audience-appropriate data that can be re-used. First, appropriate research data can be located, as well as foundational or baseline data (topography, political maps, etc.), and that data needs to be converted (or directions for conversion supplied) so that it can be ingested into the processing system used for the activity. Furthermore, data needs to be organized, especially as it is processed by students, with multiple versions of the data created along the way. Data should also be appropriately annotated to allow for effective sharing among students and for determining the reproducibility of the data. Finally, appropriate visualization of the data can be facilitated by a data curation specialist. To provide a concrete example, one of the authors developed a data-driven authentic research project for a group of middle school students looking at water quality in a North Carolina community. Students needed to find relevant hydrologic, environmental, and political data as inputs for their project. They then collected local data to add to the standard data, so they could build a profile of water quality over time. Once the data had been appropriately collected, processed, and added, students could then develop queries to run against the data to evaluate their research questions. Simple statistical analysis was then run to determine the validity of their conclusions, and finally, presentations were developed to explain their results. Furthermore, students were empowered to connect the results of the research project to suggest policy changes for their community.

  1. Analysis of Human Mobility Based on Cellular Data

    NASA Astrophysics Data System (ADS)

    Arifiansyah, F.; Saptawati, G. A. P.

    2017-01-01

    Nowadays not only adults but even teenagers and children have their own mobile phones. This phenomenon indicates that the mobile phone has become an important part of everyday life. Accordingly, the amount of cellular data has also increased rapidly. Cellular data is defined as the data that records communication among mobile phone users. Cellular data is easy to obtain because telecommunications companies already record it for their billing systems. Billing data keeps a log of each user's cellular activity over time, from which we can obtain information about communication between users. Through data visualization, interesting patterns can be seen in the raw cellular data, giving users prior knowledge with which to perform data analysis. Cellular data can be processed using data mining to find human mobility patterns in the existing data. In this paper, we use frequent pattern mining and association rule discovery to observe the relations between attributes in cellular data and then visualize them. We used the Weka toolkit to find the rules in the data mining stage. Generally, the utilization of cellular data can provide supporting information for the decision-making process and supply the solutions and information needed by decision makers.
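
    The paper runs its frequent-pattern and association-rule mining in Weka; the same step can be sketched in Python with the mlxtend library (assumed available) on a toy one-hot table whose column names are hypothetical, not the paper's actual schema.

        import pandas as pd
        from mlxtend.frequent_patterns import apriori, association_rules

        # Toy one-hot records standing in for discretized billing attributes.
        records = pd.DataFrame(
            [
                {"night_call": True,  "long_duration": True,  "weekend": False},
                {"night_call": True,  "long_duration": True,  "weekend": True},
                {"night_call": False, "long_duration": False, "weekend": True},
                {"night_call": True,  "long_duration": True,  "weekend": False},
            ]
        )

        # Frequent itemsets above a support threshold, then rules above a
        # confidence threshold - the two stages of classic association mining.
        frequent = apriori(records, min_support=0.5, use_colnames=True)
        rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
        print(rules[["antecedents", "consequents", "support", "confidence"]])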

  2. An overview on integrated data system for archiving and sharing marine geology and geophysical data in Korea Institute of Ocean Science & Technology (KIOST)

    NASA Astrophysics Data System (ADS)

    Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa

    2016-04-01

    We established and operate an integrated data system for managing, archiving and sharing marine geology and geophysical data around Korea, produced from various research projects and programs in the Korea Institute of Ocean Science & Technology (KIOST). First of all, to keep the data system consistent under continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and conversion, data quality control, data uploading, DB maintenance, etc. The system comprises two databases: ARCHIVE DB and GIS DB. ARCHIVE DB stores archived data in the original forms and formats supplied by data providers, while GIS DB manages all other compiled, processed and derived data and information for data services and GIS application services. The relational database management system Oracle 11g was adopted as the DBMS, and open-source GIS technologies were applied for GIS services, such as OpenLayers for the user interface, GeoServer for the application server, and PostGIS and PostgreSQL for the GIS database. For convenient use of geophysical data in SEG-Y format, a viewer program was developed and embedded in the system. Users can search data through the GIS user interface and save the results as a report.

  3. Oceanographic Data in Europe: Minimal Effort for Data Providers, Maximal Ease of Use and Access for Data Users

    NASA Astrophysics Data System (ADS)

    De Bruin, T.

    2017-12-01

    SeaDataCloud/SeaDataNet (SDC/SDN) is both a consortium and a data infrastructure, as well as a (series of) European oceanographic data management project(s), allowing data providers to store data at a data centre of their choice (usually a type of National Oceanographic Data Center), while exposing and making the data available for download via a chain of interconnected data portals at local, regional, pan-European and global levels. SDC/SDN as an infrastructure connects over 100 data centers from 35 countries in and around Europe. The infrastructure has been operational since early 2009 and provides the user with an overview of all available data as well as the possibility to download the data in a uniform format. This presentation will give a short introduction to the SDC/SDN infrastructure and describe how its development was based on sound data management principles. The emphasis will be on how the system is interconnected with other, non-discipline-specific (metadata) portals such as the Group of Earth Observations System of Systems (GEOSS), allowing oceanographic data stored at a local level in a data centre to be exposed at a global level to a wide audience from various disciplines.

  4. Apparatus And Method For Reconstructing Data Using Cross-Parity Stripes On Storage Media

    DOEpatents

    Hughes, James Prescott

    2003-06-17

    An apparatus and method for reconstructing missing data using cross-parity stripes on a storage medium is provided. The apparatus and method may operate on data symbols having sizes greater than a data bit. The apparatus and method makes use of a plurality of parity stripes for reconstructing missing data stripes. The parity symbol values in the parity stripes are used as a basis for determining the value of the missing data symbol in a data stripe. A correction matrix is shifted along the data stripes, correcting missing data symbols as it is shifted. The correction is performed from the outside data stripes towards the inner data stripes to thereby use previously reconstructed data symbols to reconstruct other missing data symbols.
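
    As background for how parity stripes enable reconstruction in general, the toy below uses simple single-parity XOR across byte symbols (not the patented cross-parity scheme itself): any one missing data stripe can be rebuilt by XOR-ing the surviving stripes with the parity stripe.

        from functools import reduce

        def xor_bytes(stripes):
            """XOR the corresponding symbols (bytes) of several equal-length stripes."""
            return bytes(reduce(lambda x, y: x ^ y, symbols) for symbols in zip(*stripes))

        data_stripes = [b"\x01\x02\x03", b"\x10\x20\x30", b"\x0a\x0b\x0c"]
        parity = xor_bytes(data_stripes)             # computed when the data is written

        lost_index = 1                               # pretend stripe 1 is unreadable
        survivors = [s for i, s in enumerate(data_stripes) if i != lost_index]
        recovered = xor_bytes(survivors + [parity])  # XOR of survivors and parity

        assert recovered == data_stripes[lost_index]
        print(recovered.hex())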

  5. Bad data packet capture device

    DOEpatents

    Chen, Dong; Gara, Alan; Heidelberger, Philip; Vranas, Pavlos

    2010-04-20

    An apparatus and method for capturing data packets for analysis on a network computing system includes a sending node and a receiving node connected by a bi-directional communication link. The sending node sends a data transmission to the receiving node on the bi-directional communication link, and the receiving node receives the data transmission and verifies the data transmission to determine valid data and invalid data and verify retransmissions of invalid data as corresponding valid data. A memory device communicates with the receiving node for storing the invalid data and the corresponding valid data. A computing node communicates with the memory device and receives and performs an analysis of the invalid data and the corresponding valid data received from the memory device.

  6. Data pre-processing in record linkage to find the same companies from different databases

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Lubis, M. S.; Arisandi, D.; Azzahry, B.

    2018-03-01

    As public agencies, the Badan Pelayanan Perizinan Terpadu (BPPT) and the Badan Lingkungan Hidup (BLH) of Medan city manage the process through which the public obtains business licenses. However, each agency might hold different corporate data because of separate data input processes, even though the records may refer to the same company. Therefore, it is necessary to identify and correlate records that refer to the same company across the different data sources. This research focuses on data pre-processing, including data cleaning, text pre-processing, indexing and record comparison. In addition, this research implements data matching using a support vector machine algorithm, whose output is used for record linkage: identifying and connecting company records based on their degree of similarity. Data are first standardized into the formats and structures appropriate to each pre-processing stage. After analyzing the data pre-processing, we found that neither database structure was designed to support data integration. We decided that data matching could be done with blocking criteria such as the company name and the name of the owner (or applicant). Data classification ultimately identified 90 pairs of records with a high level of similarity.
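
    A minimal sketch of the blocking-plus-comparison step using only Python's standard library; the records are invented, and the SVM classification the paper applies afterwards is omitted.

        from difflib import SequenceMatcher
        from itertools import product

        # Invented records standing in for the two agencies' databases.
        db_a = [{"id": "A1", "company": "pt sinar jaya", "owner": "budi santoso"}]
        db_b = [{"id": "B7", "company": "pt. sinar jaya", "owner": "budi santoso"},
                {"id": "B9", "company": "cv maju makmur", "owner": "siti rahayu"}]

        def block_key(rec):
            # Block on the first token of the cleaned company name, so most
            # record pairs are never compared at all.
            return rec["company"].replace(".", "").split()[0]

        def similarity(x, y):
            return SequenceMatcher(None, x, y).ratio()

        for a, b in product(db_a, db_b):
            if block_key(a) != block_key(b):
                continue  # different blocks: skip the expensive comparison
            score = min(similarity(a["company"], b["company"]),
                        similarity(a["owner"], b["owner"]))
            if score > 0.85:
                print(a["id"], "matches", b["id"], f"(score={score:.2f})")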

  7. DataUp: Helping manage and archive data within the researcher's workflow

    NASA Astrophysics Data System (ADS)

    Strasser, C.

    2012-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. We have developed an open-source add-in for Excel and an open-source web application intended to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. The researcher does not need a prior relationship with a data repository to use DataUp; the newly implemented ONEShare repository, a DataONE member node, is available for any researcher to archive and share their data. By meeting researchers where they already work, in spreadsheets, DataUp becomes part of the researcher's workflow, and data management and sharing become easier. Future enhancement of DataUp will rely on members of the community adopting and adapting the DataUp tools to meet their unique needs, including connecting to analytical tools, adding new metadata schemas, and expanding the list of connected data repositories. DataUp is a collaborative project between Microsoft Research Connections, the University of California's California Digital Library, the Gordon and Betty Moore Foundation, and DataONE.

  8. Multi-protocol header generation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, David A.; Ignatowski, Michael; Jayasena, Nuwan

    A communication device includes a data source that generates data for transmission over a bus, and a data encoder that receives and encodes outgoing data. An encoder system receives outgoing data from a data source and stores the outgoing data in a first queue. An encoder encodes outgoing data with a header type that is based upon a header type indication from a controller and stores the encoded data, which may be a packet or a data word with at least one layered header, in a second queue for transmission. The device is configured to receive, at a payload extractor, a packet protocol change command from the controller, to remove the encoded data, and to re-encode the data to create a re-encoded data packet, placing the re-encoded data packet in the second queue for transmission.

  9. SeaDataCloud - further developing the pan-European SeaDataNet infrastructure for marine and ocean data management

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Fichaut, Michele

    2017-04-01

    SeaDataCloud marks the third phase of developing the pan-European SeaDataNet infrastructure for marine and ocean data management. The SeaDataCloud project is funded by the EU and runs for 4 years from 1st November 2016. It succeeds the successful SeaDataNet II (2011-2015) and SeaDataNet (2006-2011) projects. SeaDataNet has set up and operates a pan-European infrastructure for managing marine and ocean data and is undertaken by National Oceanographic Data Centres (NODCs) and oceanographic data focal points from 34 coastal states in Europe. The infrastructure comprises a network of interconnected data centres and the central SeaDataNet portal. The portal provides users a harmonised set of metadata directories and controlled access to the large collections of datasets managed by the interconnected data centres. The population of the directories has increased considerably through cooperation with and involvement in many associated EU projects and initiatives such as EMODnet. SeaDataNet at present gives an overview of, and access to, more than 1.9 million data sets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 100 connected data centres in 34 countries riparian to European seas. SeaDataNet is also active in setting and governing marine data standards, and in exploring and establishing interoperability solutions to connect to other e-infrastructures on the basis of standards of ISO (19115, 19139) and OGC (WMS, WFS, CS-W and SWE). Standards and associated SeaDataNet tools are made available at the SeaDataNet portal for wide uptake by data handling and managing organisations. SeaDataCloud aims at further developing standards, innovating services & products, adopting new technologies, and giving more attention to users. Moreover, it implements a cooperation between the SeaDataNet consortium of marine data centres and the EUDAT consortium of e-infrastructure service providers. SeaDataCloud aims at considerably advancing services and increasing their usage by adopting cloud and High Performance Computing technology. SeaDataCloud will empower researchers with a packaged collection of services and tools, tailored to their specific needs, supporting research and enabling the generation of added-value products from marine and ocean data. Substantial activities will be focused on developing added-value services, such as data subsetting, analysis, visualisation, and publishing workflows for users, both regular and advanced, as part of a Virtual Research Environment (VRE). SeaDataCloud targets a number of leading user communities that pose new challenges for upgrading and expanding the SeaDataNet standards and services: science, EMODnet, the Copernicus Marine Environmental Monitoring Service (CMEMS) and EuroGOOS, and international scientific programmes. The presentation will give information on the present services of the SeaDataNet infrastructure, the new challenges in SeaDataCloud, and a number of key achievements in SeaDataCloud so far.

  10. Simulation of EO-1 Hyperion Data from ALI Multispectral Data Based on the Spectral Reconstruction Approach

    PubMed Central

    Liu, Bo; Zhang, Lifu; Zhang, Xia; Zhang, Bing; Tong, Qingxi

    2009-01-01

    Data simulation is widely used in remote sensing to produce imagery for a new sensor in the design stage, to address scale issues in some special applications, or to test novel algorithms. Hyperspectral data can provide more abundant information than traditional multispectral data and thus greatly extend the range of remote sensing applications. Unfortunately, hyperspectral data are much more difficult and expensive to acquire and were not available prior to the development of operational hyperspectral instruments, while large amounts of multispectral data have been accumulated around the world over the past several decades. Therefore, it is reasonable to examine means of using these multispectral data to simulate or construct hyperspectral data, especially in situations where hyperspectral data are necessary but hard to acquire. Here, a method based on spectral reconstruction is proposed to simulate hyperspectral data (Hyperion data) from multispectral Advanced Land Imager (ALI) data. This method involves extracting the inherent information of the source data and reassigning it to the newly simulated data. A total of 106 bands of Hyperion data were simulated from ALI data covering the same area. To evaluate this method, we compare the simulated and original Hyperion data by visual interpretation, statistical comparison, and classification. The results generally showed good performance of this method and indicated that most bands were well simulated, with the information both preserved and presented well. This makes it possible to simulate hyperspectral data from multispectral data to test the performance of algorithms, extend the use of multispectral data and assist the design of virtual sensors. PMID:22574064
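
    As a toy illustration of deriving many narrow bands from a few broad ones (the paper's spectral-reconstruction approach is considerably more sophisticated than this), one can interpolate a pixel's multispectral band values, at assumed center wavelengths, onto a denser set of simulated band centers.

        import numpy as np

        # Assumed ALI-like band centers (nm) and one pixel's reflectances;
        # both are illustrative values, not the paper's data.
        ms_centers = np.array([480.0, 560.0, 660.0, 790.0, 1650.0])
        ms_reflectance = np.array([0.08, 0.12, 0.10, 0.35, 0.22])

        # Simulated narrow-band centers every 10 nm, filled by interpolation.
        hs_centers = np.arange(450.0, 1651.0, 10.0)
        hs_reflectance = np.interp(hs_centers, ms_centers, ms_reflectance)

        print(hs_centers.shape, hs_reflectance[:5])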

  11. Big Data and medicine: a big deal?

    PubMed

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  12. Nursing Needs Big Data and Big Data Needs Nursing.

    PubMed

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Data science methods that are emerging ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  13. Restoration of Apollo Data for Future Lunar Exploration

    NASA Astrophysics Data System (ADS)

    Schultz, Alfred B.; Williams, D. R.; Hills, H. K.

    2007-10-01

    The Lunar Data Project (LDP) at NASA's National Space Science Data Center (NSSDC) is retrieving and restoring relevant, scientifically important Apollo data into accessible digital form for use by researchers and mission planners. Much of the Apollo data housed at the NSSDC is in forms which are not readily usable, such as microfilm, hardcopy, and magnetic tapes written using machine representations of computers no longer in use. The LDP has prioritized these data based on scientific and engineering value and the level of effort required, and is in the process of restoring these data collections. In association with the Planetary Data System (PDS), the restored data are converted into a standard format and subjected to a data peer review before ingestion into the PDS. The Apollo 12 and 15 Solar Wind Spectrometer data have been restored and are awaiting data review. The Apollo 14 and 15 ALSEP Cold Cathode Ion Gage data have been scanned, the Apollo 14 Dust, Thermal, and Radiation Engineering Measurements data are in the process of being scanned, and the Apollo 14 Charged Particle Lunar Environment Experiment data have been retrieved from magnetic tape. Optical character recognition software to produce digital tables from the scanned data, where appropriate, is under development. These data represent some of the only long-term lunar surface environment information that exists. We will report on our progress. Metadata, ancillary information to aid in the use and understanding of the data, will be included in these online data collections. The metadata cover complete descriptions of the data sets, formats, processing history, relevant references and contacts, and instrument descriptions. Restored data and associated metadata are posted online and easily accessible to interested users. The data sets and more information on the LDP can be found at nssdc.gsfc.nasa.gov/planetary/lunar/lunar_data/

  14. Public-Private Partnership: Joint recommendations to improve downloads of large Earth observation data

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Murphy, K. J.; Baynes, K.; Lynnes, C.

    2016-12-01

    With the volume of Earth observation data expanding rapidly, cloud computing is quickly changing the way Earth observation data is processed, analyzed, and visualized. The cloud infrastructure provides the flexibility to scale up to large volumes of data and handle high-velocity data streams efficiently. Having freely available Earth observation data collocated on a cloud infrastructure creates opportunities for innovation and value-added data re-use in ways unforeseen by the original data provider. These innovations spur new industries and applications and spawn new scientific pathways that were previously limited by data volume and computational infrastructure issues. NASA, in collaboration with Amazon, Google, and Microsoft, has jointly developed a set of recommendations to enable efficient transfer of Earth observation data from existing data systems to a cloud computing infrastructure. The purpose of these recommendations is to provide guidelines against which all data providers can evaluate existing data systems, and which can be used to correct any issues uncovered, enabling efficient search, access, and use of large volumes of data. Additionally, these guidelines ensure that all cloud providers utilize a common methodology for bulk-downloading data from data providers, thus preventing the data providers from having to build custom capabilities to meet the needs of individual cloud providers. The intent is to share these recommendations with other Federal agencies and organizations that serve Earth observation data. Adoption of these recommendations will also benefit data users interested in moving large volumes of data from data systems to any other location; these users include the cloud providers, cloud users such as scientists, and other users working in high performance computing environments who need to move large volumes of data.

  15. Real-time GIS data model and sensor web service platform for environmental data management.

    PubMed

    Gong, Jianya; Geng, Jing; Chen, Zeqiang

    2015-01-09

    Effective environmental data management is meaningful for human health. In the past, environmental data management involved developing a specific environmental data management system, but this approach often lacked real-time data retrieval and sharing/interoperating capability. With the development of information technology, a Geospatial Service Web method has been proposed that can be employed for environmental data management. The purpose of this study is to determine a method to realize environmental data management under the Geospatial Service Web framework. To this end, a real-time GIS (Geographic Information System) data model and a Sensor Web service platform are proposed: the real-time GIS data model manages real-time data, and the Sensor Web service platform, built on Sensor Web technologies and implemented in this study, supports the realization of the data model. Real-time environmental data, such as meteorological data, air quality data, soil moisture data, soil temperature data, and landslide data, are managed in the Sensor Web service platform. In addition, two use cases, real-time air quality monitoring and real-time soil moisture monitoring based on the real-time GIS data model in the Sensor Web service platform, were realized and demonstrated. The total processing times of the two experiments are 3.7 s and 9.2 s, respectively. The experimental results show that integrating the real-time GIS data model with the Sensor Web service platform is an effective way to manage environmental data under the Geospatial Service Web framework.
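
    The paper's service interfaces are not given in the abstract; the sketch below shows a generic OGC Sensor Observation Service (SOS) GetObservation request of the kind Sensor Web platforms commonly expose. The endpoint, offering, and observed-property names are hypothetical, and the JSON response shape is an assumption.

        # Sketch of retrieving real-time observations from an OGC Sensor
        # Observation Service (SOS), the kind of Sensor Web interface such
        # a platform builds on. The endpoint, offering, and property names
        # are hypothetical, and the JSON response shape is an assumption.
        import requests

        params = {
            "service": "SOS",
            "version": "2.0.0",
            "request": "GetObservation",
            "offering": "air_quality_station_01",      # hypothetical offering
            "observedProperty": "PM2.5",               # hypothetical property
            "responseFormat": "application/json",
        }
        resp = requests.get("https://example.org/sos", params=params, timeout=30)
        resp.raise_for_status()
        for obs in resp.json().get("observations", []):
            print(obs.get("resultTime"), obs.get("result"))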

  16. CB4-03: An Eye on the Future: A Review of Data Virtualization Techniques to Improve Research Analytics

    PubMed Central

    Richter, Jack; McFarland, Lela; Bredfeldt, Christine

    2012-01-01

    Background/Aims Integrating data across systems can be a daunting process. The traditional method of moving data to a common location, mapping fields with different formats and meanings, and performing data cleaning activities to ensure valid and reliable integration across systems can be both expensive and extremely time consuming. As the scope of needed research data increases, the traditional methodology may not be sustainable. Data Virtualization provides an alternative to traditional methods that may reduce the effort required to integrate data across disparate systems. Objective Our goal was to survey new methods in data integration, cloud computing, enterprise data management and virtual data management for opportunities to increase the efficiency of producing VDW and similar data sets. Methods Kaiser Permanente Information Technology (KPIT), in collaboration with the Mid-Atlantic Permanente Research Institute (MAPRI), reviewed methodologies in the burgeoning field of Data Virtualization. We identified potential strengths and weaknesses of new approaches to data integration. For each method, we evaluated its potential application for producing effective research data sets. Results Data Virtualization provides opportunities to reduce the amount of data movement required to integrate data sources on different platforms in order to produce research data sets. It also includes methods for managing “fuzzy” matching, used to match fields known to have poor reliability such as names, addresses and social security numbers. These methods could improve the efficiency of integrating state and federal data, such as patient race, death, and tumor records, with internal electronic health record data. Discussion The emerging field of Data Virtualization has considerable potential for increasing the efficiency of producing research data sets. An important next step will be to develop a proof-of-concept project that will help us understand the benefits and drawbacks of these techniques.
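
    As a minimal sketch of the "fuzzy" matching idea described above, the following compares low-reliability fields using only the Python standard library; the 0.85 threshold and the sample records are illustrative choices, not KPIT's method.

        # Minimal sketch of "fuzzy" matching on low-reliability fields such
        # as names and addresses, using only the standard library. The 0.85
        # threshold and the sample records are illustrative choices.
        from difflib import SequenceMatcher

        def similarity(a: str, b: str) -> float:
            return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

        ehr_record   = {"name": "Jonathan Q. Smith", "address": "12 Oak St."}
        state_record = {"name": "Jon Q Smith",       "address": "12 Oak Street"}

        # Require both fields to clear the threshold before declaring a match.
        score = min(similarity(ehr_record["name"], state_record["name"]),
                    similarity(ehr_record["address"], state_record["address"]))
        print("match" if score >= 0.85 else "no match", round(score, 2))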

  17. 15 CFR 718.1 - Definition.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...(g)(1) and 304(e)(2) of the Act and other trade secrets as follows: (a) Financial data; (b) Sales and marketing data (other than shipment data); (c) Pricing data; (d) Personnel data; (e) Research data; (f) Patent data; (g) Data maintained for compliance with environmental or occupational health and safety...

  18. 15 CFR 718.1 - Definition.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...(g)(1) and 304(e)(2) of the Act and other trade secrets as follows: (a) Financial data; (b) Sales and marketing data (other than shipment data); (c) Pricing data; (d) Personnel data; (e) Research data; (f) Patent data; (g) Data maintained for compliance with environmental or occupational health and safety...

  19. 15 CFR 718.1 - Definition.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...(g)(1) and 304(e)(2) of the Act and other trade secrets as follows: (a) Financial data; (b) Sales and marketing data (other than shipment data); (c) Pricing data; (d) Personnel data; (e) Research data; (f) Patent data; (g) Data maintained for compliance with environmental or occupational health and safety...

  20. 15 CFR 718.1 - Definition.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...(g)(1) and 304(e)(2) of the Act and other trade secrets as follows: (a) Financial data; (b) Sales and marketing data (other than shipment data); (c) Pricing data; (d) Personnel data; (e) Research data; (f) Patent data; (g) Data maintained for compliance with environmental or occupational health and safety...

  1. Big Data: You Are Adding to . . . and Using It

    ERIC Educational Resources Information Center

    Makela, Carole J.

    2016-01-01

    "Big data" prompts a whole lexicon of terms--data flow; analytics; data mining; data science; smart you name it (cars, houses, cities, wearables, etc.); algorithms; learning analytics; predictive analytics; data aggregation; data dashboards; digital tracks; and big data brokers. New terms are being coined frequently. Are we paying…

  2. 14 CFR 125.228 - Flight data recorders: filtered data.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Equipment Requirements § 125.228 Flight data recorders: filtered data. (a) A flight data signal is filtered... original sensor signal value can be reconstructed from the recorded data. This demonstration requires that...

  3. Enabling Data-as-a-Service (DaaS) - Biggest Challenge of Geoscience Australia

    NASA Astrophysics Data System (ADS)

    Bastrakova, I.; Kemp, C.; Car, N. J.

    2016-12-01

    Geoscience Australia (GA) is recognised and respected as the national repository and steward of multiple data collections of national significance, providing geoscience information, services and capability to the Australian Government, industry and stakeholders. Provision of Data-as-a-Service is both GA's key responsibility and core business. Through the Science First Transformation Program, GA is undergoing a significant rethinking of its data architecture, curation and access to support the Digital Science capability, for which DaaS is both a dependency and an underpinning of its implementation. DaaS, being a service, means we can deliver its outputs in multiple ways, thus providing users with data on demand in ready-for-consumption forms. We can then reuse prebuilt data constructions to allow self-serviced integration of data underpinned by dynamic query tools. In GA's context, examples of DaaS are the Australian Geoscience Data Cube, the Foundation Spatial Data Framework and data served through several Virtual Laboratories. We have implemented a three-layered architecture for DaaS in order to store and manage the data while honouring the semantics of Scientific Data Models defined by subject matter experts and GA's Enterprise Data Architecture, as well as retaining that delivery flexibility. The foundation layer of DaaS is Canonical Datasets, which are optimised for long-term data stewardship and curation: data are well structured, standardised, described and audited, and all data creation and editing happen within this layer. The middle Data Transformation layer transforms data from the Canonical Datasets into the data integration layer, providing mechanisms for multi-format and multi-technology data transformation. The top Data Integration layer is optimised for data access: data can be easily reused and repurposed, and the data formats made available are optimised for scientific computing and adjusted for access by multiple applications, tools and libraries. Moving to DaaS enables GA to increase data alertness, generate new capabilities and be prepared for emerging technological challenges.
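
    A toy rendering of the three layers described above may make the flow concrete: a canonical layer holds the authoritative records, a transformation layer reshapes them, and an integration layer serves consumption-ready views. The field names, unit conversion, and CSV output are assumptions, not GA's implementation.

        # Toy rendering of the three-layer DaaS architecture described above.
        # Field names and the CSV output format are assumptions.
        import csv, io

        canonical = [  # canonical layer: well-structured, audited records
            {"station": "A1", "lat": -35.3, "lon": 149.1, "value_mK": 287650},
        ]

        def transform(record):
            # transformation layer: convert units and rename fields
            return {"station": record["station"],
                    "lat": record["lat"], "lon": record["lon"],
                    "temp_c": record["value_mK"] / 1000.0 - 273.15}

        def integration_view(records):
            # integration layer: emit an access-optimised, reusable format
            buf = io.StringIO()
            writer = csv.DictWriter(buf, fieldnames=["station", "lat", "lon", "temp_c"])
            writer.writeheader()
            for r in records:
                writer.writerow(transform(r))
            return buf.getvalue()

        print(integration_view(canonical))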

  4. The National Center for Atmospheric Research (NCAR) Research Data Archive: a Data Education Center

    NASA Astrophysics Data System (ADS)

    Peng, G. S.; Schuster, D.

    2015-12-01

    The National Center for Atmospheric Research (NCAR) Research Data Archive (RDA), rda.ucar.edu, is not just another data center or data archive; it is a data education center. We not only serve data, we TEACH data. Weather and climate data are the original "Big Data," and lessons learned while working with weather data are applicable to a wide range of data investigations. Erroneous data assumptions are the Achilles heel of Big Data: it does not matter how much data you crunch if the data are not what you think they are. Each dataset archived at the RDA is assigned to a data specialist (DS) who curates the data. If a user has a question not answered in the dataset information web pages, they can call or email a skilled DS for further clarification. The RDA's diverse staff, with academic training in meteorology, oceanography, engineering (electrical, civil, ocean and database), mathematics, physics, chemistry and information science, means we likely have someone who "speaks your language." Data discovery is another difficult Big Data problem; one can only solve problems with data if one can find the right data. Metadata, both machine- and human-generated, underpin the RDA data search tools. Users can quickly find datasets by name or dataset ID number. They can also perform a faceted search that successively narrows the options by user requirements, or simply kick off an indexed search with a few words. Weather data formats can be difficult for non-expert users to read; data are usually packed in binary formats requiring specialized software, and parameter names use specialized vocabularies. DSs create detailed information pages for each dataset and maintain lists of helpful software, documentation and links to information around the web. We further grow the sophistication of our users with tips, tutorials and data stories on the RDA Blog, http://ncarrda.blogspot.com/. How-to video tutorials are also posted on the NCAR Computational and Information Systems Laboratory (CISL) YouTube channel.

  5. Using Feedback from Data Consumers to Capture Quality Information on Environmental Research Data

    NASA Astrophysics Data System (ADS)

    Devaraju, A.; Klump, J. F.

    2015-12-01

    Data quality information is essential to facilitate the reuse of Earth science data. Recorded quality information must be sufficient for other researchers to select suitable data sets for their analysis and to confirm the results and conclusions. In the research data ecosystem, several entities are responsible for data quality. Data producers (researchers and agencies) play a major role in this aspect, as they often include validation checks or data cleaning as part of their work. It is possible that quality information is not supplied with published data sets; where it is available, the descriptions may be incomplete, ambiguous, or address only specific quality aspects. Data repositories have built infrastructures to share data, but not all of them assess data quality; they normally provide guidelines for documenting quality information. Some suggest that scholarly and data journals should take a role in ensuring data quality by involving reviewers to assess data sets used in articles and incorporating data quality criteria in the author guidelines. However, this mechanism primarily addresses data sets submitted to journals. We believe that data consumers can complement the existing entities in assessing and documenting the quality of published data sets, an approach already adopted in crowd-sourced platforms such as Zooniverse, OpenStreetMap, Wikipedia, Mechanical Turk and Tomnod. This paper presents a framework, designed with open source tools, to capture and share data users' feedback on the application and assessment of research data. The framework comprises a browser plug-in, a web service and a data model, such that feedback can be easily reported, retrieved and searched. The feedback records are also made available as Linked Data to promote integration with other sources on the Web. Vocabularies from Dublin Core and PROV-O are used to clarify the source and attribution of feedback. The application of the framework is illustrated with CSIRO's Data Access Portal.
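
    The paper's data model is not reproduced in the abstract; the sketch below expresses one feedback record as Linked Data (JSON-LD) reusing the Dublin Core and PROV-O vocabularies named above. The specific properties and identifiers are assumptions.

        # Sketch of a single user-feedback record expressed as Linked Data
        # (JSON-LD), reusing the Dublin Core and PROV-O vocabularies named
        # in the abstract. The properties and identifiers are assumptions,
        # not the paper's published data model.
        import json

        feedback = {
            "@context": {
                "dct": "http://purl.org/dc/terms/",
                "prov": "http://www.w3.org/ns/prov#",
            },
            "@id": "https://example.org/feedback/42",
            "@type": "prov:Entity",
            "dct:subject": "https://example.org/dataset/12345",   # hypothetical dataset
            "dct:creator": "https://orcid.org/0000-0000-0000-0000",
            "dct:created": "2015-06-01",
            "dct:description": "Salinity values in file 3 look biased high near the surface.",
            "prov:wasGeneratedBy": {"@type": "prov:Activity",
                                    "dct:title": "reuse for coastal model validation"},
        }
        print(json.dumps(feedback, indent=2))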

  6. Data Citation Impediments: Human and Institutional Inertia

    NASA Astrophysics Data System (ADS)

    Mayernik, M. S.

    2013-12-01

    Data citations are growing in visibility in scientific and public policy circles. Data citations directly link scholarship and data, and as such provide a mechanism through which data can be discovered and accessed, scholarly use of data can be tracked, and the impact of data facilities can be identified. The interest in data citations comes from many research stakeholders, including funders, policy makers, professional societies and their publication entities, research organizations, and individual researchers. Most of the efforts to date around data citations have focused on the challenges of assigning unique identifiers to digital data sets. While these challenges are significant, an additional challenge has gone relatively unaddressed, namely, the fact that data citation is not a common practice within scientific communities. This presentation reports findings from an interview study within the University Corporation for Atmospheric Research / National Center for Atmospheric Research (UCAR/NCAR). Through interviews with 14 scientists and engineers, we have found little evidence that data citations have gained momentum as a common practice. Currently, data users acknowledge their use of particular data sets in either the research methods or acknowledgements sections of their papers, not as formal citations in a paper's bibliography. Data users are often 1) unaware that they can and should cite data sets, 2) unsure of how to cite data sets, and 3) lacking career motivations to promote data citation as a common activity. Data citation initiatives will have minimal impact on the scientific community if they do not address this practical inertia. Data users are a critical stakeholder in the data citation process, and their voice needs to be central to the data citation discussion. We will discuss how outreach efforts need to focus on raising the profile of data citations by informing scientists and administrators, being proactive in providing data users with recommended citations, and embedding data citations within larger scientific research institutions such as academic tenure and scholarly peer review.

  7. The data facility of the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS)

    NASA Technical Reports Server (NTRS)

    Nielsen, Pia J.; Green, Robert O.; Murray, Alex T.; Eng, Bjorn T.; Novack, H. Ian; Solis, Manuel; Olah, Martin

    1993-01-01

    AVIRIS operations at the Jet Propulsion Laboratory include a significant data task. The AVIRIS data facility is responsible for data archiving, data calibration, quality monitoring and distribution. Since 1987, the data facility has archived over one terabyte of AVIRIS data and distributed these data to science investigators as requested. In this paper we describe recent improvements in the AVIRIS data facility.

  8. Data Mining and Homeland Security: An Overview

    DTIC Science & Technology

    2006-01-27

    which government agencies should use and mix commercial data with government data, whether data sources are being used for purposes other than those... example, a hardware store may compare their customers' tool purchases with home ownership, type of... data cleaning, data integration, data selection, data transformation, (data mining), pattern evaluation, and knowledge presentation. A number of advances in

  9. Transforming Research Data into Resource Data

    NASA Astrophysics Data System (ADS)

    Chandler, C. L.; Shepherd, A.; Groman, R. C.; Kinkade, D.; Rauch, S.; Allison, M. D.; Copley, N. J.; Ake, H.; York, A.; Wiebe, P. H.; Glover, D. M.

    2016-12-01

    Many of the Grand Challenge science questions are of interest to the marine science research community funded by the United States National Science Foundation (NSF). The highly diverse range of environmental data from the oceans, coastal regions, and Great Lakes are collected using a variety of platforms, instrument systems and sensors, and are complemented by experimental results, including sequence data, and model results. The data are often collected with a particular research purpose in mind. Such data are costly to acquire, and environmental data, being temporally and geographically unique, cannot be acquired again. The NSF-funded research community, comprising funded investigators and their research teams, operators of the US academic research fleet, data managers, marine librarians, and NSF program managers, is working together to transform 'research data' into 'resource data'. The objective is to ensure that the original research data become available to a much wider community and have the potential to be used as 'resource data' for new and different types of research well beyond the initial focus of the NSF research grant. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) manages a community-driven data repository that serves some of these data: the data and results from research funded by NSF Ocean Sciences and Polar Programs. Individually such data sets are typically small in size, but when integrated these data become a valuable resource for the global research effort. The data are analyzed, quality controlled, and finalized by the original investigators and their research teams, and then contributed to BCO-DMO. The BCO-DMO data managers reformat the data if they were submitted in proprietary formats, perform quality assessment review, augment the data sets with additional documentation, and create structured, machine-actionable metadata. The BCO-DMO data system allows researchers to make connections between related data sets within the BCO-DMO catalog, and also to follow links to complementary data sets curated at other research data repositories. The key is to expose, in standards-compliant ways, the essential elements of domain-specific metadata that enable discovery of related data, results, products, and publications from scientific research activities.

  10. re3data.org - a global registry of research data repositories

    NASA Astrophysics Data System (ADS)

    Pampel, Heinz; Vierkant, Paul; Elger, Kirsten; Bertelmann, Roland; Witt, Michael; Schirmbacher, Peter; Rücknagel, Jessika; Kindling, Maxi; Scholze, Frank; Ulrich, Robert

    2016-04-01

    re3data.org, the registry of research data repositories, lists over 1,400 research data repositories from all over the world, making it the largest and most comprehensive online catalog of research data repositories on the web. The registry is a valuable tool for researchers, funding organizations, publishers and libraries. re3data.org provides detailed information about research data repositories, and its distinctive icons help researchers to easily identify relevant repositories for accessing and depositing data sets [1]. Funding agencies like the European Commission [2] and research institutions like the University of Bielefeld [3] already recommend the use of re3data.org in their guidelines and policies. Several publishers and journals, such as Copernicus Publications, PeerJ, and Nature's Scientific Data, recommend re3data.org in their editorial policies as a tool for the easy identification of appropriate data repositories in which to store research data. Project partners in re3data.org are the Library and Information Services department (LIS) of the GFZ German Research Centre for Geosciences, the Computer and Media Service at the Humboldt-Universität zu Berlin, the Purdue University Libraries and the KIT Library at the Karlsruhe Institute of Technology (KIT). After its merger with the U.S. service DataBib in 2014, re3data.org continues as a service of DataCite from 2016 on. DataCite is the international organization for the registration of Digital Object Identifiers (DOI) for research data and aims to improve their citation. The poster describes the current status and the future plans of re3data.org. [1] Pampel H, et al. (2013) Making Research Data Repositories Visible: The re3data.org Registry. PLoS ONE 8(11): e78080. doi:10.1371/journal.pone.0078080. [2] European Commission (2015): Guidelines on Open Access to Scientific Publications and Research Data in Horizon 2020. Available: http://ec.europa.eu/research/participants/data/ref/h2020/grants_manual/hi/oa_pilot/h2020-hi-oa-pilot-guide_en.pdf Accessed 11 January 2016. [3] Bielefeld University (2013): Resolution on Research Data Management. Available: http://data.uni-bielefeld.de/en/resolution Accessed 11 January 2016.
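
    re3data.org also exposes its registry programmatically. The sketch below queries what is, to the best of our knowledge, its public v1 REST API; the endpoint path and XML element names are stated from memory and should be verified against the current API documentation before use.

        # Sketch of querying the re3data.org registry programmatically,
        # assuming its public v1 REST API; the endpoint path and the XML
        # element names are assumptions to be checked against the current
        # API documentation.
        import requests
        import xml.etree.ElementTree as ET

        resp = requests.get("https://www.re3data.org/api/v1/repositories", timeout=30)
        resp.raise_for_status()
        root = ET.fromstring(resp.content)

        # Print the first few repository names in the registry listing.
        for repo in root.findall(".//repository")[:5]:
            name = repo.findtext("name")
            if name:
                print(name)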

  11. SeaDataNet - Pan-European infrastructure for marine and ocean data management: Unified access to distributed data sets (www.seadatanet.org)

    NASA Astrophysics Data System (ADS)

    Schaap, Dick M. A.; Maudire, Gilbert

    2010-05-01

    SeaDataNet is a leading infrastructure in Europe for marine and ocean data management. It actively operates and further develops a pan-European infrastructure for managing, indexing and providing access to ocean and marine data sets and data products, acquired via research cruises and other observational activities, in situ and by remote sensing. The basis of SeaDataNet is the interconnection of 40 National Oceanographic Data Centres (NODCs) and marine data centres from 35 countries around the European seas into a distributed network of data resources with common standards for metadata, vocabularies, data transport formats, quality control methods and flags, and access. Most of the NODCs operate, or are developing, national networks with other institutes in their countries to ensure national coverage and long-term stewardship of available data sets. The majority of data managed by SeaDataNet partners concerns physical oceanography, marine chemistry, hydrography, and a substantial volume of marine biology, geology and geophysics. These are partly owned by the partner institutes themselves and, for a major part, owned by other organizations in their countries. The SeaDataNet infrastructure is implemented with support of the EU via the EU FP6 SeaDataNet project to provide a pan-European data management system adapted both to the fragmented observation system and to the users' need for integrated access to data, metadata, products and services. The SeaDataNet project has a duration of 5 years and started in 2006, but it builds upon earlier data management infrastructure projects undertaken over a period of 20 years by an expanding network of oceanographic data centres from the countries around all European seas. Its predecessor project, Sea-Search, had a strict focus on metadata. SeaDataNet maintains significant interest in the further development of the metadata infrastructure, extending its services with the provision of easy data access and generic data products. Version 1 of its infrastructure upgrade was launched in April 2008 and is now well underway to include all 40 data centres at the V1 level. It comprises the network of 40 interconnected data centres (NODCs) and a central SeaDataNet portal. V1 provides users a unified and transparent overview of the metadata and controlled access to the large collections of data sets that are managed at these data centres.
    The SeaDataNet V1 infrastructure comprises the following middleware services:
    • Discovery services = Metadata directories and User interfaces
    • Vocabulary services = Common vocabularies and Governance
    • Security services = Authentication, Authorization & Accounting
    • Delivery services = Requesting and Downloading of data sets
    • Viewing services = Mapping of metadata
    • Monitoring services = Statistics on system usage and performance, and Registration of data requests and transactions
    • Maintenance services = Entry and updating of metadata by data centres
    Good progress is also being made with extending the SeaDataNet infrastructure with V2 services:
    • Viewing services = Quick views and Visualisation of data and data products
    • Product services = Generic and standard products
    • Exchange services = Transformation of SeaDataNet portal CDI output to INSPIRE compliance
    As a basis for the V1 services, common standards have been defined for metadata and data formats, common vocabularies, quality flags, and quality control methods, based on international standards such as ISO 19115, OGC, NetCDF (CF), and ODV, on best practices from IOC and ICES, and following INSPIRE developments. An important objective of the SeaDataNet V1 infrastructure is to provide transparent access to the distributed data sets via a unique user interface and download service. In the SeaDataNet V1 architecture, the Common Data Index (CDI) V1 metadata service provides the link between discovery and delivery of data sets. The CDI user interface gives users detailed insight into the availability and geographical distribution of marine data archived at the connected data centres, and provides sufficient information to allow the user to assess the data's relevance. Moreover, the CDI user interface provides the means for downloading data sets in common formats via a transaction mechanism. The SeaDataNet portal provides registered users access to these distributed data sets via the CDI V1 Directory and a shopping basket mechanism. This allows registered users to locate data of interest and submit their data requests. The requests are forwarded automatically from the portal to the relevant SeaDataNet data centres. This process is controlled via the Request Status Manager (RSM) Web Service at the portal and a Download Manager (DM) Java software module implemented at each of the data centres. The RSM also enables registered users to check the status of their requests regularly and to download data sets after access has been granted. Data centres can follow all transactions for their data sets online and can handle requests which require their consent. The actual delivery of data sets is done between the user and the selected data centre. Very good progress is being made with connecting all SeaDataNet data centres and their data sets to the CDI V1 system. At present the CDI V1 system provides users functionality to discover and download more than 500,000 data sets, a number which is steadily increasing. The SeaDataNet architecture provides a coherent system of the various V1 services and allows inclusion of the V2 services. For the implementation, a range of technical components have been defined and developed. These make use of recent web technologies and also comprise Java components to provide multi-platform support and syntactic interoperability. To facilitate the sharing of resources and interoperability, SeaDataNet has adopted the technology of SOAP Web services for various communication tasks.
    The SeaDataNet architecture has been designed as a multi-disciplinary system from the beginning. It is able to support a wide variety of data types and to serve several sector communities. SeaDataNet is willing to share its technologies and expertise, to spread and expand its approach, and to build bridges to other well-established infrastructures in the marine domain. SeaDataNet has therefore developed a strategy of seeking active cooperation on a national scale with other data-holding organisations via its NODC networks, and on an international scale with other European and international data management initiatives and networks. This is done with the objective of achieving a wider coverage of data sources and an overall interoperability between data infrastructures in the marine and ocean domains. Recent examples include the EU FP7 projects Geo-Seas for geology and geophysical data sets, UpgradeBlackSeaScene for a Black Sea data management infrastructure, CaspInfo for a Caspian Sea data management infrastructure, and the EU EMODNET pilot projects for hydrographic, chemical, and biological data sets. All these projects are adopting the SeaDataNet standards and extending its services. Active cooperation also takes place with EuroGOOS and MyOcean in the domain of real-time and delayed-mode metocean monitoring data. SeaDataNet Partners: IFREMER (France), MARIS (Netherlands), HCMR/HNODC (Greece), ULg (Belgium), OGS (Italy), NERC/BODC (UK), BSH/DOD (Germany), SMHI (Sweden), IEO (Spain), RIHMI/WDC (Russia), IOC (International), ENEA (Italy), INGV (Italy), METU (Turkey), CLS (France), AWI (Germany), IMR (Norway), NERI (Denmark), ICES (International), EC-DG JRC (International), MI (Ireland), IHPT (Portugal), RIKZ (Netherlands), RBINS/MUMM (Belgium), VLIZ (Belgium), MRI (Iceland), FIMR (Finland), IMGW (Poland), MSI (Estonia), IAE/UL (Latvia), CMR (Lithuania), SIO/RAS (Russia), MHI/DMIST (Ukraine), IO/BAS (Bulgaria), NIMRD (Romania), TSU (Georgia), INRH (Morocco), IOF (Croatia), PUT (Albania), NIB (Slovenia), UoM (Malta), OC/UCY (Cyprus), IOLR (Israel), NCSR/NCMS (Lebanon), CNR-ISAC (Italy), ISMAL (Algeria), INSTM (Tunisia)

  12. Data warehouse model design technology analysis and research

    NASA Astrophysics Data System (ADS)

    Jiang, Wenhua; Li, Qingshui

    2012-01-01

    Existing data storage formats cannot meet the needs of information analysis, which has brought the data warehouse onto the stage. A data warehouse is a data collection created specifically to support business decision making. With a data warehouse, a company stores all of its collected information in one place, organized so that the information is easy to access and has value. This paper focuses on the establishment, analysis, and design of data warehouses, presents two data warehouse models, and compares them.

  13. Marine asset security and tracking (MAST) system

    DOEpatents

    Hanson, Gregory Richard [Clinton, TN; Smith, Stephen Fulton [Loudon, TN; Moore, Michael Roy [Corryton, TN; Dobson, Eric Lesley [Charleston, SC; Blair, Jeffrey Scott [Charleston, SC; Duncan, Christopher Allen [Marietta, GA; Lenarduzzi, Roberto [Knoxville, TN

    2008-07-01

    Methods and apparatus are described for marine asset security and tracking (MAST). A method includes transmitting identification data, location data and environmental state sensor data from a radio frequency tag. An apparatus includes a radio frequency tag that transmits identification data, location data and environmental state sensor data. Another method includes transmitting identification data and location data from a radio frequency tag using hybrid spread-spectrum modulation. Another apparatus includes a radio frequency tag that transmits both identification data and location data using hybrid spread-spectrum modulation.
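
    The patent names hybrid spread-spectrum modulation, which typically combines direct-sequence and frequency-hopping techniques. The sketch below illustrates only the direct-sequence half, spreading each data bit over a pseudo-noise chip sequence; the 7-chip code and payload bits are illustrative.

        # Illustration of the direct-sequence half of hybrid spread-spectrum
        # modulation: each data bit is XOR-spread over a pseudo-noise (PN)
        # chip sequence, and the receiver despreads by correlating with the
        # same code. The 7-chip PN code and payload are illustrative; the
        # patent's actual hybrid scheme also adds frequency hopping.
        PN = [1, 1, 1, 0, 0, 1, 0]          # illustrative 7-chip PN code

        def spread(bits):
            return [b ^ chip for b in bits for chip in PN]

        def despread(chips):
            bits = []
            for i in range(0, len(chips), len(PN)):
                block = chips[i:i + len(PN)]
                # XOR-ing each chip with the code recovers votes for the bit.
                votes = sum(c ^ chip for c, chip in zip(block, PN))
                bits.append(1 if votes > len(PN) // 2 else 0)
            return bits

        payload = [1, 0, 1, 1]              # e.g. part of a tag ID
        assert despread(spread(payload)) == payload
        print(spread(payload))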

  14. Development of an Oceanographic Data Archiving and Service System for the Korean Researchers

    NASA Astrophysics Data System (ADS)

    Kim, Sung Dae; Park, Hyuk Min; Baek, Sang Ho

    2014-05-01

    The Oceanographic Data and Information Center of the Korea Institute of Ocean Science and Technology (KIOST) started to develop an oceanographic data archiving and service system in 2010 to support Korean ocean researchers by continuously providing quality-controlled data. Many physical oceanographic data available in the public domain, together with Korean domestic data, are collected periodically, quality controlled, manipulated, and provided to ocean modelers, who need ocean data continuously, and to marine biologists, who are less familiar with physical data but need it. The northern and southern limits of the spatial coverage are 55°N and 20°N, and the western and eastern limits are 110°E and 150°E, respectively. To archive TS (temperature and salinity) profile data, ARGO data were gathered from the ARGO GDACs (France and USA), and many historical TS profile data observed by CTD, OSD and BT were retrieved from World Ocean Database 2009. Quality control software for TS profile data, which meets the QC criteria suggested by the ARGO program and the GTSPP (Global Temperature-Salinity Profile Program), was programmed and applied to the collected data. By the end of 2013, the total number of vertical profiles from the ARGO GDACs was 59,642 and the total number of station data records from WOD 2009 was 1,604,422. We also collect the global satellite SST data produced by NCDC and global SSH data from AVISO every day. An automatic program was coded to collect the satellite data, extract subsets covering the Northwest Pacific area, and produce distribution maps. The total number of collected satellite data sets was 3,613 by the end of 2013. We use three different data services to provide the archived data to Korean experts. An FTP service allows data users to download data in the original format. We developed a TS database system using the Oracle RDBMS to contain all collected temperature and salinity data and to support SQL data retrieval under various conditions. The KIOST ocean data portal is used as the data retrieval service for the TS DB; it uses a GIS interface built with open-source GIS software. We also installed the Live Access Server developed by US PMEL to serve the satellite netCDF data files, supporting on-the-fly visualization and an OPeNDAP (Open-source Project for a Network Data Access Protocol) service for remote connection and subsetting of large data sets.
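
    As a minimal sketch of the ARGO/GTSPP-style quality control mentioned above, the following applies a global range test to a TS profile. The thresholds are illustrative global-range values, not KIOST's exact criteria, and operational QC applies many further tests (spikes, gradients, density inversions, etc.).

        # Minimal sketch of a GTSPP/ARGO-style global range test for
        # temperature-salinity profiles. The thresholds are illustrative,
        # not KIOST's exact QC criteria.
        GOOD, BAD = 1, 4   # common QC flag convention: 1 = good, 4 = bad

        def global_range_flags(profile):
            flags = []
            for level in profile:
                t_ok = -2.5 <= level["temp_c"] <= 40.0
                s_ok = 0.0 <= level["sal_psu"] <= 41.0
                flags.append(GOOD if (t_ok and s_ok) else BAD)
            return flags

        profile = [
            {"depth_m": 0,   "temp_c": 18.2, "sal_psu": 34.1},
            {"depth_m": 50,  "temp_c": 12.7, "sal_psu": 34.3},
            {"depth_m": 100, "temp_c": 99.9, "sal_psu": 34.4},  # bad sensor value
        ]
        print(global_range_flags(profile))   # -> [1, 1, 4]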

  15. A Survey on Next-generation Power Grid Data Architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    You, Shutang; Zhu, Dr. Lin; Liu, Yong

    2015-01-01

    The operation and control of power grids will increasingly rely on data. A high-speed, reliable, flexible and secure data architecture is a prerequisite of the next-generation power grid. This paper summarizes the challenges in collecting and utilizing power grid data, and then provides a reference data architecture for future power grids. Based on the data architecture deployment, related research on data architecture is reviewed and summarized in several categories, including data measurement/actuation, data transmission, the data service layer, and data utilization, as well as two cross-cutting issues, interoperability and cyber security. Research gaps and future work are also presented.

  16. System using data compression and hashing adapted for use for multimedia encryption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffland, Douglas R

    2011-07-12

    A system and method is disclosed for multimedia encryption. Within the system of the present invention, a data compression module receives and compresses a media signal into a compressed data stream. A data acquisition module receives and selects a set of data from the compressed data stream. And, a hashing module receives and hashes the set of data into a keyword. The method of the present invention includes the steps of compressing a media signal into a compressed data stream; selecting a set of data from the compressed data stream; and hashing the set of data into a keyword.
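
    The patent's three modules map naturally onto standard-library calls, as the sketch below shows; the selection rule (every 64th byte of the compressed stream) and the choice of SHA-256 are assumptions, since the patent does not fix them.

        # Sketch of the patent's three-step pipeline: compress a media
        # signal, select a subset of the compressed stream, and hash the
        # subset into a keyword. The selection stride and the SHA-256 hash
        # are assumptions; the patent does not fix them.
        import hashlib
        import zlib

        def keyword_from_media(media_bytes: bytes, stride: int = 64) -> str:
            compressed = zlib.compress(media_bytes)        # data compression module
            selected = compressed[::stride]                # data acquisition module
            return hashlib.sha256(selected).hexdigest()    # hashing module

        signal = bytes(range(256)) * 1000                  # stand-in for a media signal
        print(keyword_from_media(signal))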

  17. Data Quality- and Master Data Management - A Hospital Case.

    PubMed

    Arthofer, Klaus; Girardi, Dominic

    2017-01-01

    Poor data quality prevents the analysis of data for decisions which are critical for business, and it has a negative impact on business processes. Nevertheless, the maturity level of data quality and master data management is still insufficient in many organizations today. This article discusses the corresponding maturity of companies and a management cycle integrating data quality and master data management, in a case dealing with benchmarking in hospitals. In conclusion, if data quality and master data are not properly managed, structured data should not be acquired in the first place, due to the added expense and complexity.

  18. Concept for Future Data Services at the Long-Term Archive of WDCC combining DOIs with common PIDs

    NASA Astrophysics Data System (ADS)

    Stockhause, Martina; Weigel, Tobias; Toussaint, Frank; Höck, Heinke; Thiemann, Hannes; Lautenschlager, Michael

    2013-04-01

    The World Data Center for Climate (WDCC), hosted at the German Climate Computing Center (DKRZ), maintains a long-term archive (LTA) of climate model data as well as observational data. WDCC distinguishes between two types of LTA data:
    • Structured data: the output of an instrument or of a climate model run consists of numerous, highly structured individual datasets in a uniform format. Part of these data is also published on an ESGF (Earth System Grid Federation) data node. Detailed metadata are available, allowing for fine-grained, user-defined data access.
    • Unstructured data: LTA data of finished scientific projects are in general unstructured and consist of datasets of different formats, sizes, and contents. For these data, compact metadata are available as content information.
    The structured data are suitable for WDCC's DataCite DOI process; the project data only in exceptional cases. The DOI process includes a thorough quality control of technical as well as scientific aspects by the publication agent and the data creator. DOIs are assigned to data collections appropriate to be cited in scientific publications, such as a simulation run; the data collection is defined in agreement with the data creator. At the moment there is no way to identify and cite individual datasets within such a DOI data collection, analogous to the citation of chapters in a book. Also missing is a compact citation convention for a user-specified collection of data. WDCC therefore complements its existing LTA/DOI concept with Persistent Identifier (PID) assignment to datasets using Handles. In addition to identifying data for internal and external use, the PID concept allows relations among PIDs to be defined. Such structural information is stored as key-value pairs directly in the handles. Relations thus provide basic provenance or lineage information, even if part of the data, such as intermediate results, is lost. WDCC intends to use additional PIDs on metadata entities with a relation to the data PID(s). These add background information on the data creation process (e.g., descriptions of the experiment, model, model set-up, and platform for the model run) to the data. These pieces of additional information significantly increase the reusability of the archived model data. Other valuable information for scientific collaboration, such as quality information and annotations, could be added by the same mechanism. Apart from relations among data and metadata entities, PIDs on collections are advantageous for model data: collections allow for persistent references to single datasets or to subsets of data assigned a DOI, and data objects and additional information objects can be consistently connected via relations (provenance, creation, and quality information for the data).
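
    As a sketch of how relations can live as key-value pairs directly in a Handle record, as described above, consider the following; the index numbers, type names, and PID/DOI strings are assumptions for illustration, not WDCC's actual records.

        # Sketch of a Handle (PID) record that stores relations as key-value
        # pairs directly in the handle. Index numbers, type names, and the
        # PID/DOI strings are assumptions for illustration.
        dataset_pid = {
            "handle": "21.T12345/abc-dataset-0001",        # hypothetical PID
            "values": [
                {"index": 1, "type": "URL",
                 "data": "https://example.org/ds/abc-0001"},
                {"index": 2, "type": "IS_PART_OF",          # relation to DOI collection
                 "data": "doi:10.1594/EXAMPLE/RUN_0001"},
                {"index": 3, "type": "HAS_METADATA",        # relation to a metadata PID
                 "data": "21.T12345/abc-metadata-0001"},
                {"index": 4, "type": "CHECKSUM", "data": "sha256:deadbeef..."},
            ],
        }

        # Resolve one relation: find the metadata PID attached to the dataset.
        meta = next(v["data"] for v in dataset_pid["values"] if v["type"] == "HAS_METADATA")
        print(meta)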

  19. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Archiving and Quality Control

    NASA Astrophysics Data System (ADS)

    He, B.; Cui, C.; Fan, D.; Li, C.; Xiao, J.; Yu, C.; Wang, C.; Cao, Z.; Chen, J.; Yi, W.; Li, S.; Mi, L.; Yang, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and the CAS (Chinese Academy of Sciences) (Cui et al. 2014). To archive the astronomical data in China, we present the implementation of the astronomical data archiving system (ADAS). Data archiving and quality control are the foundation of AstroCloud. Throughout the entire data life cycle, the data archiving system standardizes data, transfers data, logs observational data, archives ambient data, and stores these data and metadata in a database. Quality control covers the whole process and all aspects of data archiving.

  20. High speed, very large (8 megabyte) first in/first out buffer memory (FIFO)

    DOEpatents

    Baumbaugh, Alan E.; Knickerbocker, Kelly L.

    1989-01-01

    A fast FIFO (First In First Out) memory buffer capable of storing data at rates of 100 megabytes per second. The invention includes a data packer which concatenates small bit data words into large bit data words, a memory array having individual data storage addresses adapted to store the large bit data words, a data unpacker into which large bit data words from the array can be read and reconstructed into small bit data words, and a controller to control and keep track of the individual data storage addresses in the memory array into which data from the packer is being written and data to the unpacker is being read.
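
    A toy software version of the packer/unpacker pair may clarify the idea: four 8-bit words are concatenated into one 32-bit word for storage and split back into 8-bit words on the way out. The word widths are illustrative; the patented device operates on hardware buses, not Python lists.

        # Toy version of the FIFO's packer/unpacker pair. Widths are
        # illustrative (8-bit words packed four at a time into 32 bits).
        def pack(bytes_in):
            assert len(bytes_in) % 4 == 0
            return [(bytes_in[i] << 24) | (bytes_in[i+1] << 16) |
                    (bytes_in[i+2] << 8) | bytes_in[i+3]
                    for i in range(0, len(bytes_in), 4)]

        def unpack(words):
            out = []
            for w in words:
                out.extend([(w >> 24) & 0xFF, (w >> 16) & 0xFF,
                            (w >> 8) & 0xFF, w & 0xFF])
            return out

        data = [0x12, 0x34, 0x56, 0x78, 0x9A, 0xBC, 0xDE, 0xF0]
        memory_array = pack(data)           # wide words written first-in
        assert unpack(memory_array) == data # first-out reconstruction
        print([hex(w) for w in memory_array])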

  1. Challenges and Best Practices for the Curation and Publication of Long-Tail Data with GFZ Data Services

    NASA Astrophysics Data System (ADS)

    Elger, Kirsten; Ulbricht, Damian; Bertelmann, Roland

    2017-04-01

    Open access to research data is an increasing international request and includes not only data underlying scholarly publications, but also raw and curated data. Especially in the framework of the observed shift in many scientific fields towards data science and data mining, data repositories are becoming important players as data archives and access points to curated research data. While general and institutional data repositories are available across all scientific disciplines, domain-specific data repositories are specialised for scientific disciplines, such as the bio- or geosciences, and can use more discipline-specific and richer metadata models than general repositories. Data publication is increasingly regarded as an important scientific achievement, and datasets with a digital object identifier (DOI) are now fully citable in journal articles. Moreover, following their signature of the "Statement of Commitment of the Coalition on Publishing Data in the Earth and Space Sciences" (COPDESS), many publishers have adapted their data policies and recommend, or even request, that data underlying scholarly publications be stored and published in (domain-specific) data repositories and not as classical supplementary material attached directly to the respective article. The curation of large dynamic data from global networks in, e.g., seismology, magnetics or geodesy has always required a high grade of professional, IT-supported data management, simply to be able to store and access the huge number of files and manage dynamic datasets. In contrast, the vast amount of research data acquired by individual investigators or small teams, known as 'long-tail data', was often not the focus of the development of data curation infrastructures. Nevertheless, even though these data are small in size and highly variable, in total they represent a significant portion of the total scientific output. The curation of long-tail data requires more individual approaches and the personal involvement of the data curator, especially regarding the data description. Here we introduce best practices for the publication of long-tail data that help to reduce the individual effort and improve the quality of the data description. The data repository of GFZ Data Services, hosted at the GFZ German Research Centre for Geosciences in Potsdam, is a domain-specific data repository for the geosciences. In addition to large dynamic datasets from different disciplines, it has a strong focus on the DOI-referenced publication of long-tail data, with the aim of reaching a high grade of reusability through comprehensive data description while at the same time providing and distributing standardised, machine-actionable metadata for data discovery (FAIR data). The development of templates for data reports, metadata provision by scientists via an XML Metadata Editor, and discipline-specific DOI landing pages help both the data curators, in handling all kinds of datasets, and the scientists, i.e. the users, in quickly deciding whether a published dataset fulfils their needs. In addition, GFZ Data Services has developed DOI-registration services for several international networks (e.g. ICGEM, World Stress Map, IGETS), as well as project- or network-specific designs of the DOI landing pages carrying the logo or design of the respective network or project.

  2. Opening Data in the Long Tail for Community Discovery, Curation and Action Using Active and Social Curation

    NASA Astrophysics Data System (ADS)

    Hedstrom, M. L.; Kumar, P.; Myers, J.; Plale, B. A.

    2012-12-01

    In data science, the most common sequence of steps for data curation is to 1) curate data, 2) enable data discovery, and 3) provide for data reuse. The Sustainable Environments - Actionable Data (SEAD) project, funded through NSF's DataNet program, is creating an environment for sustainability scientists to discover data first, reuse data next, and curate data through an on-going process that we call Active and Social Curation. For active curation we are developing tools and services that support data discovery, data management, and data enhancement for the community while the data is still being used actively for research. We are creating an Active Content Repository, using drop box, semantic web technologies, and a Flickr-like interface for researchers to "drop" data into a repository where it will be replicated and minimally discoverable. For social curation, we are deploying a social networking tool, VIVO, which will allow researchers to discover data-publications-people (e.g. expertise) through a route that can start at any of those entry points. The other dimension of social curation is developing mechanisms to open data for community input, for example, using ranking and commenting mechanisms for data sets and a community-sourcing capability to add tags, clean up and validate data sets. SEAD's strategies and services are aimed at the sustainability science community, which faces numerous challenges including discovery of useful data, cleaning noisy observational data, synthesizing data of different types, defining appropriate models, managing and preserving their research data, and conveying holistic results to colleagues, students, decision makers, and the public. Sustainability researchers make significant use of centrally managed data from satellites and national sensor networks, national scientific and statistical agencies, and data archives. At the same time, locally collected data and custom derived data products that combine observations and measurements from local, national, and global sources are critical resources that have disproportionately high value relative to their size. Sustainability science includes a diverse and growing community of domain scientists, policy makers, private sector investors, green manufacturers, citizen scientists, and informed consumers. These communities need actionable data in order to assess the impacts of alternate scenarios, evaluate the cost-benefit tradeoffs of different solutions, and defend their recommendations and decisions. SEAD's goal is to extend its services to other communities in the "long tail" that may benefit from new approaches to infrastructure development which take into account the social and economic characteristics of diverse and dispersed data producers and consumers. For example, one barrier to data reuse is the difficulty of discovering data that might be valuable for a particular study, model, or decision. Making data minimally discoverable saves the community time expended on futile searches and creates a market, of sorts, for the data. Creating very low barriers to entry to a network where data can be discovered and acted upon vastly reduces this disincentive to sharing data. SEAD's approach allows communities to make small incremental improvements in data curation based on their own priorities and needs.

  3. Curating and Integrating Data from Multiple Sources to Support Healthcare Analytics.

    PubMed

    Ng, Kenney; Kakkanatt, Chris; Benigno, Michael; Thompson, Clay; Jackson, Margaret; Cahan, Amos; Zhu, Xinxin; Zhang, Ping; Huang, Paul

    2015-01-01

    As the volume and variety of healthcare-related data continue to grow, the analysis and use of these data will increasingly depend on the ability to appropriately collect, curate and integrate disparate data from many different sources. We describe our approach to, and highlight our experiences with, the development of a robust data collection, curation and integration infrastructure that supports healthcare analytics. This system has been successfully applied to the processing of a variety of data types, including clinical data from electronic health records and observational studies, genomic data, microbiomic data, self-reported data from surveys, and self-tracked data from wearable devices, from over 600 subjects. The curated data is currently being used to support healthcare analytic applications such as data visualization, patient stratification and predictive modeling.

  4. Accessing memory

    DOEpatents

    Yoon, Doe Hyun; Muralimanohar, Naveen; Chang, Jichuan; Ranganthan, Parthasarathy

    2017-09-26

    A disclosed example method involves performing simultaneous data accesses on at least first and second independently selectable logical sub-ranks to access first data via a wide internal data bus in a memory device. The memory device includes a translation buffer chip, memory chips in independently selectable logical sub-ranks, a narrow external data bus to connect the translation buffer chip to a memory controller, and the wide internal data bus between the translation buffer chip and the memory chips. A data access is performed on only the first independently selectable logical sub-rank to access second data via the wide internal data bus. The example method also involves locating a first portion of the first data, a second portion of the first data, and the second data on the narrow external data bus during separate data transfers.

  5. Methods and Apparatus for Aggregation of Multiple Pulse Code Modulation Channels into a Signal Time Division Multiplexing Stream

    NASA Technical Reports Server (NTRS)

    Chang, Chen J. (Inventor); Liaghati, Jr., Amir L. (Inventor); Liaghati, Mahsa L. (Inventor)

    2018-01-01

    Methods and apparatus are provided for telemetry processing using a telemetry processor. The telemetry processor can include a plurality of communications interfaces, a computer processor, and data storage. The telemetry processor can buffer sensor data by: receiving a frame of sensor data using a first communications interface and clock data using a second communications interface, receiving an end of frame signal using a third communications interface, and storing the received frame of sensor data in the data storage. After buffering the sensor data, the telemetry processor can generate an encapsulated data packet including a single encapsulated data packet header, the buffered sensor data, and identifiers identifying the telemetry devices that provided the sensor data. The format of the encapsulated data packet can comply with a Consultative Committee for Space Data Systems (CCSDS) standard. The telemetry processor can send the encapsulated data packet using fourth and fifth communications interfaces.
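
    The abstract cites the CCSDS standard for the packet format; the sketch below builds a CCSDS Space Packet primary header (6 bytes: version/type/APID, sequence flags/count, and data length = payload length - 1) around a buffered payload. The APID, sequence count, and payload are hypothetical, and the patent's full encapsulation (device identifiers, secondary header use) is not reproduced.

        # Sketch of wrapping a buffered sensor frame in a CCSDS Space Packet
        # primary header. The APID, sequence count, and payload are
        # hypothetical; the patent's full encapsulation is not reproduced.
        import struct

        def ccsds_packet(apid: int, seq_count: int, payload: bytes) -> bytes:
            version, pkt_type, sec_hdr = 0, 0, 0          # TM packet, no secondary header
            word1 = (version << 13) | (pkt_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
            word2 = (0b11 << 14) | (seq_count & 0x3FFF)   # '11' = unsegmented data
            word3 = len(payload) - 1                      # data length field
            return struct.pack(">HHH", word1, word2, word3) + payload

        frame = b"\x01\x02\x03\x04"                        # stand-in sensor frame
        pkt = ccsds_packet(apid=0x123, seq_count=7, payload=frame)
        print(pkt.hex())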

  6. A Data Quality Filter for PMU Measurements: Description, Experience, and Examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follum, James D.; Amidan, Brett G.

    Networks of phasor measurement units (PMUs) continue to grow, and along with them, the amount of data available for analysis. With so much data, it is impractical to identify and remove poor quality data manually. The data quality filter described in this paper was developed for use with the Data Integrity and Situation Awareness Tool (DISAT), which analyzes PMU data to identify anomalous system behavior. The filter operates based only on the information included in the data files, without supervisory control and data acquisition (SCADA) data, state estimator values, or system topology information. Measurements are compared to preselected thresholds to determine if they are reliable. Along with the filter's description, examples of data quality issues from application of the filter to nine months of archived PMU data are provided. The paper is intended to aid the reader in recognizing and properly addressing data quality issues in PMU data.
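
    The filter's actual thresholds are not listed in the abstract; the sketch below applies illustrative frequency and voltage-magnitude limits of the kind such a filter uses, flagging dropouts and implausible values.

        # Sketch of a threshold-based PMU data quality filter of the kind
        # described above. The limits (nominal 60 Hz grid, per-unit voltage
        # band) are illustrative, not the DISAT filter's actual thresholds.
        import math

        FREQ_LIMITS = (59.0, 61.0)      # Hz, illustrative
        VMAG_LIMITS = (0.8, 1.2)        # per unit, illustrative

        def is_reliable(sample):
            f, v = sample["freq_hz"], sample["vmag_pu"]
            if any(math.isnan(x) for x in (f, v)):
                return False                           # missing measurement
            return (FREQ_LIMITS[0] <= f <= FREQ_LIMITS[1]
                    and VMAG_LIMITS[0] <= v <= VMAG_LIMITS[1])

        samples = [
            {"freq_hz": 60.01, "vmag_pu": 1.01},
            {"freq_hz": float("nan"), "vmag_pu": 1.00},   # dropout
            {"freq_hz": 60.00, "vmag_pu": 0.02},          # implausible magnitude
        ]
        print([is_reliable(s) for s in samples])          # -> [True, False, False]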

  7. Managing data from multiple disciplines, scales, and sites to support synthesis and modeling

    USGS Publications Warehouse

    Olson, R. J.; Briggs, J. M.; Porter, J.H.; Mah, Grant R.; Stafford, S.G.

    1999-01-01

    The synthesis and modeling of ecological processes at multiple spatial and temporal scales involves bringing together and sharing data from numerous sources. This article describes a data and information system model that facilitates assembling, managing, and sharing diverse data from multiple disciplines, scales, and sites to support integrated ecological studies. Cross-site, scientific-domain working groups coordinate the development of the data associated with their particular domain, including decisions about data requirements, data to be compiled, data formats, derived data products, and schedules across the sites. The Web-based data and information system consists of nodes for each working group plus a central node that provides data access, project information, data query, and other functionality. The approach incorporates scientists and computer experts in the working groups and provides incentives for individuals to submit documented data to the data and information system.

  8. Packaging and distributing ecological data from multisite studies

    NASA Technical Reports Server (NTRS)

    Olson, R. J.; Voorhees, L. D.; Field, J. M.; Gentry, M. J.

    1996-01-01

    Studies of global change and other regional issues depend on ecological data collected at multiple study areas or sites. An information system model is proposed for compiling diverse data from dispersed sources so that the data are consistent, complete, and readily available. The model includes investigators who collect and analyze field measurements, science teams that synthesize data, a project information system that collates data, a data archive center that distributes data to secondary users, and a master data directory that provides broader searching opportunities. Special attention to format consistency is required, such as units of measure, spatial coordinates, dates, and notation for missing values. Often data may need to be enhanced by estimating missing values, aggregating to common temporal units, or adding other related data such as climatic and soils data. Full documentation, an efficient data distribution mechanism, and an equitable way to acknowledge the original source of data are also required.

  9. Parallel compression of data chunks of a shared data object using a log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-10-25

    Techniques are provided for the parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ Log-Structured File techniques. The compressed data chunk can be decompressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node and storing the compressed version of the data chunk to the shared data object on the storage node.
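
    As a sketch of the compress-then-append pattern with a per-chunk index, in the spirit of a log-structured layout, the following compresses chunks in parallel, with a process pool standing in for the parallel clients. The file name and chunk size are assumptions, not the patented system's layout.

        # Sketch of parallel per-chunk compression with a log-structured
        # layout: chunks are compressed independently, appended to a shared
        # log file, and located later through an offset index. File name
        # and chunk size are assumptions.
        import zlib
        from multiprocessing import Pool

        def compress_chunk(chunk: bytes) -> bytes:
            return zlib.compress(chunk)

        if __name__ == "__main__":
            data = bytes(1_000_000)                       # stand-in shared object
            chunks = [data[i:i + 65536] for i in range(0, len(data), 65536)]

            with Pool() as pool:                          # parallel "clients"
                compressed = pool.map(compress_chunk, chunks)

            index, offset = [], 0
            with open("shared_object.log", "wb") as log:
                for blob in compressed:
                    log.write(blob)                       # append-only, log-structured
                    index.append((offset, len(blob)))
                    offset += len(blob)

            # Read back chunk 3 via the index; the client decompresses it.
            off, length = index[3]
            with open("shared_object.log", "rb") as log:
                log.seek(off)
                assert zlib.decompress(log.read(length)) == chunks[3]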

  10. Rdesign: A data dictionary with relational database design capabilities in Ada

    NASA Technical Reports Server (NTRS)

    Lekkos, Anthony A.; Kwok, Teresa Ting-Yin

    1986-01-01

    Data Dictionary is defined to be the set of all data attributes, which describe data objects in terms of their intrinsic attributes, such as name, type, size, format, and definition. It is recognized as the database for Information Resource Management, facilitating understanding and communication about the relationship between systems applications and systems data usage, and helping to achieve data independence by permitting systems applications to access data without knowledge of the location or storage characteristics of the data in the system. A research and development effort using Ada has produced a data dictionary with database design capabilities. The project supports data specification and analysis and offers a choice of the relational, network, and hierarchical models for logical database design. It provides a highly integrated set of analysis and design transformation tools, ranging from templates for data element definition and a spreadsheet for defining functional dependencies, through normalization, to a logical design generator.
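
    The functional-dependency reasoning that underlies such normalization support can be illustrated with the standard attribute-closure algorithm; a sketch in Python (rather than Ada), purely for illustration:

        def closure(attrs, fds):
            """Attribute closure under a set of functional dependencies.
            `fds` is a list of (lhs, rhs) pairs of attribute sets."""
            result = set(attrs)
            changed = True
            while changed:
                changed = False
                for lhs, rhs in fds:
                    if set(lhs) <= result and not set(rhs) <= result:
                        result |= set(rhs)
                        changed = True
            return result

        # R(A, B, C) with A -> B and B -> C: the closure of {A} is all of R,
        # so A is a candidate key.
        fds = [({"A"}, {"B"}), ({"B"}, {"C"})]
        print(closure({"A"}, fds))   # {'A', 'B', 'C'}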

  11. Uniform Data Access Using GXD

    NASA Technical Reports Server (NTRS)

    Vanderbilt, Peter

    1999-01-01

    This paper gives an overview of GXD, a framework facilitating publication and use of data from diverse data sources. GXD defines an object-oriented data model designed to represent a wide range of things including data, its metadata, resources, and query results. GXD also defines a data transport language, a dialect of XML, for representing instances of the data model. This language allows for a wide range of data source implementations by supporting both the direct incorporation of data and the specification of data by various rules. The GXD software library, prototyped in Java, includes client and server runtimes. The server runtime facilitates the generation of entities containing data encoded in the GXD transport language. The GXD client runtime interprets these entities (potentially from many data sources) to create an illusion of a globally interconnected data space, one that is independent of data source location and implementation.

  12. Data management support for selected climate data sets using the climate data access system

    NASA Technical Reports Server (NTRS)

    Reph, M. G.

    1983-01-01

    The functional capabilities of the Goddard Space Flight Center (GSFC) Climate Data Access System (CDAS), an interactive data storage and retrieval system, and the archival data sets which this system manages are discussed. The CDAS manages several climate-related data sets, such as the First Global Atmospheric Research Program (GARP) Global Experiment (FGGE) Level 2-b and Level 3-a data tapes. CDAS data management support consists of three basic functions: (1) an inventory capability which allows users to search or update a disk-resident inventory describing the contents of each tape in a data set, (2) a capability to depict graphically the spatial coverage of a tape in a data set, and (3) a data set selection capability which allows users to extract portions of a data set using criteria such as time, location, and data source/parameter and output the data to tape, user terminal, or system printer. This report includes figures that illustrate menu displays and output listings for each CDAS function.

  13. Sharing Health Big Data for Research - A Design by Use Cases: The INSHARE Platform Approach.

    PubMed

    Bouzillé, Guillaume; Westerlynck, Richard; Defossez, Gautier; Bouslimi, Dalel; Bayat, Sahar; Riou, Christine; Busnel, Yann; Le Guillou, Clara; Cauvin, Jean-Michel; Jacquelinet, Christian; Pladys, Patrick; Oger, Emmanuel; Stindel, Eric; Ingrand, Pierre; Coatrieux, Gouenou; Cuggia, Marc

    2017-01-01

    Sharing and exploiting Health Big Data (HBD) requires tackling several challenges: data protection and governance, taking into account legal, ethical, and deontological aspects, to enable a trusted, transparent, and win-win relationship between researchers, citizens, and data providers; and a lack of interoperability, since data are compartmentalized and syntactically and semantically heterogeneous. The INSHARE project explores, through an experimental proof of concept, how recent technologies can overcome such issues. With six data providers, the platform was designed in three steps: (1) analyze use cases, needs, and requirements; (2) define the data sharing governance and secure access to the platform; and (3) define the platform specifications. Three use cases, drawn from 5 studies and 11 data sources, were analyzed to design the platform. The governance was derived from the SCANNER model and adapted to data sharing. The platform architecture integrates data repository and hosting, semantic integration services, data processing, aggregate computing, data quality and integrity monitoring, ID linking, a multisource query builder, visualization and data export services, data governance, a study management service, and security including data watermarking.

  14. Data publication with the structural biology data grid supports live analysis

    DOE PAGES

    Meyer, Peter A.; Socias, Stephanie; Key, Jason; ...

    2016-03-07

    Access to experimental X-ray diffraction image data is fundamental for validation and reproduction of macromolecular models and indispensable for development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is used to develop support for other types of biomedical data sets. In conclusion, it is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis.

  15. Data publication with the structural biology data grid supports live analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Peter A.; Socias, Stephanie; Key, Jason

    Access to experimental X-ray diffraction image data is fundamental for validation and reproduction of macromolecular models and indispensable for development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is used to develop support for other types of biomedical data sets. In conclusion, it is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis.

  16. [Accuracy improvement of spectral classification of crop using microwave backscatter data].

    PubMed

    Jia, Kun; Li, Qiang-Zi; Tian, Yi-Chen; Wu, Bing-Fang; Zhang, Fei-Fei; Meng, Ji-Hua

    2011-02-01

    In the present study, the use of VV-polarization microwave backscatter data to improve the accuracy of spectral crop classification is investigated. Classification accuracies obtained with different classifiers on the fused HJ satellite multi-spectral and Envisat ASAR VV backscatter data are compared. The results indicate that the fused data take full advantage of the spectral information in the HJ multi-spectral data and the structure-sensitive character of the ASAR VV-polarization data. The fusion enlarges the spectral differences among classes and improves crop classification accuracy: classification accuracy using the fused data is about 5 percent higher than with the HJ data alone. Furthermore, ASAR VV-polarization data are sensitive to non-agrarian areas within planted fields, and including VV-polarization data in the classification can effectively distinguish field borders. Combining VV-polarization data with multi-spectral data for crop classification broadens the application of satellite data and has potential for wide use in agriculture.
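
    A sketch of the layer-stacking style of data fusion described above, using synthetic stand-in arrays and an arbitrary off-the-shelf classifier (the paper compares several classifiers; none of the specifics below are from it):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        # Synthetic stand-ins: 4 HJ multi-spectral bands plus 1 ASAR VV backscatter
        # value per pixel, with class labels from hypothetical ground-truth polygons.
        hj_bands = rng.random((1000, 4))
        asar_vv = rng.random((1000, 1))
        labels = rng.integers(0, 3, 1000)

        fused = np.hstack([hj_bands, asar_vv])   # simple layer-stacking fusion

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(fused, labels)                   # same classifier, richer features
        print(clf.score(fused, labels))          # training accuracy on the toy data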

  17. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip-based high-throughput technologies. Software tools for the joint analysis of such high-dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomic and clinical data.

  18. An interdisciplinary analysis of multispectral satellite data for selected cover types in the Colorado Mountains, using automatic data processing techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1975-01-01

    The author has reported the following significant results. A data set containing SKYLAB, LANDSAT, and topographic data has been overlayed, registered, and geometrically corrected to a scale of 1:24,000. After geometrically correcting both sets of data, the SKYLAB data were overlayed on the LANDSAT data. Digital topographic data were then obtained, reformatted, and a data channel containing elevation information was then digitally overlayed onto the LANDSAT and SKYLAB spectral data. The 14,039 square kilometers involving 2,113,776 LANDSAT pixels represents a relatively large data set available for digital analysis. The overlayed data set enables investigators to numerically analyze and compare two sources of spectral data and topographic data from any point in the scene. This capability is new and it will permit a numerical comparison of spectral response with elevation, slope, and aspect. Utilization of the spectral and topographic data together to obtain more accurate classifications of the various cover types present is feasible.

  19. Data publication with the structural biology data grid supports live analysis.

    PubMed

    Meyer, Peter A; Socias, Stephanie; Key, Jason; Ransey, Elizabeth; Tjon, Emily C; Buschiazzo, Alejandro; Lei, Ming; Botka, Chris; Withrow, James; Neau, David; Rajashankar, Kanagalaghatta; Anderson, Karen S; Baxter, Richard H; Blacklow, Stephen C; Boggon, Titus J; Bonvin, Alexandre M J J; Borek, Dominika; Brett, Tom J; Caflisch, Amedeo; Chang, Chung-I; Chazin, Walter J; Corbett, Kevin D; Cosgrove, Michael S; Crosson, Sean; Dhe-Paganon, Sirano; Di Cera, Enrico; Drennan, Catherine L; Eck, Michael J; Eichman, Brandt F; Fan, Qing R; Ferré-D'Amaré, Adrian R; Fromme, J Christopher; Garcia, K Christopher; Gaudet, Rachelle; Gong, Peng; Harrison, Stephen C; Heldwein, Ekaterina E; Jia, Zongchao; Keenan, Robert J; Kruse, Andrew C; Kvansakul, Marc; McLellan, Jason S; Modis, Yorgo; Nam, Yunsun; Otwinowski, Zbyszek; Pai, Emil F; Pereira, Pedro José Barbosa; Petosa, Carlo; Raman, C S; Rapoport, Tom A; Roll-Mecak, Antonina; Rosen, Michael K; Rudenko, Gabby; Schlessinger, Joseph; Schwartz, Thomas U; Shamoo, Yousif; Sondermann, Holger; Tao, Yizhi J; Tolia, Niraj H; Tsodikov, Oleg V; Westover, Kenneth D; Wu, Hao; Foster, Ian; Fraser, James S; Maia, Filipe R N C; Gonen, Tamir; Kirchhausen, Tom; Diederichs, Kay; Crosas, Mercè; Sliz, Piotr

    2016-03-07

    Access to experimental X-ray diffraction image data is fundamental for validation and reproduction of macromolecular models and indispensable for development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is used to develop support for other types of biomedical data sets. It is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis.

  20. Data publication with the structural biology data grid supports live analysis

    PubMed Central

    Meyer, Peter A.; Socias, Stephanie; Key, Jason; Ransey, Elizabeth; Tjon, Emily C.; Buschiazzo, Alejandro; Lei, Ming; Botka, Chris; Withrow, James; Neau, David; Rajashankar, Kanagalaghatta; Anderson, Karen S.; Baxter, Richard H.; Blacklow, Stephen C.; Boggon, Titus J.; Bonvin, Alexandre M. J. J.; Borek, Dominika; Brett, Tom J.; Caflisch, Amedeo; Chang, Chung-I; Chazin, Walter J.; Corbett, Kevin D.; Cosgrove, Michael S.; Crosson, Sean; Dhe-Paganon, Sirano; Di Cera, Enrico; Drennan, Catherine L.; Eck, Michael J.; Eichman, Brandt F.; Fan, Qing R.; Ferré-D'Amaré, Adrian R.; Christopher Fromme, J.; Garcia, K. Christopher; Gaudet, Rachelle; Gong, Peng; Harrison, Stephen C.; Heldwein, Ekaterina E.; Jia, Zongchao; Keenan, Robert J.; Kruse, Andrew C.; Kvansakul, Marc; McLellan, Jason S.; Modis, Yorgo; Nam, Yunsun; Otwinowski, Zbyszek; Pai, Emil F.; Pereira, Pedro José Barbosa; Petosa, Carlo; Raman, C. S.; Rapoport, Tom A.; Roll-Mecak, Antonina; Rosen, Michael K.; Rudenko, Gabby; Schlessinger, Joseph; Schwartz, Thomas U.; Shamoo, Yousif; Sondermann, Holger; Tao, Yizhi J.; Tolia, Niraj H.; Tsodikov, Oleg V.; Westover, Kenneth D.; Wu, Hao; Foster, Ian; Fraser, James S.; Maia, Filipe R. N C.; Gonen, Tamir; Kirchhausen, Tom; Diederichs, Kay; Crosas, Mercè; Sliz, Piotr

    2016-01-01

    Access to experimental X-ray diffraction image data is fundamental for validation and reproduction of macromolecular models and indispensable for development of structural biology processing methods. Here, we established a diffraction data publication and dissemination system, Structural Biology Data Grid (SBDG; data.sbgrid.org), to preserve primary experimental data sets that support scientific publications. Data sets are accessible to researchers through a community driven data grid, which facilitates global data access. Our analysis of a pilot collection of crystallographic data sets demonstrates that the information archived by SBDG is sufficient to reprocess data to statistics that meet or exceed the quality of the original published structures. SBDG has extended its services to the entire community and is used to develop support for other types of biomedical data sets. It is anticipated that access to the experimental data sets will enhance the paradigm shift in the community towards a much more dynamic body of continuously improving data analysis. PMID:26947396

  1. The Lure of Statistics in Data Mining

    ERIC Educational Resources Information Center

    Grover, Lovleen Kumar; Mehra, Rajni

    2008-01-01

    The field of Data Mining like Statistics concerns itself with "learning from data" or "turning data into information". For statisticians the term "Data mining" has a pejorative meaning. Instead of finding useful patterns in large volumes of data as in the case of Statistics, data mining has the connotation of searching for data to fit preconceived…

  2. 14 CFR 121.346 - Flight data recorders: filtered data.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Flight data recorders: filtered data. 121... § 121.346 Flight data recorders: filtered data. (a) A flight data signal is filtered when an original... sensor signal value can be reconstructed from the recorded data. This demonstration requires that: (i...

  3. 14 CFR 135.156 - Flight data recorders: filtered data.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Flight data recorders: filtered data. 135... Aircraft and Equipment § 135.156 Flight data recorders: filtered data. (a) A flight data signal is filtered... original sensor signal value can be reconstructed from the recorded data. This demonstration requires that...

  4. Format conversion between CAD data and GIS data based on ArcGIS

    NASA Astrophysics Data System (ADS)

    Xie, Qingqing; Wei, Bo; Zhang, Kailin; Wang, Zhichao

    2015-12-01

    To make full use of data resources and realize sharing of different types of data across industries, a method of format conversion between CAD data and GIS data based on ArcGIS was proposed. To keep the integrity of the converted data, some key steps to process the CAD data before conversion were carried out in AutoCAD. For example, deleting unnecessary elements such as the title, border, and legend avoided the appearance of unnecessary elements after conversion, and re-layering the data according to a national standard avoided different types of elements appearing in the same layer after conversion. In ArcGIS, converting CAD data to GIS data was executed by the correspondence of graphic element classification between AutoCAD and ArcGIS. In addition, an empty geographic database and feature set had to be created in ArcGIS to store the text data of the CAD data. The experimental results show that the proposed method avoids a large amount of editing work in data conversion and maintains the integrity of spatial data and attribute data before and after conversion.
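
    A minimal open-source sketch of the same layer-to-feature-class conversion idea, assuming the drawing was already cleaned in AutoCAD as described; the file names, layer name, and EPSG code are placeholders, and the paper's own workflow uses ArcGIS rather than these tools:

        import geopandas as gpd

        cad = gpd.read_file("survey.dxf")        # GDAL's DXF driver keeps a 'Layer' column
        roads = cad[cad["Layer"] == "ROADS"]     # one CAD layer -> one GIS feature class
        roads = roads.set_crs(epsg=4547)         # declare the coordinate system the CAD used
        roads.to_file("roads.shp")               # geometry and attributes both carried over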

  5. Authoring Data-Driven Videos with DataClips.

    PubMed

    Amini, Fereshteh; Riche, Nathalie Henry; Lee, Bongshin; Monroy-Hernandez, Andres; Irani, Pourang

    2017-01-01

    Data videos, or short data-driven motion graphics, are an increasingly popular medium for storytelling. However, creating data videos is difficult as it involves pulling together a unique combination of skills. We introduce DataClips, an authoring tool aimed at lowering the barriers to crafting data videos. DataClips allows non-experts to assemble data-driven "clips" together to form longer sequences. We constructed the library of data clips by analyzing the composition of over 70 data videos produced by reputable sources such as The New York Times and The Guardian. We demonstrate that DataClips can reproduce over 90% of our data videos corpus. We also report on a qualitative study comparing the authoring process and outcome achieved by (1) non-experts using DataClips, and (2) experts using Adobe Illustrator and After Effects to create data-driven clips. Results indicated that non-experts are able to learn and use DataClips with a short training period. In the span of one hour, they were able to produce more videos than experts using a professional editing tool, and their clips were rated similarly by an independent audience.

  6. The Role of NOAA's National Data Centers in the Earth and Space Science Infrastructure

    NASA Astrophysics Data System (ADS)

    Fox, C. G.

    2008-12-01

    NOAA's National Data Centers (NNDC) provide access to long-term archives of environmental data from NOAA and other sources. The NNDCs face significant challenges in the volume and complexity of modern data sets. Data volume challenges are being addressed using more capable data archive systems such as the Comprehensive Large Array-Data Stewardship System (CLASS). Challenges in assuring data quality and stewardship are in many ways harder to address. In the past, scientists at the Data Centers could provide reasonable stewardship of data sets in their area of expertise. As staff levels have decreased and data complexity has increased, Data Centers depend on their data providers and user communities to provide high-quality metadata and feedback on data problems and improvements. This relationship requires strong partnerships between the NNDCs and academic, commercial, and international partners, as well as advanced data management and access tools that conform to established international standards when available. The NNDCs are looking to geospatial databases, interactive mapping, web services, and other Application Program Interface approaches to help preserve NNDC data and information and to make it easily available to the scientific community.

  7. Method of and apparatus for generating an interstitial point in a data stream having an even number of data points

    NASA Technical Reports Server (NTRS)

    Edwards, T. R. (Inventor)

    1985-01-01

    Apparatus for doubling the data density rate of an analog to digital converter or doubling the data density storage capacity of a memory device is discussed. An interstitial data point midway between adjacent data points in a data stream having an even number of equal interval data points is generated by applying a set of predetermined one-dimensional convolute integer coefficients, which can include a set of multiplier coefficients and a normalizer coefficient. Interpolator means apply the coefficients to the data points, weighting equally on each side of the center of the even number of equal interval data points, to obtain an interstitial point value at the center of the data points. A one-dimensional output data set, which is twice as dense as a one-dimensional equal interval input data set, can be generated where the output data set includes interstitial points interdigitated between adjacent data points in the input data set. The method for generating the set of interstitial points is a weighted, nearest-neighbor, non-recursive, moving, smoothing averaging technique, equivalent to applying a polynomial regression calculation to the data set.
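
    A sketch of midpoint generation with one common set of one-dimensional convolute integer coefficients, (-1, 9, 9, -1) with normalizer 16, which is the 4-point cubic midpoint interpolator; the patent's exact coefficient sets are not reproduced here:

        def interstitial(points):
            """Double the data density by inserting a midpoint between each pair
            of adjacent equal-interval samples."""
            out = [points[0]]
            for i in range(len(points) - 1):
                if 0 < i < len(points) - 2:
                    a, b, c, d = points[i - 1], points[i], points[i + 1], points[i + 2]
                    mid = (-a + 9 * b + 9 * c - d) / 16.0   # weighted nearest-neighbor average
                else:
                    mid = (points[i] + points[i + 1]) / 2.0  # linear fallback at the ends
                out.extend([mid, points[i + 1]])
            return out

        print(interstitial([0.0, 1.0, 4.0, 9.0, 16.0]))
        # interior midpoints 2.25 and 6.25 match the underlying x**2 exactly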

  8. FRIEDA: Flexible Robust Intelligent Elastic Data Management Framework

    DOE PAGES

    Ghoshal, Devarshi; Hendrix, Valerie; Fox, William; ...

    2017-02-01

    Scientific applications are increasingly using cloud resources for their data analysis workflows. However, managing data effectively and efficiently over these cloud resources is challenging due to the myriad storage choices with different performance, cost trade-offs, complex application choices and complexity associated with elasticity, failure rates in these environments. The different data access patterns for data-intensive scientific applications require a more flexible and robust data management solution than the ones currently in existence. FRIEDA is a Flexible Robust Intelligent Elastic Data Management framework that employs a range of data management strategies in cloud environments. FRIEDA can manage storage and data lifecycle of applications in cloud environments. There are four different stages in the data management lifecycle of FRIEDA – (i) storage planning, (ii) provisioning and preparation, (iii) data placement, and (iv) execution. FRIEDA defines a data control plane and an execution plane. The data control plane defines the data partition and distribution strategy, whereas the execution plane manages the execution of the application using a master-worker paradigm. FRIEDA also provides different data management strategies, either to partition the data in real-time, or predetermine the data partitions prior to application execution.

  9. Nonlinear analysis of EEG for epileptic seizures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hively, L.M.; Clapp, N.E.; Daw, C.S.

    1995-04-01

    We apply chaotic time series analysis (CTSA) to human electroencephalogram (EEG) data. Three epochs were examined: epileptic seizure, non-seizure, and transition from non-seizure to seizure. The CTSA tools were applied to four forms of these data: raw EEG data (e-data); artifact data (f-data), obtained by applying a quadratic zero-phase filter to the raw data; artifact-filtered data (g-data), the residual after subtracting f-data from e-data; and a low-pass-filtered version (h-data) of g-data. Two different seizures were analyzed for the same patient. Several nonlinear measures uniquely indicate an epileptic seizure in both cases, including an abrupt decrease in the time per wave cycle in f-data and an abrupt increase in the Kolmogorov entropy and in the correlation dimension for e-h data. The transition from normal to seizure state also is characterized by distinctly different trends in the nonlinear measures for each seizure, which may be potential seizure predictors for this patient. Surrogate analysis of e-data shows that statistically significant nonlinear structure is present during the non-seizure, transition, and seizure epochs.
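
    A sketch of how the four data forms relate, using stand-in filters: a zero-phase Savitzky-Golay smoother with a quadratic polynomial stands in for the paper's quadratic zero-phase filter, and the low-pass cutoff and sampling rate are arbitrary; the paper's actual filter parameters are not given here.

        import numpy as np
        from scipy.signal import savgol_filter, butter, filtfilt

        rng = np.random.default_rng(0)
        e_data = rng.normal(size=5000)          # stand-in raw EEG trace (e-data)

        # f-data: the artifact estimate from a zero-phase quadratic smoothing filter.
        f_data = savgol_filter(e_data, window_length=101, polyorder=2)

        # g-data: artifact-filtered residual.
        g_data = e_data - f_data

        # h-data: low-pass-filtered version of g-data.
        b, a = butter(4, 50, btype="low", fs=250.0)
        h_data = filtfilt(b, a, g_data)         # filtfilt applies the filter with zero phase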

  10. FRIEDA: Flexible Robust Intelligent Elastic Data Management Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghoshal, Devarshi; Hendrix, Valerie; Fox, William

    Scientific applications are increasingly using cloud resources for their data analysis workflows. However, managing data effectively and efficiently over these cloud resources is challenging due to the myriad storage choices with different performance, cost trade-offs, complex application choices and complexity associated with elasticity, failure rates in these environments. The different data access patterns for data-intensive scientific applications require a more flexible and robust data management solution than the ones currently in existence. FRIEDA is a Flexible Robust Intelligent Elastic Data Management framework that employs a range of data management strategies in cloud environments. FRIEDA can manage storage and data lifecycle of applications in cloud environments. There are four different stages in the data management lifecycle of FRIEDA – (i) storage planning, (ii) provisioning and preparation, (iii) data placement, and (iv) execution. FRIEDA defines a data control plane and an execution plane. The data control plane defines the data partition and distribution strategy, whereas the execution plane manages the execution of the application using a master-worker paradigm. FRIEDA also provides different data management strategies, either to partition the data in real-time, or predetermine the data partitions prior to application execution.

  11. Modern data science for analytical chemical data - A comprehensive review.

    PubMed

    Szymańska, Ewa

    2018-10-22

    Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by chemometrics but also by other data scientific communities to extract relevant information from big datasets and provide their value to different applications. Besides common goal of big data analysis, different perspectives and terms on big data are being discussed in scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data scientific fields together with their data type-specific and generic challenges. Firstly, common data science terms used in different data scientific fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects like assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets shortly discussed.

  12. Individual Data Linkage of Survey Data with Claims Data in Germany—An Overview Based on a Cohort Study

    PubMed Central

    March, Stefanie

    2017-01-01

    Research based on health insurance data has a long tradition in Germany. By contrast, data linkage of survey data with such claims data is a relatively new field of research with high potential. Data linkage opens up new opportunities for analyses in the field of health services research and public health. Germany has comprehensive rules and regulations of data protection that have to be followed; therefore, written informed consent is needed for individual data linkage. Additionally, the health system is characterized by heterogeneity of health insurance. The lidA-living at work-study is a cohort study on work, age and health, which linked survey data with claims data from a large number of statutory health insurance funds. All health insurance funds for which written consent was given were contacted. This paper gives an overview of individual data linkage of survey data with German claims data, using the lidA-study results as an example. The challenges and limitations of data linkage are presented. Despite the heterogeneity, such studies are possible with a negligibly small influence of bias. The experience gained in lidA provides important insights for other studies focusing on data linkage. PMID:29232834

  13. Individual Data Linkage of Survey Data with Claims Data in Germany-An Overview Based on a Cohort Study.

    PubMed

    March, Stefanie

    2017-12-09

    Research based on health insurance data has a long tradition in Germany. By contrast, data linkage of survey data with such claims data is a relatively new field of research with high potential. Data linkage opens up new opportunities for analyses in the field of health services research and public health. Germany has comprehensive rules and regulations of data protection that have to be followed; therefore, written informed consent is needed for individual data linkage. Additionally, the health system is characterized by heterogeneity of health insurance. The lidA-living at work-study is a cohort study on work, age and health, which linked survey data with claims data from a large number of statutory health insurance funds. All health insurance funds for which written consent was given were contacted. This paper gives an overview of individual data linkage of survey data with German claims data, using the lidA-study results as an example. The challenges and limitations of data linkage are presented. Despite the heterogeneity, such studies are possible with a negligibly small influence of bias. The experience gained in lidA provides important insights for other studies focusing on data linkage.

  14. Simple, Script-Based Science Processing Archive

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Hegde, Mahabaleshwara; Barth, C. Wrandle

    2007-01-01

    The Simple, Scalable, Script-based Science Processing (S4P) Archive (S4PA) is a disk-based archival system for remote sensing data. It is based on the data-driven framework of S4P and is used for data transfer, data preprocessing, metadata generation, data archive, and data distribution. New data are automatically detected by the system. S4P provides services such as data access control, data subscription, metadata publication, data replication, and data recovery. It comprises scripts that control the data flow. The system detects the availability of data on an FTP (file transfer protocol) server, initiates data transfer, preprocesses data if necessary, and archives it on readily available disk drives with FTP and HTTP (Hypertext Transfer Protocol) access, allowing instantaneous data access. There are options for plug-ins for data preprocessing before storage. Publication of metadata to external applications such as the Earth Observing System Clearinghouse (ECHO) is also supported. S4PA includes a graphical user interface for monitoring the system operation and a tool for deploying the system. To ensure reliability, S4P continuously checks stored data for integrity. Further reliability is provided by tape backups of disks made once a disk partition is full and closed. The system is designed for low maintenance, requiring minimal operator oversight.
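
    A minimal sketch of the data-driven trigger at the core of such a system: poll an FTP directory, fetch files not seen before, and let downstream processing key off the new arrivals. The host and directory names are placeholders, not S4PA configuration.

        import ftplib
        import os

        def poll_ftp(host, remote_dir, seen, local_dir="archive"):
            """Fetch files that have appeared since the last poll."""
            os.makedirs(local_dir, exist_ok=True)
            ftp = ftplib.FTP(host)
            ftp.login()                              # anonymous login
            ftp.cwd(remote_dir)
            for name in ftp.nlst():
                if name not in seen:
                    with open(os.path.join(local_dir, name), "wb") as f:
                        ftp.retrbinary("RETR " + name, f.write)
                    seen.add(name)                   # downstream steps key off new files
            ftp.quit()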

  15. Application and Prospect of Big Data in Water Resources

    NASA Astrophysics Data System (ADS)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of developed information technology and affordable data storage, we have entered the era of data explosion. The term "Big Data" and the technology related to it have been created and commonly applied in many fields. However, academic attention to Big Data applications in water resources is recent, and as a result, water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying Big Data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical framework, but we define "Water Big Data" and explain its three-dimensional properties: the time dimension, the spatial dimension, and the intelligent dimension. Based on HBase, a classification system for Water Big Data is introduced: hydrology data, ecology data, and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies, such as data mining and web crawlers, are proposed. Finally, the prospect of applying Big Data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data Driven Decision) will be utilized more in water resources management in the future.

  16. Integrated platform and API for electrophysiological data

    PubMed Central

    Sobolev, Andrey; Stoewer, Adrian; Leonhardt, Aljoscha; Rautenberg, Philipp L.; Kellner, Christian J.; Garbers, Christian; Wachtler, Thomas

    2014-01-01

    Recent advancements in technology and methodology have led to growing amounts of increasingly complex neuroscience data recorded from various species, modalities, and levels of study. The rapid data growth has made efficient data access and flexible, machine-readable data annotation a crucial requisite for neuroscientists. Clear and consistent annotation and organization of data is not only an important ingredient for reproducibility of results and re-use of data, but also essential for collaborative research and data sharing. In particular, efficient data management and interoperability requires a unified approach that integrates data and metadata and provides a common way of accessing this information. In this paper we describe GNData, a data management platform for neurophysiological data. GNData provides a storage system based on a data representation that is suitable to organize data and metadata from any electrophysiological experiment, with a functionality exposed via a common application programming interface (API). Data representation and API structure are compatible with existing approaches for data and metadata representation in neurophysiology. The API implementation is based on the Representational State Transfer (REST) pattern, which enables data access integration in software applications and facilitates the development of tools that communicate with the service. Client libraries that interact with the API provide direct data access from computing environments like Matlab or Python, enabling integration of data management into the scientist's experimental or analysis routines. PMID:24795616
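
    Because the API follows the REST pattern, a client can be as simple as an HTTP GET; the base URL, endpoint path, and response fields below are illustrative assumptions, not the documented GNData resource names:

        import requests

        BASE = "https://data.example.org/api"    # hypothetical GNData deployment URL

        resp = requests.get(BASE + "/electrophysiology/segments",
                            params={"limit": 10}, timeout=30)
        resp.raise_for_status()
        for segment in resp.json().get("selected", []):
            print(segment.get("id"), segment.get("name"))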

  17. Comparison of property between two Viking Seismic tapes

    NASA Astrophysics Data System (ADS)

    Yamamoto, Y.; Yamada, R.

    2016-12-01

    The restoration of the seismometer data recorded on board Viking Lander 2 is still continuing. Originally, the data were processed and archived separately at MIT and UTIG, and each data set is accessible via the Internet today. The two archives use different file formats to store the data, but both are currently readable due to continuing investigation. However, there is some inconsistency between the two data sets, although most of the data are highly consistent. To understand the differences, knowledge of spacecraft archiving and off-line processing is required, because the differences were caused by the off-line processing. The data processing of spacecraft often requires merging and sorting of raw data: merge processing is normally performed to eliminate duplicated data, and sort processing is performed to fix the data order (see the sketch below). UTIG does not seem to have performed this merging and sorting, so the UTIG-processed data retain duplicates. The MIT-processed data underwent merging and sorting, but the raw data sometimes include wrong time tags, which cannot be strictly corrected by sorting. Also, the MIT-processed data have enough documentation to understand the metadata, while the UTIG data have only a brief instruction. Therefore, the MIT and UTIG data are treated as complementary, and a better data set can be established using both of them. In this presentation, we show the method to build a better data set of the Viking Lander 2 seismic data.
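
    A minimal sketch of the merge (de-duplication) and sort steps described above; note, as the abstract points out, that sorting cannot repair records whose time tags are themselves wrong:

        def merge_and_sort(records):
            """Merge (drop duplicate records) and sort by time tag.
            Each record is a (time_tag, payload) pair."""
            deduped = {}
            for time_tag, payload in records:
                deduped.setdefault(time_tag, payload)   # keep the first occurrence only
            return sorted(deduped.items())              # sorting cannot fix wrong time tags

        raw = [(3, "c"), (1, "a"), (1, "a"), (2, "b")]
        print(merge_and_sort(raw))   # [(1, 'a'), (2, 'b'), (3, 'c')]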

  18. Integrated platform and API for electrophysiological data.

    PubMed

    Sobolev, Andrey; Stoewer, Adrian; Leonhardt, Aljoscha; Rautenberg, Philipp L; Kellner, Christian J; Garbers, Christian; Wachtler, Thomas

    2014-01-01

    Recent advancements in technology and methodology have led to growing amounts of increasingly complex neuroscience data recorded from various species, modalities, and levels of study. The rapid data growth has made efficient data access and flexible, machine-readable data annotation a crucial requisite for neuroscientists. Clear and consistent annotation and organization of data is not only an important ingredient for reproducibility of results and re-use of data, but also essential for collaborative research and data sharing. In particular, efficient data management and interoperability requires a unified approach that integrates data and metadata and provides a common way of accessing this information. In this paper we describe GNData, a data management platform for neurophysiological data. GNData provides a storage system based on a data representation that is suitable to organize data and metadata from any electrophysiological experiment, with a functionality exposed via a common application programming interface (API). Data representation and API structure are compatible with existing approaches for data and metadata representation in neurophysiology. The API implementation is based on the Representational State Transfer (REST) pattern, which enables data access integration in software applications and facilitates the development of tools that communicate with the service. Client libraries that interact with the API provide direct data access from computing environments like Matlab or Python, enabling integration of data management into the scientist's experimental or analysis routines.

  19. SeaDataNet Pan-European infrastructure for Ocean & Marine Data Management

    NASA Astrophysics Data System (ADS)

    Manzella, G. M.; Maillard, C.; Maudire, G.; Schaap, D.; Rickards, L.; Nast, F.; Balopoulos, E.; Mikhailov, N.; Vladymyrov, V.; Pissierssens, P.; Schlitzer, R.; Beckers, J. M.; Barale, V.

    2007-12-01

    SEADATANET is developing a pan-European data management infrastructure to ensure access to a large number of marine environmental data (i.e. temperature, salinity, current, sea level, and chemical, physical, and biological properties) and to safeguard them through long-term archiving. Data are derived from many different sensors installed on board research vessels, on satellites, and on the various platforms of the marine observing system. SeaDataNet provides information on real-time and archived marine environmental data collected at a pan-European level, through directories on marine environmental data and projects. SeaDataNet provides access to the most comprehensive multidisciplinary sets of marine in-situ and remote sensing data, from about 40 laboratories, through user-friendly tools. Data selection and access are operated through the Common Data Index (CDI), XML files compliant with ISO standards and unified dictionaries. Technical developments carried out by SeaDataNet include: a library of standards - metadata standards, compliant with ISO 19115, for communication and interoperability between the data platforms; software for an interoperable on-line system - interconnection of distributed data centres by interfacing adapted communication technology tools; off-line data management software - software representing the minimum equipment of all the data centres, developed by AWI ("Ocean Data View (ODV)"); and training, education and capacity building - training 'on the job' is carried out by IOC-Unesco in Ostende, and the SeaDataNet Virtual Educational Centre internet portal provides basic tools for informal education.

  20. A scalable neuroinformatics data flow for electrophysiological signals using MapReduce.

    PubMed

    Jayapandian, Catherine; Wei, Annan; Ramesh, Priya; Zonjy, Bilal; Lhatoo, Samden D; Loparo, Kenneth; Zhang, Guo-Qiang; Sahoo, Satya S

    2015-01-01

    Data-driven neuroscience research is providing new insights in progression of neurological disorders and supporting the development of improved treatment approaches. However, the volume, velocity, and variety of neuroscience data generated from sophisticated recording instruments and acquisition methods have exacerbated the limited scalability of existing neuroinformatics tools. This makes it difficult for neuroscience researchers to effectively leverage the growing multi-modal neuroscience data to advance research in serious neurological disorders, such as epilepsy. We describe the development of the Cloudwave data flow that uses new data partitioning techniques to store and analyze electrophysiological signal in distributed computing infrastructure. The Cloudwave data flow uses MapReduce parallel programming algorithm to implement an integrated signal data processing pipeline that scales with large volume of data generated at high velocity. Using an epilepsy domain ontology together with an epilepsy focused extensible data representation format called Cloudwave Signal Format (CSF), the data flow addresses the challenge of data heterogeneity and is interoperable with existing neuroinformatics data representation formats, such as HDF5. The scalability of the Cloudwave data flow is evaluated using a 30-node cluster installed with the open source Hadoop software stack. The results demonstrate that the Cloudwave data flow can process increasing volume of signal data by leveraging Hadoop Data Nodes to reduce the total data processing time. The Cloudwave data flow is a template for developing highly scalable neuroscience data processing pipelines using MapReduce algorithms to support a variety of user applications.
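
    A toy map/reduce pass over signal windows, standing in for the Hadoop pipeline described above (real Cloudwave jobs run on a Hadoop cluster over CSF-formatted data; nothing below is from that implementation):

        from functools import reduce

        # Each input is (channel, window_of_samples); map emits per-window signal
        # energy, reduce aggregates energy per channel.
        windows = [("chan1", [1.0, 2.0, 3.0]), ("chan2", [2.0, 2.0]), ("chan1", [4.0])]

        mapped = [(channel, sum(x * x for x in samples)) for channel, samples in windows]

        def reducer(acc, kv):
            channel, energy = kv
            acc[channel] = acc.get(channel, 0.0) + energy
            return acc

        print(reduce(reducer, mapped, {}))   # {'chan1': 30.0, 'chan2': 8.0}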

  1. State-Level Immunization Information Systems: Potential for Childhood Immunization Data Linkages.

    PubMed

    Fuller, Jill E; Walter, Emmanuel B; Dole, Nancy; O'Hara, Richard; Herring, Amy H; Durkin, Maureen S; Specker, Bonny; Wey, Betty

    2017-01-01

    Objectives Sources of immunization data include state registries or immunization information systems (IIS), medical records, and surveys. Little is known about the quality of these data sources or the feasibility of using IIS data for research. We assessed the feasibility of collecting immunization information for a national children's health study by accessing existing IIS data and comparing the completeness of these data against medical record abstractions (MRA) and parent report. Staff time needed to obtain IIS and MRA data was assessed. Methods We administered a questionnaire to state-level IIS representatives to ascertain availability and completeness of their data for research and gather information about data formats. We evaluated quality of data from IIS, medical records, and reports from parents of 119 National Children's Study participants at three locations. Results IIS data were comparable to MRA data and both were more complete than parental report. Agreement between IIS and MRA data was greater than between parental report and MRA, suggesting IIS and MRA are better sources than parental report. Obtaining IIS data took less staff time than chart review, making IIS data linkage for research a preferred choice. Conclusions IIS survey results indicate data can be obtained by researchers using data linkages. IIS are an accessible and feasible child immunization information source and these registries reduce reliance on parental report or medical record abstraction. Researchers seeking to link IIS data with large multi-site studies should consider acquiring IIS data, but may need strategies to overcome barriers to data completeness and linkage.

  2. A scalable neuroinformatics data flow for electrophysiological signals using MapReduce

    PubMed Central

    Jayapandian, Catherine; Wei, Annan; Ramesh, Priya; Zonjy, Bilal; Lhatoo, Samden D.; Loparo, Kenneth; Zhang, Guo-Qiang; Sahoo, Satya S.

    2015-01-01

    Data-driven neuroscience research is providing new insights in progression of neurological disorders and supporting the development of improved treatment approaches. However, the volume, velocity, and variety of neuroscience data generated from sophisticated recording instruments and acquisition methods have exacerbated the limited scalability of existing neuroinformatics tools. This makes it difficult for neuroscience researchers to effectively leverage the growing multi-modal neuroscience data to advance research in serious neurological disorders, such as epilepsy. We describe the development of the Cloudwave data flow that uses new data partitioning techniques to store and analyze electrophysiological signal in distributed computing infrastructure. The Cloudwave data flow uses MapReduce parallel programming algorithm to implement an integrated signal data processing pipeline that scales with large volume of data generated at high velocity. Using an epilepsy domain ontology together with an epilepsy focused extensible data representation format called Cloudwave Signal Format (CSF), the data flow addresses the challenge of data heterogeneity and is interoperable with existing neuroinformatics data representation formats, such as HDF5. The scalability of the Cloudwave data flow is evaluated using a 30-node cluster installed with the open source Hadoop software stack. The results demonstrate that the Cloudwave data flow can process increasing volume of signal data by leveraging Hadoop Data Nodes to reduce the total data processing time. The Cloudwave data flow is a template for developing highly scalable neuroscience data processing pipelines using MapReduce algorithms to support a variety of user applications. PMID:25852536

  3. Data Elevator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BYNA, SUNRENDRA; DONG, BIN; WU, KESHENG

    Data Elevator: Efficient Asynchronous Data Movement in Hierarchical Storage Systems. Multi-layer storage subsystems, including SSD-based burst buffers and disk-based parallel file systems (PFS), are becoming part of HPC systems. However, software for this storage hierarchy is still in its infancy. Applications may have to explicitly move data among the storage layers. We propose Data Elevator for transparently and efficiently moving data between a burst buffer and a PFS. Users specify the final destination for their data, typically on the PFS; Data Elevator intercepts the I/O calls, stages data on the burst buffer, and then asynchronously transfers the data to their final destination in the background. This system allows extensive optimizations, such as overlapping read and write operations, choosing I/O modes, and aligning buffer boundaries. In tests with large-scale scientific applications, Data Elevator is as much as 4.2X faster than Cray DataWarp, the state-of-the-art software for burst buffers, and 4X faster than directly writing to the PFS. The Data Elevator library uses HDF5's Virtual Object Layer (VOL) for intercepting parallel I/O calls that write data to the PFS. The intercepted calls are redirected to the Data Elevator, which provides a handle to write the file in a faster, intermediate burst buffer system. Once the application finishes writing the data to the burst buffer, the Data Elevator job uses HDF5 to move the data to the final destination in an asynchronous manner. Hence, the Data Elevator library is currently useful for applications that call HDF5 for writing data files. Also, the Data Elevator depends on the HDF5 VOL functionality.
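
    The pattern in miniature: write to a fast staging tier, then migrate to the final destination in the background. This sketch uses plain h5py and a thread, whereas the real system intercepts HDF5 calls through the VOL layer; the paths and dataset name are placeholders.

        import shutil
        import threading
        import h5py
        import numpy as np

        def staged_write(data, staging_path, final_path):
            """Write to a fast staging tier (a burst buffer, in the real system),
            then move the file to its final destination in a background thread."""
            with h5py.File(staging_path, "w") as f:
                f.create_dataset("fields/temperature", data=data)
            mover = threading.Thread(target=shutil.move, args=(staging_path, final_path))
            mover.start()              # the application continues without waiting
            return mover

        t = staged_write(np.zeros((256, 256)), "staging.h5", "final.h5")
        t.join()                       # wait only when the final copy must exist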

  4. Supporting Snow Research: SnowEx Data and Services at the NASA National Snow and Ice Data Center DAAC

    NASA Astrophysics Data System (ADS)

    Leon, A.; Tanner, S.; Deems, J. S.

    2017-12-01

    The National Snow and Ice Data Center Distributed Active Archive Center (NSIDC DAAC), part of the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder, will archive and distribute all primary data sets collected during the NASA SnowEx campaigns. NSIDC DAAC's overarching goal for SnowEx data management is to steward the diverse SnowEx data sets to provide a reliable long-term archive, to enable effective data discovery, retrieval, and usage, and to support end user engagement. This goal will be achieved though coordination and collaboration with SnowEx project management and investigators. NSIDC DAAC's core functions for SnowEx data management include: Data Creation: Advise investigators on data formats and structure as well as metadata creation and content to enable preservation, usability, and discoverability. Data Documentation: Develop comprehensive data set documentation describing the instruments, data collection and derivation methods, and data file contents. Data Distribution: Provide discovery and access through NSIDC and NASA data portals to make SnowEx data available to a broad user community Data & User Support: Assist user communities with the selection and usage of SnowEx data products. In an effort to educate and broaden the SnowEx user community, we will present an overview of the SnowEx data products, tools, and services which will be available at the NSIDC DAAC. We hope to gain further insight into how the DAAC can enable the user community to seamlessly and effectively utilize SnowEx data in their research and applications.

  5. Opportunities and challenges in conducting secondary analysis of HIV programmes using data from routine health information systems and personal health information.

    PubMed

    Gloyd, Stephen; Wagenaar, Bradley H; Woelk, Godfrey B; Kalibala, Samuel

    2016-01-01

    HIV programme data from routine health information systems (RHIS) and personal health information (PHI) provide ample opportunities for secondary data analysis. However, these data pose unique opportunities and challenges for use in health system monitoring, along with process and impact evaluations. Analyses focused on retrospective case reviews of four of the HIV-related studies published in this JIAS supplement. We identify specific opportunities and challenges with respect to the secondary analysis of RHIS and PHI data. Challenges working with both HIV-related RHIS and PHI included missing, inconsistent and implausible data; rapidly changing indicators; systematic differences in the utilization of services; and patient linkages over time and different data sources. Specific challenges among RHIS data included numerous registries and indicators, inconsistent data entry, gaps in data transmission, duplicate registry of information, numerator-denominator incompatibility and infrequent use of data for decision-making. Challenges specific to PHI included the time burden for busy providers, the culture of lax charting, overflowing archives for paper charts and infrequent chart review. Many of the challenges that undermine effective use of RHIS and PHI data for analyses are related to the processes and context of collecting the data, excessive data requirements, lack of knowledge of the purpose of data and the limited use of data among those generating the data. Recommendations include simplifying data sources, analysis and reporting; conducting systematic data quality audits; enhancing the use of data for decision-making; promoting routine chart review linked with simple patient tracking systems; and encouraging open access to RHIS and PHI data for increased use.

  6. Data and Data Products for Climate Research: Web Services at the Asia-Pacific Data-Research Center (APDRC)

    NASA Astrophysics Data System (ADS)

    DeCarlo, S.; Potemra, J. T.; Wang, K.

    2012-12-01

    The International Pacific Research Center (IPRC) at the University of Hawaii maintains a data center for climate studies called the Asia-Pacific Data-Research Center (APDRC). This data center was designed within a center of excellence in climate research with the intention of serving the needs of the research scientist. The APDRC provides easy access to a wide collection of climate data and data products for a wide variety of users. The data center maintains an archive of approximately 100 data sets including in-situ and remote data, as well as a range of model-based output. All data are available via on-line browsing tools such as a Live Access Server (LAS) and DChart, and direct binary access is available through OPeNDAP services. On-line tutorials on how to use these services are now available. Users can keep up-to-date with new data and product announcements via the APDRC facebook page. The main focus of the APDRC has been climate scientists, and the services are therefore streamlined to such users, both in the number and types of data served, but also in the way data are served. In addition, due to the integration of the APDRC within the IPRC, several value-added data products (see figure for an example using Argo floats) have been developed via a variety of research activities. The APDRC, therefore, has three main foci: 1. acquisition of climate-related data, 2. maintenance of integrated data servers, and 3. development and distribution of data products. The APDRC can be found at http://apdrc.soest.hawaii.edu. The presentation will provide an overview along with specific examples of the data, data products and data services available at the APDRC.
    [Figure: APDRC product example - a gridded field from Argo profiling floats]
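
    Direct binary access through OPeNDAP means a dataset can be opened much like a local file; the URL path and variable name below are hypothetical placeholders, with real endpoints listed in the APDRC catalog:

        from netCDF4 import Dataset

        # Hypothetical endpoint and variable name; actual dataset URLs are listed
        # in the APDRC catalog at http://apdrc.soest.hawaii.edu.
        url = "http://apdrc.soest.hawaii.edu/dods/public_data/example_dataset"
        ds = Dataset(url)                   # netCDF4 opens OPeNDAP URLs directly
        sst = ds.variables["sst"][0, :, :]  # only the requested slice crosses the network
        print(sst.shape)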

  7. Opening Health Data: What Do Researchers Want? Early Experiences With New York's Open Health Data Platform.

    PubMed

    Martin, Erika G; Helbig, Natalie; Birkhead, Guthrie S

    2015-01-01

    Governments are rapidly developing open data platforms to improve transparency and make information more accessible. New York is a leader, with currently the only state platform devoted to health. Although these platforms could build public health departments' capabilities to serve more researchers, agencies have little guidance on releasing meaningful and usable data. Structured focus groups with researchers and practitioners collected stakeholder feedback on potential uses of open health data and New York's open data strategy. Researchers and practitioners attended a 1-day November 2013 workshop on New York State's open health data resources. After learning about the state's open data platform and vision for open health data, participants were organized into 7 focus groups to discuss the essential elements of open data sets, practical challenges to obtaining and using health data, and potential uses of open data. Participants included 33 quantitative health researchers from State University of New York campuses and private partners and 10 practitioners from the New York State Department of Health. There was low awareness of open data, with 67% of researchers reporting never using open data portals prior to the workshop. Participants were interested in data sets that were geocoded, longitudinal, or aggregated to small area granularity and capabilities to link multiple data sets. Multiple environmental conditions and barriers hinder their capacity to use health data for research. Although open data platforms cannot address all barriers, they provide multiple opportunities for public health research and practice, and participants were overall positive about the state's efforts to release open data. Open data are not ideal for some researchers because they do not contain individually identifiable data, indicating a need for tiered data release strategies. However, they do provide important new opportunities to facilitate research and foster collaborations among agencies, researchers, and practitioners.

  8. Has open data arrived at the British Medical Journal (BMJ)? An observational study.

    PubMed

    Rowhani-Farid, Anisa; Barnett, Adrian G

    2016-10-13

    The study aimed to quantify data sharing trends and data sharing policy compliance at the British Medical Journal (BMJ) by analysing the rate of data sharing practices, and to investigate attitudes and barriers towards data sharing. It was an observational study of the BMJ research archive, covering 160 randomly sampled BMJ research articles from 2009 to 2015, excluding meta-analyses and systematic reviews. The outcome measures were the percentage of research articles that indicated the availability of their raw data sets in their data sharing statements, and the percentage that readily made their data sets available on request. Three articles contained the data within the article itself. Of the remaining 157 articles, 50 (32%) indicated the availability of their data sets; 12 used publicly available data, and the remaining 38 were sent email requests for access to their data sets. Only 1 publicly available data set could be accessed, and only 6 of the 38 authors contacted shared their data via email. Thus only 7 of 157 research articles shared their data sets: 4.5% (95% CI 1.8% to 9%). For the 21 clinical trials bound by the BMJ data sharing policy, the per cent shared was 24% (8% to 47%). Despite the BMJ's strong data sharing policy, sharing rates are low. Possible explanations for the low rates include the wording of the BMJ data sharing policy, which leaves room for individual interpretation and possible loopholes; email requests ending up in researchers' spam folders; and researchers not being rewarded for sharing their data. It might be time for a more effective data sharing policy and better incentives for health and medical researchers to share their data. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
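    The interval quoted for the 7/157 sharing rate is consistent with an exact (Clopper-Pearson) binomial confidence interval, which can be checked in a few lines of Python; this is a verification sketch assuming SciPy 1.7 or later, not the authors' own analysis code:

        from scipy.stats import binomtest

        # 7 of 157 research articles actually shared their data sets.
        result = binomtest(k=7, n=157)
        ci = result.proportion_ci(confidence_level=0.95, method="exact")

        # Prints roughly 4.5% (1.8% to 9.0%), matching the figures quoted above.
        print(f"{7/157:.1%} (95% CI {ci.low:.1%} to {ci.high:.1%})")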

  9. Enabling the Usability of Earth Science Data Products and Services by Evaluating, Describing, and Improving Data Quality throughout the Data Lifecycle

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Peng, G.; Wei, Y.; Ramapriyan, H.; Moroni, D. F.

    2015-12-01

    Earth science data products and services are being used by representatives of various science and social science disciplines, by planning and decision-making professionals, by educators and learners ranging from primary through graduate and informal education, and by the general public. The diversity of users and uses of Earth science data is gratifying and offers new challenges for enabling the usability of these data by audiences with various purposes and levels of expertise. Users and other stakeholders need capabilities to efficiently find, explore, select, and determine the applicability and suitability of data products and services to meet their objectives and information needs. Similarly, they need to be able to understand the limitations of Earth science data, which can be complex, especially when considering combined or simultaneous use of multiple data products and services. Quality control efforts of stakeholders, throughout the data lifecycle, can contribute to the usability of Earth science data to meet the needs of diverse users. Such stakeholders include study design teams, data producers, data managers and curators, archives, systems professionals, data distributors, end-users, intermediaries, sponsoring organizations, hosting institutions, and others. Opportunities for engaging stakeholders to review, describe, and improve the quality of Earth science data products and services throughout the data lifecycle are identified and discussed. Insight is shared from the development of guidelines for implementing the Group on Earth Observations (GEO) Data Management Principles, the recommendations from the Earth Science Data System Working Group (ESDSWG) on Data Quality, and the efforts of the Information Quality Cluster of the Federation of Earth Science Information Partners (ESIP). Examples and outcomes from quality control efforts of data facilities, such as scientific data centers, that contribute to the usability of Earth science data also are offered.

  10. Advancing User Supports with a Structured How-To Knowledge Base for Earth Science Data

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Acker, James G.; Lynnes, Christopher S.; Beaty, Tammy; Lighty, Luther; Kempler, Steven J.

    2016-01-01

    It is a challenge to access and process fast-growing Earth science data from satellites and numerical models, which may be archived in very different data formats and structures. NASA data centers, managed by the Earth Observing System Data and Information System (EOSDIS), have developed a rich and diverse set of data services and tools with features intended to simplify finding, downloading, and working with these data. Although most data services and tools have user guides, many users still experience difficulties with accessing or reading data due to varying levels of familiarity with data services, tools, and/or formats. A type of structured online document, the data recipe, was created beginning in 2013 by the Goddard Earth Science Data and Information Services Center (GES DISC). A data recipe is a How-To document created from a fixed template, containing step-by-step instructions with screenshots and examples of accessing and working with real data. The recipes have been found to be very helpful, especially to first-time users of particular data services, tools, or data products, and online traffic to some recipe pages is significant. In 2014, the NASA Earth Science Data System Working Group (ESDSWG) for data recipes was established to initiate an EOSDIS-wide campaign for leveraging the distributed knowledge within EOSDIS and its user communities regarding their respective services and tools. The ESDSWG data recipe group started with an inventory and analysis of existing EOSDIS-wide online help documents, and provided recommendations and guidelines for writing and grouping data recipes. This presentation will give an overview of activities for creating How-To documents at GES DISC and ESDSWG. We encourage feedback and contributions from users for improving the data How-To knowledge base.

  11. Credentialing Data Scientists: A Domain Repository Perspective

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Furukawa, H.

    2015-12-01

    A career in data science can have many paths: data curation, data analysis, metadata modeling - all of these in different commercial or scientific applications. Can a certification as 'data scientist' guarantee that an applicant or candidate for a data science position has just the right skills? How valuable is a 'generic' certification as data scientist for an employer looking to fill a data science position? Credentials that are more specific and discipline-oriented may be more valuable to both the employer and the job candidate. One employment sector for data scientists is the data repositories that provide discipline-specific data services for science communities. Data science positions within domain repositories include a wide range of responsibilities in support of the full data life cycle - from data preservation and curation, to development of data models, ontologies, and user interfaces, to development of data analysis and visualization tools, to community education and outreach - and require a substantial degree of discipline-specific knowledge of scientific data acquisition and analysis workflows, data quality measures, and data cultures. Can there be certification programs for domain-specific data scientists that help build the urgently needed workforce for the repositories? The American Geophysical Union has recently started an initiative to develop a program for data science continuing education and data science professional certification for the Earth and space sciences. An Editorial Board has been charged to identify and develop curricula and content for these programs and to provide input and feedback on the implementation of the program. This presentation will report on the progress of this initiative and evaluate its utility for the needs of domain repositories in the Earth and space sciences.

  12. Advancing User Supports with Structured How-To Knowledge Base for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Shen, S.; Acker, J. G.; Lynnes, C.; Lighty, L.; Beaty, T.; Kempler, S.

    2016-12-01

    It is a challenge to access and process fast-growing Earth science data from satellites and numerical models, which may be archived in very different data formats and structures. NASA data centers, managed by the Earth Observing System Data and Information System (EOSDIS), have developed a rich and diverse set of data services and tools with features intended to simplify finding, downloading, and working with these data. Although most data services and tools have user guides, many users still experience difficulties with accessing or reading data due to varying levels of familiarity with data services, tools, and/or formats. A type of structured online document, the "data recipe", was created beginning in 2013 by the Goddard Earth Science Data and Information Services Center (GES DISC). A data recipe is a "How-To" document created from a fixed template, containing step-by-step instructions with screenshots and examples of accessing and working with real data. The recipes have been found to be very helpful, especially to first-time users of particular data services, tools, or data products, and online traffic to some recipe pages is significant. In 2014, the NASA Earth Science Data System Working Group (ESDSWG) for data recipes was established to initiate an EOSDIS-wide campaign for leveraging the distributed knowledge within EOSDIS and its user communities regarding their respective services and tools. The ESDSWG data recipe group started with an inventory and analysis of existing EOSDIS-wide online help documents, and provided recommendations and guidelines for writing and grouping data recipes. This presentation will give an overview of activities for creating How-To documents at GES DISC and ESDSWG. We encourage feedback and contributions from users for improving the data How-To knowledge base.

  13. Data Sharing & Publishing at Nature Publishing Group

    NASA Astrophysics Data System (ADS)

    VanDecar, J. C.; Hrynaszkiewicz, I.; Hufton, A. L.

    2015-12-01

    In recent years, the research community has come to recognize that upon-request data sharing has important limitations [1,2]. The Nature-titled journals feel that researchers have a duty to share data without undue qualifications, in a manner that allows others to replicate and build upon their published findings. Historically, the Nature journals have been strong supporters of data deposition in communities with existing data mandates, and have required data sharing upon request in all other cases. To help address some of the limitations of upon-request data sharing, the Nature titles have strengthened their existing data policies and forged a new partnership with Scientific Data, to promote wider data sharing in discoverable, citeable and reusable forms, and to ensure that scientists get appropriate credit for sharing [3]. Scientific Data is a new peer-reviewed journal for descriptions of research datasets, which works with a wide range of public data repositories [4]. Articles in Scientific Data may either expand on research publications at other journals or may be used to publish new datasets. The Nature Publishing Group has also signed the Joint Declaration of Data Citation Principles [5], and Scientific Data is our first journal to include formal data citations. We are currently in the process of adding data citation support to our various journals.
    [1] Wicherts, J. M., Borsboom, D., Kats, J. & Molenaar, D. The poor availability of psychological research data for reanalysis. Am. Psychol. 61, 726-728, doi:10.1037/0003-066x.61.7.726 (2006).
    [2] Vines, T. H. et al. Mandated data archiving greatly improves access to research data. FASEB J. 27, 1304-1308, doi:10.1096/fj.12-218164 (2013).
    [3] Data-access practices strengthened. Nature 515, 312, doi:10.1038/515312a (2014).
    [4] More bang for your byte. Sci. Data 1, 140010, doi:10.1038/sdata.2014.10 (2014).
    [5] Data Citation Synthesis Group: Joint Declaration of Data Citation Principles (FORCE11, San Diego, CA, 2014).

  14. Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem

    NASA Astrophysics Data System (ADS)

    Doyle, R. J.; Crichton, D.

    2017-12-01

    NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. In doing so, NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and, ultimately, gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrive at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep, and in some cases observational decision loops can be closed onboard, enabling attention to unexpected or transient phenomena (see the triage sketch below). Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential; NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide the basis for a result (relevant data sources and algorithms, uncertainty tracking, etc.) to assure scientific integrity and to enable confident decision making. Advances in data science offer opportunities to gain new insights from space missions and their vast data collections. We are working to innovate new architectures, exploit emerging technologies, develop new data-driven methodologies, and transfer them across disciplines, while working across the dual dimensions of the data lifecycle and the data ecosystem.
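    The triage step described above can be pictured as a budgeted selection problem: score each observation onboard, then keep the best-scoring set that fits the downlink budget. A deliberately simplified sketch in Python (the greedy policy, scores and budget are invented for illustration and are not NASA's algorithm):

        from dataclasses import dataclass

        @dataclass
        class Observation:
            obs_id: str
            size_mb: float
            score: float  # e.g. transient-event likelihood from an onboard model

        def triage(observations, downlink_budget_mb):
            """Greedy triage: keep the best-scoring observations that fit the budget."""
            kept, used = [], 0.0
            for obs in sorted(observations, key=lambda o: o.score, reverse=True):
                if used + obs.size_mb <= downlink_budget_mb:
                    kept.append(obs)
                    used += obs.size_mb
            return kept

        obs = [Observation("a", 40, 0.9), Observation("b", 80, 0.7), Observation("c", 30, 0.4)]
        print([o.obs_id for o in triage(obs, downlink_budget_mb=100)])  # ['a', 'c']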

  15. Cultivating Data Expertise and Roles at a National Research Center

    NASA Astrophysics Data System (ADS)

    Thompson, C. A.

    2015-12-01

    As research becomes more computation- and data-intensive, it brings new demands for staff who can manage complex data, design user services, and facilitate open access. Responding to these new demands, universities and research institutions are developing data services to support their scientists and scholarly communities. As more organizations extend their operations to research data, a better understanding of the staff roles and expertise required to support data-intensive research services is needed. What is data expertise, in terms of knowledge, skills, and roles? This study addresses this question through a case study of an exemplar research center, the National Center for Atmospheric Research (NCAR) in Boulder, CO. The NCAR case study results were supplemented and validated with a set of interviews with managers at additional geoscience data centers; to date, 11 interviews with NCAR staff and 19 interviews with managers at supplementary data centers have been completed. Selected preliminary results from the qualitative analysis will be reported in the poster. Data professionals have cultivated expertise in areas such as managing scientific data and products, understanding use and users, harnessing technology for data solutions, and standardizing metadata and data sets. Staff roles and responsibilities have evolved over the years to create new roles for data scientists, data managers/curators, data engineers, and senior managers of data teams, embedding data expertise into each NCAR lab. Explicit career paths and ladders for data professionals are limited but starting to emerge. NCAR has supported organization-wide efforts for data management, leveraging knowledge and best practices across all the labs and their staff. Based on preliminary results, NCAR provides a model for how organizations can build expertise and roles into their data service models. Data collection for this study is ongoing; the author anticipates that the results will help answer questions about what knowledge and skills are required for data professionals and how organizations can develop data expertise.

  16. Systematically linking tranSMART, Galaxy and EGA for reusing human translational research data

    PubMed Central

    Zhang, Chao; Bijlard, Jochem; Staiger, Christine; Scollen, Serena; van Enckevort, David; Hoogstrate, Youri; Senf, Alexander; Hiltemann, Saskia; Repo, Susanna; Pipping, Wibo; Bierkens, Mariska; Payralbe, Stefan; Stringer, Bas; Heringa, Jaap; Stubbs, Andrew; Bonino Da Silva Santos, Luiz Olavo; Belien, Jeroen; Weistra, Ward; Azevedo, Rita; van Bochove, Kees; Meijer, Gerrit; Boiten, Jan-Willem; Rambla, Jordi; Fijneman, Remond; Spalding, J. Dylan; Abeln, Sanne

    2017-01-01

    The availability of high-throughput molecular profiling techniques has provided more accurate and informative data for regular clinical studies. Nevertheless, complex computational workflows are required to interpret these data. Over the past years, the data volume has been growing explosively, requiring robust human data management to organise and integrate the data efficiently. For this reason, we set up an ELIXIR implementation study, together with the Translational research IT (TraIT) programme, to design a data ecosystem that is able to link raw and interpreted data. In this project, the data from the TraIT Cell Line Use Case (TraIT-CLUC) are used as a test case for this system. Within this ecosystem, we use the European Genome-phenome Archive (EGA) to store raw molecular profiling data; tranSMART to collect interpreted molecular profiling data and clinical data for corresponding samples; and Galaxy to store, run and manage the computational workflows. We can integrate these data by linking their repositories systematically. To showcase our design, we have structured the TraIT-CLUC data, which contain a variety of molecular profiling data types, for storage in both tranSMART and EGA. The metadata provided allows referencing between tranSMART and EGA, fulfilling the cycle of data submission and discovery; we have also designed a data flow from EGA to Galaxy, enabling reanalysis of the raw data in Galaxy. In this way, users can select patient cohorts in tranSMART, trace them back to the raw data and perform (re)analysis in Galaxy. Our conclusion is that the majority of metadata does not necessarily need to be stored (redundantly) in both databases, but that instead FAIR persistent identifiers should be available for well-defined data ontology levels: study, data access committee, physical sample, data sample and raw data file. This approach will pave the way for the stable linkage and reuse of data. PMID:29123641

  17. Systematically linking tranSMART, Galaxy and EGA for reusing human translational research data.

    PubMed

    Zhang, Chao; Bijlard, Jochem; Staiger, Christine; Scollen, Serena; van Enckevort, David; Hoogstrate, Youri; Senf, Alexander; Hiltemann, Saskia; Repo, Susanna; Pipping, Wibo; Bierkens, Mariska; Payralbe, Stefan; Stringer, Bas; Heringa, Jaap; Stubbs, Andrew; Bonino Da Silva Santos, Luiz Olavo; Belien, Jeroen; Weistra, Ward; Azevedo, Rita; van Bochove, Kees; Meijer, Gerrit; Boiten, Jan-Willem; Rambla, Jordi; Fijneman, Remond; Spalding, J Dylan; Abeln, Sanne

    2017-01-01

    The availability of high-throughput molecular profiling techniques has provided more accurate and informative data for regular clinical studies. Nevertheless, complex computational workflows are required to interpret these data. Over the past years, the data volume has been growing explosively, requiring robust human data management to organise and integrate the data efficiently. For this reason, we set up an ELIXIR implementation study, together with the Translational research IT (TraIT) programme, to design a data ecosystem that is able to link raw and interpreted data. In this project, the data from the TraIT Cell Line Use Case (TraIT-CLUC) are used as a test case for this system. Within this ecosystem, we use the European Genome-phenome Archive (EGA) to store raw molecular profiling data; tranSMART to collect interpreted molecular profiling data and clinical data for corresponding samples; and Galaxy to store, run and manage the computational workflows. We can integrate these data by linking their repositories systematically. To showcase our design, we have structured the TraIT-CLUC data, which contain a variety of molecular profiling data types, for storage in both tranSMART and EGA. The metadata provided allows referencing between tranSMART and EGA, fulfilling the cycle of data submission and discovery; we have also designed a data flow from EGA to Galaxy, enabling reanalysis of the raw data in Galaxy. In this way, users can select patient cohorts in tranSMART, trace them back to the raw data and perform (re)analysis in Galaxy. Our conclusion is that the majority of metadata does not necessarily need to be stored (redundantly) in both databases, but that instead FAIR persistent identifiers should be available for well-defined data ontology levels: study, data access committee, physical sample, data sample and raw data file. This approach will pave the way for the stable linkage and reuse of data.
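    The linkage pattern described in the two records above can be pictured as a handful of records keyed by persistent identifiers at each of the named ontology levels. The Python sketch below uses invented identifier values (the EGAS/EGAC/EGAN/EGAF prefixes follow EGA's public accession conventions) and illustrates the idea only; it is not the project's actual data model:

        # Study, data access committee, physical sample, data sample, raw data file:
        # one persistent identifier per level, with links between levels.
        study = {"id": "EGAS00000000001", "title": "TraIT-CLUC cell line study"}
        dac = {"id": "EGAC00000000001", "study": study["id"]}
        physical_sample = {"id": "CLUC-SAMPLE-007", "study": study["id"]}
        data_sample = {
            "id": "EGAN00000000042",                 # EGA-style sample accession
            "physical_sample": physical_sample["id"],
            "transmart_subject": "TRAIT-CLUC:007",   # cohort selection side (tranSMART)
        }
        raw_file = {"id": "EGAF00000000123", "data_sample": data_sample["id"]}

        # Tracing a tranSMART cohort member back to raw files for reanalysis in Galaxy:
        def resolve_raw_files(subject_id, samples, files):
            sample_ids = {s["id"] for s in samples if s["transmart_subject"] == subject_id}
            return [f["id"] for f in files if f["data_sample"] in sample_ids]

        print(resolve_raw_files("TRAIT-CLUC:007", [data_sample], [raw_file]))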

  18. Quarterly environmental data summary for first quarter 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-05-01

    In support of the Weldon Spring Site Remedial Action Project Federal Facilities Agreement, a copy of the Quarterly Environmental Data Summary (QEDS) for the first quarter of 1998 is enclosed. The data presented in this letter and attachment constitute the QEDS. The data were received from the contract laboratories, verified by the Weldon Spring Site verification group and, except for air monitoring data and site KPA-generated data (uranium analyses), merged into the data base during the first quarter of 1998. Air monitoring data presented are the most recent complete sets of quarterly data. Air data are not stored in the data base, and KPA data are not merged into the regular data base. Significant data, defined as data values that have exceeded defined "above normal" Level 2 values, are discussed in this letter for Environmental Monitoring Plan (EMP) generated data only. Above normal Level 2 values are based, in ES&H procedures, on historical high values, DOE Derived Concentration Guides (DCGs), NPDES limits and other guidelines. The procedures also establish actions to be taken in the event that "above normal" data occur. All data received and verified during the first quarter were within a permissible range of variability except for those detailed below. Above normal occurrences are cited for groundwater, air, and NPDES data; there were none for springs or surface water. The following discussion offers a brief summary of the data merged during the first quarter that exceeded the above normal criteria, and updates on past reported above normal data. The attached tables present the most recent data for air and the data merged into the data base during the first quarter 1998 for groundwater, NPDES, surface water, and springs. Graphs showing concentrations of selected contaminants of concern at some of the critical locations have also been included in this QEDS. The graphs are discussed in the separate sections.
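    The "above normal" screening described amounts to a per-analyte threshold comparison over the merged data. The sketch below is schematic, with invented analytes and Level 2 values (the real thresholds come from the ES&H procedures cited above):

        # Invented Level 2 thresholds per analyte; real values derive from historical
        # highs, DOE Derived Concentration Guides, NPDES limits and other guidelines.
        level2 = {"uranium_ug_per_L": 30.0, "nitrate_mg_per_L": 10.0}

        measurements = [
            {"location": "GW-101", "analyte": "uranium_ug_per_L", "value": 42.5},
            {"location": "SP-03", "analyte": "nitrate_mg_per_L", "value": 4.1},
        ]

        # Report only "above normal" occurrences, as the QEDS discussion does.
        for m in measurements:
            if m["value"] > level2[m["analyte"]]:
                print(f"{m['location']}: {m['analyte']} = {m['value']} "
                      f"exceeds Level 2 ({level2[m['analyte']]})")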

  19. GeoMapApp as a platform for visualizing marine data from Polar Regions

    NASA Astrophysics Data System (ADS)

    Nitsche, F. O.; Ryan, W. B.; Carbotte, S. M.; Ferrini, V.; Goodwillie, A. M.; O'hara, S. H.; Weissel, R.; McLain, K.; Chinhong, C.; Arko, R. A.; Chan, S.; Morton, J. J.; Pomeroy, D.

    2012-12-01

    To maximize the investment in expensive fieldwork, the resulting data should be re-used as much as possible, and unnecessary duplication of data collection effort should be avoided. This becomes even more important when access to field areas is as difficult and expensive as it is in Polar Regions. Making existing data discoverable in an easy-to-use platform is key to improving re-use and avoiding duplication. A common obstacle is that use of existing data is often limited to specialists who know of the data's existence and also have the right tools to view and analyze them. GeoMapApp is a free, interactive, map-based tool that allows users to discover, visualize, and analyze a large number of data sets. In addition to a global view, it provides polar map projections for displaying data in Arctic and Antarctic areas. Data recently added to the system include Arctic swath bathymetry data collected from the USCG icebreaker Healy. These data are collected almost continuously, including on cruises where bathymetry is not the main objective and for which the existence of the acquired data may not be well known. In contrast, the existence of seismic data from the Antarctic continental margin is well known in the seismic community; these data are archived at, and can be accessed through, the Antarctic Seismic Data Library System (SDLS). Incorporating them into GeoMapApp makes an even broader community aware of these data, and the custom interface, which includes capabilities to visualize and explore the data, allows users without specific software or knowledge of the underlying data format to access them. In addition to investigating these datasets, GeoMapApp provides links to the actual data sources to give specialists the opportunity to re-use the original data. Identification of data sources and data references is achieved on different levels. For access to the actual Antarctic seismic data, GeoMapApp links to the SDLS site, where users have to register before downloading the data and where they are informed about data owners. For the swath bathymetry data, GeoMapApp links to an IEDA/MGDS web page for each cruise containing detailed information about investigators and surveys.

  20. SeaDataNet - Pan-European infrastructure for marine and ocean data management: Unified access to distributed data sets

    NASA Astrophysics Data System (ADS)

    Schaap, D. M. A.; Maudire, G.

    2009-04-01

    SeaDataNet is an Integrated research Infrastructure Initiative (I3) in EU FP6 (2006-2011) to provide a data management system adapted both to the fragmented observation system and to the users' need for integrated access to data, metadata, products and services. SeaDataNet therefore ensures the long-term archiving of the large number of multidisciplinary data (i.e. temperature, salinity, current, sea level, and chemical, physical and biological properties) collected by the many different sensors installed on board research vessels, satellites and the various platforms of the marine observing system. The SeaDataNet project started in 2006, but builds upon earlier data management infrastructure projects undertaken over a period of 20 years by an expanding network of oceanographic data centres from the countries around all European seas. Its predecessor project, Sea-Search, had a strict focus on metadata. SeaDataNet maintains significant interest in the further development of the metadata infrastructure, but its primary objective is the provision of easy data access and generic data products. SeaDataNet is a distributed infrastructure that provides transnational access to marine data, metadata, products and services through 40 interconnected Trans National Data Access Platforms (TAP) from 35 countries around the Black Sea, Mediterranean, North East Atlantic, North Sea, Baltic and Arctic regions. These include National Oceanographic Data Centres (NODCs) and Satellite Data Centres. Furthermore, the SeaDataNet consortium comprises a number of expert modelling centres, SMEs expert in IT, and 3 international bodies (ICES, IOC and JRC).

    Planning: The SeaDataNet project is delivering and operating the infrastructure in 3 versions:
    - Version 0: maintenance and further development of the metadata systems developed by the Sea-Search project, plus the development of a new metadata system for indexing and accessing the individual data objects managed by the SeaDataNet data centres, known as the Common Data Index (CDI) V0 system.
    - Version 1: harmonisation and upgrading of the metadatabases through adoption of the ISO 19115 metadata standard, and provision of transparent data access and download services from all partner data centres through upgrading of the Common Data Index and deployment of a data object delivery service.
    - Version 2: adding data product services and OGC-compliant viewing services, and further virtualisation of data access.

    SeaDataNet Version 0: The SeaDataNet portal has been set up at http://www.seadatanet.org and provides a platform for all SeaDataNet services and standards as well as background information about the project and its partners. It includes discovery services via the following catalogues:
    - CSR - Cruise Summary Reports of research vessels;
    - EDIOS - locations and details of monitoring stations and networks / programmes;
    - EDMED - high-level inventory of Marine Environmental Data sets collected and managed by research institutes and organisations;
    - EDMERP - Marine Environmental Research Projects;
    - EDMO - Marine Organisations.
    These catalogues are interrelated, where possible, to facilitate cross searching and context searching, and they connect to the Common Data Index (CDI). The CDI gives detailed insight into the datasets available at partners' databases and paves the way to direct online data access or direct online requests for data access / data delivery.
    The CDI V0 metadatabase contains more than 340,000 individual data entries from 36 CDI partners in 29 countries across Europe, covering a broad scope and range of data held by these organisations. For purposes of standardisation and international exchange the ISO 19115 metadata standard has been adopted; the CDI format is defined as a dedicated subset of this standard. A CDI XML format supports the exchange between CDI partners and the central CDI manager, and ensures interoperability with other systems and networks (a schematic example of such an entry follows at the end of this record). CDI XML entries are generated by participating data centres directly from their databases; CDI partners can make use of dedicated SeaDataNet tools to generate CDI XML files automatically.

    Approach for SeaDataNet V1 and V2: The approach for SeaDataNet V1 and V2, which is in line with the INSPIRE Directive, comprises the following services:
    - Discovery services = metadata directories
    - Security services = authentication, authorization and accounting (AAA)
    - Delivery services = data access and downloading of datasets
    - Viewing services = visualisation of metadata, data and data products
    - Product services = generic and standard products
    - Monitoring services = statistics on usage and performance of the system
    - Maintenance services = updating of metadata by SeaDataNet partners
    The services will be operated over a distributed network of interconnected data centres accessed through a central portal. In addition to service access, the portal will provide information on data management standards, tools and protocols. The architecture has been designed to provide a coherent system based on V1 services, whilst leaving the pathway open for later extension with V2 services. For the implementation, a range of technical components have been defined; some are already operational, with the remainder in the final stages of development and testing. These make use of recent web technologies, and also comprise Java components, to provide multi-platform support and syntactic interoperability. To facilitate sharing of resources and interoperability, SeaDataNet has adopted SOAP Web Service technology. The SeaDataNet architecture and components have been designed to handle all kinds of oceanographic and marine environmental data, including both in-situ measurements and remote sensing observations. The V1 technical development is complete, and the V1 system is now being implemented and adopted by all participating data centres in SeaDataNet.

    Interoperability: Interoperability is the key to the success of a distributed data management system, and it is achieved in SeaDataNet V1 by:
    - using common quality control protocols and a common flag scale;
    - using controlled vocabularies from a single source, developed under international content governance;
    - adopting the ISO 19115 metadata standard for all metadata directories;
    - providing XML validation services to quality-control metadata maintenance, including field content verification based on Schematron;
    - providing standard metadata entry tools;
    - using harmonised data transport formats (NetCDF, ODV ASCII and MedAtlas ASCII) for data set delivery;
    - adopting OGC standards for mapping and viewing services;
    - using SOAP Web Services in the SeaDataNet architecture.

    SeaDataNet V1 Delivery Services: An important objective of the V1 system is to provide transparent access to the distributed data sets via a unique user interface at the SeaDataNet portal and its download service. In the SeaDataNet V1 architecture the Common Data Index (CDI) V1 provides the link between discovery and delivery.
    The CDI user interface enables users to gain a detailed insight into the availability and geographical distribution of marine data archived at the connected data centres, and it provides the means for downloading data sets in common formats via a transaction mechanism. The SeaDataNet portal provides registered users access to these distributed data sets via the CDI V1 directory and a shopping-basket mechanism, which allows them to locate data of interest and submit data requests. The requests are forwarded automatically from the portal to the relevant SeaDataNet data centres. This process is controlled via the Request Status Manager (RSM) Web Service at the portal and a Download Manager (DM) Java software module implemented at each of the data centres. The RSM also enables registered users to check the status of their requests regularly and to download data sets after access has been granted. Data centres can follow all transactions for their data sets online and can handle requests that require their consent. The actual delivery of data sets takes place between the user and the selected data centre. The CDI V1 system is now being populated by all participating data centres in SeaDataNet, thereby phasing out CDI V0.

    SeaDataNet Partners: IFREMER (France), MARIS (Netherlands), HCMR/HNODC (Greece), ULg (Belgium), OGS (Italy), NERC/BODC (UK), BSH/DOD (Germany), SMHI (Sweden), IEO (Spain), RIHMI/WDC (Russia), IOC (International), ENEA (Italy), INGV (Italy), METU (Turkey), CLS (France), AWI (Germany), IMR (Norway), NERI (Denmark), ICES (International), EC-DG JRC (International), MI (Ireland), IHPT (Portugal), RIKZ (Netherlands), RBINS/MUMM (Belgium), VLIZ (Belgium), MRI (Iceland), FIMR (Finland), IMGW (Poland), MSI (Estonia), IAE/UL (Latvia), CMR (Lithuania), SIO/RAS (Russia), MHI/DMIST (Ukraine), IO/BAS (Bulgaria), NIMRD (Romania), TSU (Georgia), INRH (Morocco), IOF (Croatia), PUT (Albania), NIB (Slovenia), UoM (Malta), OC/UCY (Cyprus), IOLR (Israel), NCSR/NCMS (Lebanon), CNR-ISAC (Italy), ISMAL (Algeria), INSTM (Tunisia)
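    Since a CDI entry is a dedicated subset of ISO 19115 serialized as XML (see the note earlier in this record), its shape can be gestured at with Python's standard library. The element names and values below are simplified stand-ins and do not reproduce the actual CDI schema:

        import xml.etree.ElementTree as ET

        # Simplified, schema-free stand-in for a CDI metadata entry.
        entry = ET.Element("cdiEntry")
        ET.SubElement(entry, "identifier").text = "EXAMPLE-CDI-0001"      # invented
        ET.SubElement(entry, "dataCentre").text = "Example NODC"
        ET.SubElement(entry, "parameter").text = "sea water temperature"
        ET.SubElement(entry, "boundingBox",
                      west="-5.0", east="10.0", south="48.0", north="60.0")
        ET.SubElement(entry, "format").text = "ODV ASCII"  # a SeaDataNet transport format

        print(ET.tostring(entry, encoding="unicode"))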
