The Role of Standards in Cloud-Computing Interoperability
2012-10-01
services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building...center requirements • Developing usage models for cloud vendors • Independent IT consortium OpenStack http://www.openstack.org • Open-source...software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift)...
Image-Based Single Cell Profiling: High-Throughput Processing of Mother Machine Experiments
Sachs, Christian Carsten; Grünberger, Alexander; Helfrich, Stefan; Probst, Christopher; Wiechert, Wolfgang; Kohlheyer, Dietrich; Nöh, Katharina
2016-01-01
Background Microfluidic lab-on-chip technology combined with live-cell imaging has enabled the observation of single cells in their spatio-temporal context. The mother machine (MM) cultivation system is particularly attractive for the long-term investigation of rod-shaped bacteria since it facilitates continuous cultivation and observation of individual cells over many generations in a highly parallelized manner. To date, the lack of fully automated image analysis software limits the practical applicability of the MM as a phenotypic screening tool. Results We present an image analysis pipeline for the automated processing of MM time lapse image stacks. The pipeline supports all analysis steps, i.e., image registration, orientation correction, channel/cell detection, cell tracking, and result visualization. Tailored algorithms account for the specialized MM layout to enable a robust automated analysis. Image data generated in a two-day growth study (≈ 90 GB) is analyzed in ≈ 30 min, with negligible differences in growth rate between automated and manual evaluation. The proposed methods are implemented in the software molyso (MOther machine AnaLYsis SOftware), which provides a new profiling tool for the unbiased analysis of hitherto inaccessible large-scale MM image stacks. Conclusion Presented is the software molyso, a ready-to-use open source software (BSD-licensed) for the unsupervised analysis of MM time-lapse image stacks. molyso source code and user manual are available at https://github.com/modsim/molyso. PMID:27661996
NASA Technical Reports Server (NTRS)
Buechler, W.; Tucker, A. G.
1981-01-01
Several methods were employed to detect both the occurrence and source of errors in the operational software of the AN/SLQ-32, a large embedded real-time electronic warfare command and control system for the ROLM 1606 computer. The ROLM computer provides information about invalid addressing, improper use of privileged instructions, stack overflows, and unimplemented instructions. Additionally, software techniques were developed to detect invalid jumps, indices out of range, infinite loops, stack underflows, and field size errors. Finally, data are saved to provide information about the status of the system when an error is detected. This information includes I/O buffers, interrupt counts, stack contents, and recently passed locations. The various errors detected, techniques to assist in debugging problems, and segment simulation on a nontarget computer are discussed. These error detection techniques were a major factor in the success in finding the primary cause of error in 98% of over 500 system dumps.
Prins, Pjotr; Goto, Naohisa; Yates, Andrew; Gautier, Laurent; Willis, Scooter; Fields, Christopher; Katayama, Toshiaki
2012-01-01
Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface, but directly in computer memory; examples are R bindings, RPy, and languages sharing the Java Virtual Machine stack. This functionality provides strategies for sharing of software between Bio* projects, which can be exploited more often. Here, we present cross-language examples for sequence translation, and measure throughput of the different options. We compare calling into R through native R, RSOAP, Rserve, and RPy interfaces, with the performance of native BioPerl, Biopython, BioJava, and BioRuby implementations, and with call stack bindings to BioJava and the European Molecular Biology Open Software Suite. In general, call stack approaches outperform native Bio* implementations and these, in turn, outperform RPC-based approaches. To test and compare strategies, we provide a downloadable BioNode image with all examples, tools, and libraries included. The BioNode image can be run on VirtualBox-supported operating systems, including Windows, OSX, and Linux.
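As an illustration of the local call stack approach discussed above, the following minimal sketch calls R in-process from Python through rpy2 (the modern successor to the RPy bindings mentioned in the chapter); the toy reverse-string function is only a stand-in for the chapter's sequence-translation benchmarks.

```python
# Minimal sketch of the "local call stack" approach: R is driven in-process
# from Python via rpy2, so no network round trip (as with RSOAP/Rserve) is needed.
# The reverse-string function below is a toy stand-in for the real examples.
import rpy2.robjects as robjects

reverse = robjects.r('function(s) paste(rev(strsplit(s, "")[[1]]), collapse = "")')
print(reverse("ATGGCC")[0])  # the R function object is called directly from Python -> "CCGGTA"
```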
NASA Astrophysics Data System (ADS)
Ahmadia, A. J.; Kees, C. E.
2014-12-01
Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud, and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and reproducibility for scientists working in the hydrological sciences. The HashDist documentation is available from: http://hashdist.readthedocs.org/en/latest/ HashDist is currently hosted at: https://github.com/hashdist/hashdist
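The "build hash" idea can be sketched in a few lines. The snippet below is only a conceptual illustration of content-addressed, reproducible builds; it does not reproduce HashDist's actual specification format or hashing scheme, and the package names and fields are hypothetical.

```python
# Conceptual illustration of the functional, content-addressed idea behind
# HashDist-style builds: hash the complete build specification (sources,
# parameters, and the hashes of dependencies) so identical specs map to
# identical artifact IDs. NOT HashDist's actual format or algorithm.
import hashlib
import json

def build_hash(spec: dict) -> str:
    """Return a deterministic hash for a build specification."""
    canonical = json.dumps(spec, sort_keys=True)           # canonical serialization
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

hdf5 = {"name": "hdf5", "version": "1.8.13", "configure": ["--enable-parallel"]}
proteus = {"name": "proteus", "deps": {"hdf5": build_hash(hdf5)}}

print(build_hash(hdf5))     # same spec -> same hash -> a cached build can be reused
print(build_hash(proteus))  # changing any dependency hash changes this hash too
```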
Measurements of the LHCb software stack on the ARM architecture
NASA Astrophysics Data System (ADS)
Vijay Kartik, S.; Couturier, Ben; Clemencic, Marco; Neufeld, Niko
2014-06-01
The ARM architecture is a power-efficient design that is used in most processors in mobile devices all around the world today, since these processors provide reasonable compute performance per watt. The current LHCb software stack is designed (and thus expected) to build and run on machines with the x86/x86_64 architecture. This paper outlines the process of measuring the performance of the LHCb software stack on the ARM architecture - specifically, the ARMv7 architecture on Cortex-A9 processors from NVIDIA and on full-fledged ARM servers with chipsets from Calxeda - and makes comparisons with the performance on x86_64 architectures on the Intel Xeon L5520/X5650 and AMD Opteron 6272. The paper emphasises the aspects of performance per core with respect to the power drawn by the compute nodes for the given performance - this ensures a fair real-world comparison with much more 'powerful' Intel/AMD processors. The comparisons of these real workloads in the context of LHCb are also complemented with the standard synthetic benchmarks HEPSPEC and Coremark. The pitfalls and solutions for the non-trivial task of porting the source code to build for the ARMv7 instruction set are presented. The specific changes in the build process needed for ARM-specific portions of the software stack are described, to serve as pointers for further attempts taken up by other groups in this direction. Cases where architecture-specific tweaks at the assembler level (both in ROOT and the LHCb software stack) were needed for a successful compile are detailed - these cases are good indicators of where/how the software stack as well as the build system can be made more portable and multi-arch friendly. The experience gained from the tasks described in this paper is intended to i) assist in making an informed choice about ARM-based server solutions as a feasible low-power alternative to the current compute nodes, and ii) revisit the software design and build system for portability and generic improvements.
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
Zero-Copy Objects System software enables application data to be encapsulated in layers of communication protocol without being copied. Indirect referencing enables application source data, either in memory or in a file, to be encapsulated in place within an unlimited number of protocol headers and/or trailers. Zero-copy objects (ZCOs) are abstract data access representations designed to minimize I/O (input/output) in the encapsulation of application source data within one or more layers of communication protocol structure. They are constructed within the heap space of a Simple Data Recorder (SDR) data store to which all participating layers of the stack must have access. Each ZCO contains general information enabling access to the core source data object (an item of application data), together with (a) a linked list of zero or more specific extents that reference portions of this source data object, and (b) linked lists of protocol header and trailer capsules. The concatenation of the headers (in ascending stack sequence), the source data object extents, and the trailers (in descending stack sequence) constitute the transmitted data object constructed from the ZCO. This scheme enables a source data object to be encapsulated in a succession of protocol layers without ever having to be copied from a buffer at one layer of the protocol stack to an encapsulating buffer at a lower layer of the stack. For large source data objects, the savings in copy time and reduction in memory consumption may be considerable.
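A schematic model of the bookkeeping described above is sketched below; the real ZCO implementation is C code operating inside an SDR heap, so this Python rendering is purely illustrative.

```python
# Illustrative (not the actual flight-software) model of a ZCO: extents
# reference slices of a shared source object, and each protocol layer adds
# header/trailer capsules without ever copying the source data.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Extent:
    offset: int   # offset into the shared source data object
    length: int   # number of bytes referenced; nothing is copied

@dataclass
class ZeroCopyObject:
    source: bytes                                        # stands in for the SDR-resident object
    extents: List[Extent] = field(default_factory=list)
    headers: List[bytes] = field(default_factory=list)   # one capsule per protocol layer
    trailers: List[bytes] = field(default_factory=list)

    def transmitted_image(self) -> bytes:
        """Headers (ascending stack order) + referenced extents + trailers (descending)."""
        body = b"".join(self.source[e.offset:e.offset + e.length] for e in self.extents)
        return b"".join(self.headers) + body + b"".join(reversed(self.trailers))

zco = ZeroCopyObject(source=b"application payload", extents=[Extent(0, 19)])
zco.headers.append(b"[hdr]")
zco.trailers.append(b"[trl]")
print(zco.transmitted_image())   # b'[hdr]application payload[trl]'
```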
The use of Graphic User Interface for development of a user-friendly CRS-Stack software
NASA Astrophysics Data System (ADS)
Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah
2017-04-01
The development of a user-friendly Common Reflection Surface (CRS) Stack software built with a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly: the user must write the commands and parameters manually in a script file. Due to this limitation, the CRS-Stack has not become a popular method, although applying it is actually a promising way to obtain better seismic sections, with better reflector continuity and S/N ratio. After obtaining successful results on several seismic datasets belonging to oil companies in Indonesia, the idea arose to develop a user-friendly software in our own laboratory. A Graphical User Interface (GUI) is a type of user interface that allows people to interact with computer programs in a better way. Rather than typing commands and module parameters, a GUI lets users operate computer programs much more simply and easily, transforming the text-based interface into graphical icons and visual indicators. The use of complicated seismic unix shell scripts can thus be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI; every shell script that represents a seismic process is invoked from the Java environment. Besides providing an interactive GUI to perform CRS-Stack processing, this CRS-Stack GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is composed of input directories, operators, and output directories, which are defined as a seismic data processing workflow. The CRS-Stack processing workflow involves four steps, i.e., automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. These operations are visualized in an informative flowchart with a self-explanatory system to guide the user in entering the parameter values for each operation. The knowledge of the CRS-Stack processing procedure is thus preserved in the software, which is easy and efficient to learn. The software will still be developed in the future, and any new innovative seismic processing workflow can also be added to this GUI software.
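The core mechanism, invoking one shell script per processing step and chaining the four CRS-Stack stages, can be sketched as follows; the actual GUI does this from Java Swing, and the script names and arguments below are hypothetical placeholders.

```python
# Sketch of the workflow idea only: each processing step is an external shell
# script invoked in sequence (the real GUI drives this from Java Swing).
# Script names and arguments are hypothetical placeholders.
import subprocess

steps = [
    "01_automatic_cmp_stack.sh",
    "02_initial_crs_stack.sh",
    "03_optimized_crs_stack.sh",
    "04_crs_stack_supergather.sh",
]

for script in steps:
    print(f"running {script} ...")
    subprocess.run(["sh", script, "--input", "data/", "--output", "results/"],
                   check=True)   # stop the workflow if any step fails
```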
An open-source wireless sensor stack: from Arduino to SDI-12 to Water One Flow
NASA Astrophysics Data System (ADS)
Hicks, S.; Damiano, S. G.; Smith, K. M.; Olexy, J.; Horsburgh, J. S.; Mayorga, E.; Aufdenkampe, A. K.
2013-12-01
Implementing a large-scale streaming environmental sensor network has previously been limited by the high cost of the datalogging and data communication infrastructure. The Christina River Basin Critical Zone Observatory (CRB-CZO) is overcoming the obstacles to large near-real-time data collection networks by using Arduino, an open source electronics platform, in combination with XBee ZigBee wireless radio modules. These extremely low-cost and easy-to-use open source electronics are at the heart of the new DIY movement and have provided solutions to countless projects by over half a million users worldwide. However, their use in environmental sensing is in its infancy. At present a primary limitation to widespread deployment of open-source electronics for environmental sensing is the lack of a simple, open-source software stack to manage streaming data from heterogeneous sensor networks. Here we present a functioning prototype software stack that receives sensor data over a self-meshing ZigBee wireless network from over a hundred sensors, stores the data locally and serves it on demand as a CUAHSI Water One Flow (WOF) web service. We highlight a few new, innovative components, including: (1) a versatile open data logger design based on the Arduino electronics platform and ZigBee radios; (2) a software library implementing the SDI-12 communication protocol between any Arduino platform and SDI12-enabled sensors without the need for additional hardware (https://github.com/StroudCenter/Arduino-SDI-12); and (3) 'midStream', a light-weight set of Python code that receives streaming sensor data, appends it with metadata on the fly by querying a relational database structured on an early version of the Observations Data Model version 2.0 (ODM2), and uses the WOFpy library to serve the data as WaterML via SOAP and REST web services.
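The role of 'midStream', attaching metadata to raw readings as they arrive, might be sketched as below; the field names and in-memory metadata table are hypothetical stand-ins for the ODM2-structured relational database used by the real implementation.

```python
# Conceptual sketch of the metadata-appending step: take a raw reading arriving
# over the ZigBee network and attach site/variable metadata before it is served
# as WaterML. Field names and the metadata source are hypothetical; the real
# implementation queries an ODM2-style relational database.
SENSOR_METADATA = {
    ("node-07", "port-2"): {"site": "White Clay Creek", "variable": "water_temperature",
                            "units": "degC", "method": "SDI-12"},
}

def enrich(raw: dict) -> dict:
    """Merge a raw reading with the metadata registered for its node/port."""
    meta = SENSOR_METADATA[(raw["node"], raw["port"])]
    return {**raw, **meta}

reading = {"node": "node-07", "port": "port-2",
           "value": 14.2, "timestamp": "2013-10-01T12:00:00Z"}
print(enrich(reading))
```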
Investigating interoperability of the LSST data management software stack with Astropy
NASA Astrophysics Data System (ADS)
Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases mean that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed, and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
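The community expectations referred to above are largely about idioms like the following: plain Astropy tables and units. This is not LSST stack code, just the style of interface Astropy users take for granted.

```python
# The kind of Astropy-native interaction prospective users expect
# (illustrative only; plain Astropy, not the LSST stack's own classes).
from astropy.table import Table
from astropy import units as u

catalog = Table({"id": [1, 2, 3], "flux": [12.3, 45.6, 7.8]})
catalog["flux"].unit = u.nJy                  # units travel with the column
bright = catalog[catalog["flux"] > 10.0]      # boolean masking, as in numpy
print(bright)
```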
HPC Software Stack Testing Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garvey, Cormac
The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC Software stack (Compilers, MPI, Numerical libraries and Applications) and to quickly discover hard failures, and as a by-product it will indirectly check the HPC infrastructure (network, PBS and licensing servers).
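A sanity check of this kind can be as simple as compiling and running a trivial program with each compiler in the stack; the sketch below illustrates the idea only and is not hpcswtest itself (the compiler list is a placeholder).

```python
# Illustrative sanity check in the spirit of hpcswtest (not its actual code):
# compile and run a trivial program with each compiler and report hard failures.
import os
import shutil
import subprocess
import tempfile

HELLO = '#include <stdio.h>\nint main(void){ puts("ok"); return 0; }\n'

def sanity_check(compiler: str) -> bool:
    """Compile and run a trivial C program with the given compiler."""
    if shutil.which(compiler) is None:
        return False
    with tempfile.TemporaryDirectory() as tmp:
        src, exe = os.path.join(tmp, "t.c"), os.path.join(tmp, "t")
        with open(src, "w") as f:
            f.write(HELLO)
        if subprocess.run([compiler, src, "-o", exe]).returncode != 0:
            return False
        run = subprocess.run([exe], capture_output=True, text=True)
        return run.returncode == 0 and run.stdout.strip() == "ok"

for cc in ["gcc", "icc", "mpicc"]:     # placeholder compiler list
    print(cc, "PASS" if sanity_check(cc) else "FAIL")
```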
Advancing global marine biogeography research with open-source GIS software and cloud-computing
Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, patrick
2012-01-01
Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System (OBIS) was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.
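A typical spatial query against such a PostgreSQL/PostGIS stack looks like the following; the table and column names are hypothetical and do not reflect the actual OBIS schema.

```python
# The kind of spatial query the rebuilt PostgreSQL/PostGIS stack can answer
# (table and column names are hypothetical, not the actual OBIS schema).
import psycopg2

conn = psycopg2.connect("dbname=obis user=reader")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT species_name, COUNT(*)
        FROM observations
        WHERE ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326))
        GROUP BY species_name
        ORDER BY COUNT(*) DESC
        LIMIT 10;
        """,
        (-75.0, 35.0, -70.0, 40.0),   # lon/lat bounding box
    )
    for name, n in cur.fetchall():
        print(name, n)
```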
Integrating new Storage Technologies into EOS
NASA Astrophysics Data System (ADS)
Peters, Andreas J.; van der Ster, Dan C.; Rocha, Joaquim; Lensing, Paul
2015-12-01
The EOS[1] storage software was designed to cover CERN disk-only storage use cases in the medium term, trading scalability against latency. To cover and prepare for long-term requirements, the CERN IT data and storage services group (DSS) is actively conducting R&D and open source contributions to experiment with a next generation storage software based on CEPH[3] and ethernet enabled disk drives. CEPH provides a scale-out object storage system, RADOS, and additionally various optional high-level services such as an S3 gateway, RADOS block devices and a POSIX-compliant file system, CephFS. The acquisition of CEPH by Redhat underlines the promising role of CEPH as the open source storage platform of the future. CERN IT is running a CEPH service in the context of OpenStack on a moderate scale of 1 PB replicated storage. Building a 100+PB storage system based on CEPH will require software and hardware tuning. It is of capital importance to demonstrate the feasibility and possibly iron out bottlenecks and blocking issues beforehand. The main idea behind this R&D is to leverage and contribute to existing building blocks in the CEPH storage stack and implement a few CERN specific requirements in a thin, customisable storage layer. A second research topic is the integration of ethernet enabled disks. This paper introduces various ongoing open source developments, their status and applicability.
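For reference, writing and reading an object through RADOS from Python takes only a few calls; the pool name and configuration path below are assumptions for the sketch and are unrelated to CERN's actual deployment.

```python
# Minimal RADOS object write/read through Ceph's Python bindings (python-rados),
# illustrating the scale-out object layer this R&D builds on. Pool name and
# config path are assumptions for the sketch.
import rados

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()
try:
    ioctx = cluster.open_ioctx("test-pool")
    ioctx.write_full("hello-object", b"stored via librados")
    print(ioctx.read("hello-object"))
    ioctx.close()
finally:
    cluster.shutdown()
```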
Wearable Notification via Dissemination Service in a Pervasive Computing Environment
2015-09-01
context, state, and environment in a manner that would be transparent to a Soldier's common operations. Subject terms: pervasive computing, Android. ...of user context shifts, i.e., changes in the user's position, history, workflow, or resource interests. If the PCE is described as a 2-component...convenient viewing on the Glass's screen just above the line of sight. All of the software developed uses Google's Android open-source software stack.
An Open Source Model for Open Access Journal Publication
Blesius, Carl R.; Williams, Michael A.; Holzbach, Ana; Huntley, Arthur C.; Chueh, Henry
2005-01-01
We describe an electronic journal publication infrastructure that allows a flexible publication workflow, academic exchange around different forms of user submissions, and the exchange of articles between publishers and archives using a common XML based standard. This web-based application is implemented on a freely available open source software stack. This publication demonstrates the Dermatology Online Journal's use of the platform for non-biased independent open access publication. PMID:16779183
Creating a Rackspace and NASA Nebula compatible cloud using the OpenStack project (Invited)
NASA Astrophysics Data System (ADS)
Clark, R.
2010-12-01
NASA and Rackspace have both provided technology to the OpenStack project that allows anyone to create a private Infrastructure as a Service (IaaS) cloud using open source software and commodity hardware. OpenStack is designed and developed completely in the open and with an open governance process. NASA donated Nova, which powers the compute portion of the NASA Nebula Cloud Computing Platform, and Rackspace donated Swift, which powers Rackspace Cloud Files. The project is now in continuous development by NASA, Rackspace, and hundreds of other participants. When you create a private cloud using OpenStack, you will have the ability to easily interact with your private cloud, a government cloud, and an ecosystem of public cloud providers, using the same API.
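The "same API" point can be illustrated with a request that works unchanged against any OpenStack cloud. Note that the endpoint paths below follow today's Identity v3 and Compute APIs rather than the 2010-era interfaces described here, and the URL and credentials are placeholders.

```python
# Sketch only: the identical request works whether base_url points at a private
# OpenStack cloud, a government cloud, or a public provider. Endpoint paths are
# the modern Identity v3 / Compute ones; URL and credentials are placeholders.
import requests

base_url = "https://cloud.example.org"            # swap for any OpenStack cloud
auth = {"auth": {"identity": {"methods": ["password"],
                              "password": {"user": {"name": "demo",
                                                    "domain": {"name": "Default"},
                                                    "password": "secret"}}}}}
r = requests.post(f"{base_url}:5000/v3/auth/tokens", json=auth)
token = r.headers["X-Subject-Token"]

servers = requests.get(f"{base_url}:8774/v2.1/servers",
                       headers={"X-Auth-Token": token}).json()
print([s["name"] for s in servers["servers"]])
```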
NASA Astrophysics Data System (ADS)
Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Jonestrask, L.; Tauxe, L.; Constable, C.
2016-12-01
The Magnetics Information Consortium (https://earthref.org/MagIC/) develops and maintains a database and web application for supporting the paleo-, geo-, and rock magnetic scientific community. Historically, this objective has been met with an Oracle database and a Perl web application at the San Diego Supercomputer Center (SDSC). The Oracle Enterprise Cluster at SDSC, however, was decommissioned in July of 2016 and the cost for MagIC to continue using Oracle became prohibitive. This provided MagIC with a unique opportunity to reexamine the entire technology stack and data model. MagIC has developed an open-source web application using the Meteor (http://meteor.com) framework and a MongoDB database. The simplicity of the open-source full-stack framework that Meteor provides has improved MagIC's development pace and the increased flexibility of the data schema in MongoDB encouraged the reorganization of the MagIC Data Model. As a result of incorporating actively developed open-source projects into the technology stack, MagIC has benefited from their vibrant software development communities. This has translated into a more modern web application that has significantly improved the user experience for the paleo-, geo-, and rock magnetic scientific community.
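The schema flexibility gained from MongoDB can be illustrated as follows; the collection and field names are hypothetical and are not MagIC's actual data model.

```python
# Illustration of document-store flexibility (hypothetical collection/fields,
# not the MagIC Data Model): nested documents can evolve without migrations.
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["magic"]
db.contributions.insert_one({
    "contributor": "example-user",
    "sites": [{"name": "Site-01", "lat": 32.7, "lon": -117.2,
               "intensity_uT": 45.1}],
})
for doc in db.contributions.find({"sites.lat": {"$gt": 30}}):
    print(doc["contributor"], len(doc["sites"]))
```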
Muir, Dylan R; Kampa, Björn M
2014-01-01
Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.
Teaching Undergraduate Software Engineering Using Open Source Development Tools
2012-01-01
...ware. Some example appliances are: a LAMP stack, Redmine, MySQL database, Moodle, Tomcat on Apache, and Bugzilla. Some of the important features...Ada, C, C++, PHP, Python, etc., and also supports a wide range of SDKs such as Google's Android SDK and the Google Web Toolkit SDK. Additionally...
Archiving Software Systems: Approaches to Preserve Computational Capabilities
NASA Astrophysics Data System (ADS)
King, T. A.
2014-12-01
A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating systems and hardware dependencies. One past approach to preserve computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserve computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and capability to reproduce the processing and analysis used to generate past scientific results.
Web Solutions Inspire Cloud Computing Software
NASA Technical Reports Server (NTRS)
2013-01-01
An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.
NASA Astrophysics Data System (ADS)
Cinquini, L.; Bell, G. M.; Williams, D.; Harney, J.
2012-12-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing state-of-the-art services for the management and access of Earth system data. ESGF is currently used to serve the totality of the model output used for the forthcoming IPCC 5th assessment report on climate change, as well as supporting observational and reanalysis datasets. Also, it has been adopted by several other projects that focus on global, regional and local climate modeling. The ESGF software stack is composed of several modular applications that cover related but disjoint areas of functionality: data publishing, data search and discovery, data access, user management, security, and federation. Overall, the ESGF infrastructure offers a configurable end-to-end solution to the problem of enabling web-based access to large amounts of geospatial data. This talk will present the architectural and configuration options that are available to a data provider leveraging ESGF to serve their data: which services to expose, how to scale to larger data collections, how to establish access control, how to customize the user interface, and others. Additionally, the framework provides extension points that allow each site to plug in custom functionality such as crawling of specific metadata repositories, exposing domain-specific analysis and visualization services, developing custom access clients that interact with the system APIs. These configuration and extension capabilities are based on simple but effective domain-specific object models, that underpin the software applications: the data model, the security model, and the federation model. The ESGF software stack is developed collaboratively by software engineers at many institutions around the world, and is made freely available to the community under an open source license to promote adoption, reuse, inspection and continuous improvement.
The Chandra Source Catalog 2.0: Data Processing Pipelines
NASA Astrophysics Data System (ADS)
Miller, Joseph; Allen, Christopher E.; Budynkiewicz, Jamie A.; Gibbs, Danny G., II; Paxson, Charles; Chen, Judy C.; Anderson, Craig S.; Burke, Douglas; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula
2018-01-01
With the construction of the Second Chandra Source Catalog (CSC2.0) came new requirements and new techniques to create a software system that can process 10,000 observations and identify nearly 320,000 point and compact X-ray sources. A new series of processing pipelines has been developed to allow for deeper, more complete exploration of the Chandra observations. In CSC1.0 there were 4 general pipelines, whereas in CSC2.0 there are 20 data processing pipelines that have been organized into 3 distinct phases of operation - detection, master matching and source property characterization. With CSC2.0, observations within one arcminute of each other are stacked before searching for sources. The detection phase of processing combines the data, adjusts for shifts in fine astrometry, detects sources, and assesses the likelihood that sources are real. During the master source phase, detections across stacks of observations are analyzed for coverage of the same source to produce a master source. Finally, in the source property phase, each source is characterized with aperture photometry, spectrometry, variability and other properties at the observation, stack and master levels over several energy bands. We present how these pipelines were constructed and the challenges we faced in processing data ranging from virtually no counts to millions of counts, how pipelines were tuned to work optimally on a computational cluster, and how we ensure the data produced was correct through various quality assurance steps. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
Segura, Francisca; Bartolucci, Veronica; Andújar, José Manuel
2017-07-09
This work presents a hardware/software data acquisition system developed for monitoring the temperature in real time of the cells in Air-Cooled Polymer Electrolyte Fuel Cells (AC-PEFC). These fuel cells are of great interest because they can carry out, in a single operation, the processes of oxidation and refrigeration. This allows reduction of weight, volume, cost and complexity of the control system in the AC-PEFC. In this type of PEFC (and in general in any PEFC), the reliable monitoring of temperature along the entire surface of the stack is fundamental, since a suitable temperature and a regular distribution thereof, are key for a better performance of the stack and a longer lifetime under the best operating conditions. The developed data acquisition (DAQ) system can perform non-intrusive temperature measurements of each individual cell of an AC-PEFC stack of any power (from watts to kilowatts). The stack power is related to the temperature gradient; i.e., a higher power corresponds to a higher stack surface, and consequently higher temperature difference between the coldest and the hottest point. The developed DAQ system has been implemented with the low-cost open-source platform Arduino, and it is completed with a modular virtual instrument that has been developed using NI LabVIEW. Temperature vs time evolution of all the cells of an AC-PEFC both together and individually can be registered and supervised. The paper explains comprehensively the developed DAQ system together with experimental results that demonstrate the suitability of the system.
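On the host side, reading the per-cell temperatures from the Arduino amounts to reading a line over the serial link; the sketch below uses pyserial instead of the LabVIEW virtual instrument described in the paper, and the port name and comma-separated message format are assumptions.

```python
# Host-side sketch of reading the Arduino DAQ over a serial link (the paper's
# instrument is built in NI LabVIEW; this pyserial version is only illustrative).
# The port name and the comma-separated message format are assumptions.
import serial

with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2) as port:
    line = port.readline().decode().strip()        # e.g. "35.1,35.4,36.0,..."
    temps = [float(v) for v in line.split(",")]
    print(f"{len(temps)} cells, min {min(temps):.1f} degC, max {max(temps):.1f} degC,"
          f" spread {max(temps) - min(temps):.1f} degC")
```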
Linux containers for fun and profit in HPC
Priedhorsky, Reid; Randles, Timothy C.
2017-10-01
This article outlines options for user-defined software stacks from an HPC perspective. Here, we argue that a lightweight approach based on Linux containers is most suitable for HPC centers because it provides the best balance between maximizing service of user needs and minimizing risks. We also discuss how containers work and several implementations, including Charliecloud, our own open-source solution developed at Los Alamos.
An automatic speech recognition system with speaker-independent identification support
NASA Astrophysics Data System (ADS)
Caranica, Alexandru; Burileanu, Corneliu
2015-02-01
The novelty of this work relies on the application of an open source research software toolkit (CMU Sphinx) to train, build and evaluate a speech recognition system, with speaker-independent support, for voice-controlled hardware applications. Moreover, we propose to use the trained acoustic model to successfully decode offline voice commands on embedded hardware, such as an ARMv6 low-cost SoC, the Raspberry Pi. This type of single-board computer, mainly used for educational and research activities, can serve as a proof-of-concept software and hardware stack for low-cost voice automation systems.
An application of LOTEM around salt dome near Houston, Texas
NASA Astrophysics Data System (ADS)
Paembonan, Andri Yadi; Arjwech, Rungroj; Davydycheva, Sofia; Smirnov, Maxim; Strack, Kurt M.
2017-07-01
A salt dome is an important large geologic structure for hydrocarbon exploration. It may seal porous rocks that form petroleum reservoirs. Several techniques such as seismic, gravity, and electromagnetics, including magnetotellurics, have successfully yielded salt dome interpretations. Seismic has difficulties seeing through salt because the seismic energy gets trapped by the salt due to its high velocity; gravity and electromagnetics are better suited methods. Long Offset Transient Electromagnetic (LOTEM) and Focused Source Electromagnetic (FSEM) surveys were tested over a salt dome near Houston, Texas. LOTEM data were recorded at several stations with varying offsets, and the FSEM tests were also made at some receiver locations near a suspected salt overhang. The data were processed using KMS's processing software: first, quality assurance, including calibration and header checking; then transmitter and receiver data are merged and microseismic data are separated; finally, data analysis and processing follow. LOTEM processing leads to inversion or, in the FSEM case, 3D modeling. Various 3D models verify the sensitivity under the salt dome. In addition, the processing was conducted pre-stack, at stack, and post-stack. After pre-stacking, the noise was reduced, but the data showed a ringing effect due to a low-pass filter. Stacking and post-stacking with a recursive average could reduce the Gibbs effect and produce smooth data.
The Geoinformatica free and open source software stack
NASA Astrophysics Data System (ADS)
Jolma, A.
2012-04-01
The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL with its geospatial extension PostGIS can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and they suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research, which is targeted towards custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools can be defined, used, and studied as extensions with specific features. Linking with external libraries is possible using the Perl foreign function interface tools or generic tools such as SWIG. We are interested in implementing and testing the linking of Geoinformatica with existing or new, more specific hydrological FOSS.
The Integration of CloudStack and OCCI/OpenNebula with DIRAC
NASA Astrophysics Data System (ADS)
Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan
2012-12-01
The increasing availability of Cloud resources is arising as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing is an easy way to efficiently access resources from both systems. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing the creation of virtual clusters on demand, including public, private and hybrid clouds. This approach has required the development of an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, allowing the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, such as the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with some tests comparing the proposed approach with the existing Grid solution. License Notice: Published under licence in Journal of Physics: Conference Series by IOP Publishing Ltd.
Technical report on the surface reconstruction of stacked contours by using the commercial software
NASA Astrophysics Data System (ADS)
Shin, Dong Sun; Chung, Min Suk; Hwang, Sung Bae; Park, Jin Seo
2007-03-01
After drawing and stacking contours of a structure identified in serially sectioned images, a three-dimensional (3D) image can be made by surface reconstruction. Usually, dedicated software has to be written for the surface reconstruction, and to write such software, medical doctors have to enlist the help of computer engineers. In this research, therefore, surface reconstruction of stacked contours was attempted using commercial software. The purpose of this research is to enable medical doctors to perform surface reconstruction and make 3D images by themselves. The materials of this research were 996 anatomic images (1 mm intervals) of the left lower limb, which were made by serial sectioning of a cadaver. On Adobe Photoshop, contours of 114 anatomic structures were drawn and exported to Adobe Illustrator files. On Maya, contours of each anatomic structure were stacked. On Rhino, superoinferior lines were drawn along all stacked contours to fill quadrangular surfaces between contours. On Maya, the contours were then deleted. 3D images of 114 anatomic structures were assembled with their original locations preserved. With the surface reconstruction technique developed in this research, medical doctors themselves could make 3D images from serially sectioned images such as CTs and MRIs.
Bridging the Particle Physics and Big Data Worlds
NASA Astrophysics Data System (ADS)
Pivarski, James
2017-09-01
For decades, particle physicists have developed custom software because the scale and complexity of our problems were unique. In recent years, however, the "big data" industry has begun to tackle similar problems, and has developed some novel solutions. Incorporating scientific Python libraries, Spark, TensorFlow, and machine learning tools into the physics software stack can improve abstraction, reliability, and in some cases performance. Perhaps more importantly, it can free physicists to concentrate on domain-specific problems. Building bridges isn't always easy, however. Physics software and open-source software from industry differ in many incidental ways and a few fundamental ways. I will show work from the DIANA-HEP project to streamline data flow from ROOT to Numpy and Spark, to incorporate ideas of functional programming into histogram aggregation, and to develop real-time, query-style manipulations of particle data.
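A flavor of the query-style, functional aggregation mentioned above is given by this small PySpark sketch; the Parquet path and column names are hypothetical, and it is not DIANA-HEP code.

```python
# Small PySpark sketch of functional, query-style aggregation: histogram a
# kinematic quantity without writing an explicit event loop. The Parquet path
# and column names are hypothetical; this is not DIANA-HEP's tooling.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("muon-pt-histogram").getOrCreate()
events = spark.read.parquet("hdfs:///data/muons.parquet")   # e.g. exported from ROOT

hist = (events
        .withColumn("pt_bin", F.floor(F.col("muon_pt") / 5.0) * 5.0)  # 5 GeV bins
        .groupBy("pt_bin")
        .count()
        .orderBy("pt_bin"))
hist.show()
```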
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laros, James H.; Grant, Ryan; Levenhagen, Michael J.
Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
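Purely to illustrate what a portable measurement-and-control interface could look like from the software stack's point of view, here is a hypothetical sketch; it is not the proposed Power API specification (which is defined as a C interface), and every name in it is invented for the example.

```python
# Hypothetical illustration only: a portable measure/control interface that a
# scheduler, runtime, or facility manager could program against. This is NOT
# the actual Power API specification; all names are invented for the sketch.
from abc import ABC, abstractmethod

class PowerInterface(ABC):
    @abstractmethod
    def read_power_watts(self, component: str) -> float: ...
    @abstractmethod
    def set_power_cap_watts(self, component: str, cap: float) -> None: ...

class FakeNode(PowerInterface):
    """Stand-in implementation with stubbed measurements."""
    def __init__(self):
        self._caps = {}
    def read_power_watts(self, component):
        return 215.0
    def set_power_cap_watts(self, component, cap):
        self._caps[component] = cap

node = FakeNode()
node.set_power_cap_watts("cpu0", 95.0)   # e.g. a facility-level policy
print(node.read_power_watts("cpu0"))
```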
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2013-12-01
A wave of open source big data analytic infrastructure is currently shaping government, private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software, e.g., the Apache Hadoop project and its ecosystem of related efforts including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez and Yarn, to name a few; the Berkeley AMPLab stack, which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies; and offerings from commercial companies building products around these tools, e.g., the Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities, including low latency/in-memory support versus record-oriented file I/O, high availability, and support for the Map Reduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products - they are all released under an open source license, e.g., Apache2, MIT, BSD, GPL/LGPL, etc.; all thrive in various ecosystems, such as Apache or Berkeley AMPLab; all are developed collaboratively; and all technologies provide plug-in architecture models and methodologies for allowing others to contribute and participate via various community models. This talk will cover the open source and governance aspects of the aforementioned Big Data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will be by example, using several national deployments and Big Data initiatives stemming from the Administration, including DARPA's XDATA program, NASA's CMAC program, and NSF's EarthCube and geosciences BigData projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment and understanding.
Gopalakrishnan, V; Baskaran, R; Venkatraman, B
2016-08-01
A decision support system (DSS) is implemented in the Radiological Safety Division, Indira Gandhi Centre for Atomic Research, for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real-time gamma dose rate measurements around the stack are used for estimating the radioactive release rate (source term) by inverse calculation. A wireless gamma dose logging network was designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate, and the details are presented in the paper. The network uses XBee-Pro wireless modules and a PSoC controller for wireless interfacing, and the data are logged at the base station. A LabVIEW-based program was developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with the DSS software. The DSS at the base station evaluates the real-time source term to assess the radiation impact.
NASA Astrophysics Data System (ADS)
Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens
2015-04-01
Recent investments in HPC, cloud and Petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly software is available via open source repositories, but these usually only enable code to be discovered and downloaded. As a user, it is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund the development of software, to gain credit for the effort, IP, time and dollars spent, and facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate, but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. The registration process for a code should include information about licensing and the hardware environments it can be run on, define appropriate validation (testing) procedures, and list the critical dependencies. 2) The Review component targets the verification of the software, typically against a set of benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g. Geoscientific Model Development Journal) to help users know which codes to trust. 3) Referencing will be accomplished by linking the Software Framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, benchmark cases described in the review, and other relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing Provenance Workflow engines that automatically capture information relating to a particular run of that software, including identification of all input and output artefacts, and all elements and transactions within that workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could enable scaling out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.
NASA Astrophysics Data System (ADS)
Gerhardt, Lisa; Bhimji, Wahid; Canon, Shane; Fasel, Markus; Jacobsen, Doug; Mustafa, Mustafa; Porter, Jeff; Tsulaia, Vakho
2017-10-01
Bringing HEP computing to HPC can be difficult. Software stacks are often very complicated, with numerous dependencies that are difficult to get installed on an HPC system. To address this issue, NERSC has created Shifter, a framework that delivers Docker-like functionality to HPC. It works by extracting images from native formats and converting them to a common format that is optimally tuned for the HPC environment. We have used Shifter to deliver the CVMFS software stack for ALICE, ATLAS, and STAR on the supercomputers at NERSC. As well as enabling the distribution of multi-TB sized CVMFS stacks to HPC, this approach also offers performance advantages. Software startup times are significantly reduced and load times scale with minimal variation to 1000s of nodes. We profile several successful examples of scientists using Shifter to make scientific analysis easily customizable and scalable. We will describe the Shifter framework and several efforts in HEP and NP to use Shifter to deliver their software on the Cori HPC system.
A new software for dimensional measurements in 3D endodontic root canal instrumentation.
Sinibaldi, Raffaele; Pecci, Raffaella; Somma, Francesco; Della Penna, Stefania; Bedini, Rossella
2012-01-01
The main issue to be faced in obtaining size estimates of 3D modification of the dental canal after endodontic treatment is the co-registration of the image stacks obtained through micro computed tomography (micro-CT) scans before and after treatment. Here, quantitative analysis of micro-CT images has been performed by means of new dedicated software targeted at the analysis of the root canal after endodontic instrumentation. This software analytically calculates the best superposition between the pre- and post-treatment structures using the inertia tensor of the tooth. This strategy avoids minimization procedures, which can be user dependent and time consuming. Once the co-registration has been achieved, dimensional measurements are performed by contemporary evaluation of quantitative parameters over the two superimposed stacks of micro-CT images. The software automatically calculates the changes in volume, surface and symmetry axes in 3D occurring after the instrumentation. The calculation is based on direct comparison of the canal and canal branches selected by the user on the pre-treatment image stack.
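A minimal sketch of the inertia-tensor idea in Python/NumPy, assuming binary pre- and post-treatment segmentations treated as unit point masses (this is not the dedicated software itself): compute each stack's centroid and principal axes and derive the rigid transform that superimposes them. In practice the sign and ordering ambiguity of the eigenvectors must also be resolved.

import numpy as np

def principal_axes(mask, spacing=(1.0, 1.0, 1.0)):
    """Centroid and principal axes of a binary voxel stack (z, y, x)."""
    coords = np.argwhere(mask) * np.asarray(spacing)   # voxel centres in physical units
    centroid = coords.mean(axis=0)
    r = coords - centroid
    # inertia tensor of unit point masses: I = sum(|r|^2 * Id - r r^T)
    inertia = (r ** 2).sum() * np.eye(3) - r.T @ r
    _, vecs = np.linalg.eigh(inertia)                  # columns are the principal axes
    return centroid, vecs

def align_post_to_pre(mask_pre, mask_post, spacing=(1.0, 1.0, 1.0)):
    """Rigid transform (R, t) mapping post-treatment coordinates onto pre-treatment ones."""
    c_pre, ax_pre = principal_axes(mask_pre, spacing)
    c_post, ax_post = principal_axes(mask_post, spacing)
    R = ax_pre @ ax_post.T       # rotate the post axes onto the pre axes
    t = c_pre - R @ c_post
    return R, t                  # x_pre ≈ R @ x_post + t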
Pre-stack depth Migration imaging of the Hellenic Subduction Zone
NASA Astrophysics Data System (ADS)
Hussni, S. G.; Becel, A.; Schenini, L.; Laigle, M.; Dessa, J. X.; Galve, A.; Vitard, C.
2017-12-01
In 365 AD, a major M>8 tsunamigenic earthquake occurred along the southwestern segment of the Hellenic subduction zone. Although this is the largest seismic event ever reported in Europe, some fundamental questions remain regarding the deep geometry of the interplate megathrust, as well as other faults within the overriding plate potentially connected to it. The main objective here is to image those deep structures, whose depths range between 15 and 45 km, using leading edge seismic reflection equipment. To this end, a 210-km-long multichannel seismic profile was acquired with the 8-km-long streamer and the 6600 cu. in. source of R/V Marcus Langseth. This was realized at the end of 2015, during the SISMED cruise. This survey was made possible through a collective effort gathering several labs (Géoazur, LDEO, ISTEP, ENS-Paris, EOST, LDO, Dpt. Geosciences of Pau Univ.). A preliminary processing sequence was first applied using the Geovation software of CGG, which yielded a post-stack time migration of the collected data, as well as a pre-stack time migration obtained with a model derived from velocity analyses. Using Paradigm software, a pre-stack depth migration was subsequently carried out. This step required some tuning of the pre-processing sequence in order to improve multiple removal and noise suppression and to better reveal the true geometry of reflectors in depth. This iteration of pre-processing included the use of the parabolic Radon transform, FK filtering and time-variant band-pass filtering. An initial velocity model was built using depth-converted RMS velocities obtained from SISMED data for the sedimentary layer, complemented at depth with a smooth version of the tomographic velocities derived from coincident wide-angle data acquired during the 2012 ULYSSE survey. Then, we performed a Kirchhoff pre-stack depth migration with traveltimes calculated using the eikonal equation. The velocity model was then tuned through residual velocity analyses to flatten reflections in common reflection point gathers. These new results improve the imaging of deep reflectors and even reveal some new features. We will present this work, a comparison with our previously obtained post-stack time migration, as well as some insights into the new geological structures revealed by the depth imaging.
AtomicJ: An open source software for analysis of force curves
NASA Astrophysics Data System (ADS)
Hermanowicz, Paweł; Sarna, Michał; Burda, Kvetoslava; Gabryś, Halina
2014-06-01
We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how the sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh.
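As one hedged example of the kind of contact-mechanics fit such software performs (this is not AtomicJ's implementation), the classical Hertz model for a paraboloidal tip, F = (4/3)·E/(1-ν²)·√R·δ^(3/2), can be fitted to the approach segment of a force curve to estimate Young's modulus:

import numpy as np
from scipy.optimize import curve_fit

def hertz_force(delta, E, R=1e-6, nu=0.5):
    """Hertz model for a paraboloidal tip: F = 4/3 * E/(1-nu^2) * sqrt(R) * delta^1.5."""
    return (4.0 / 3.0) * E / (1.0 - nu ** 2) * np.sqrt(R) * np.clip(delta, 0, None) ** 1.5

def fit_youngs_modulus(indentation, force, tip_radius=1e-6, nu=0.5):
    """Least-squares estimate of E [Pa] from an approach curve (indentation in m, force in N)."""
    model = lambda d, E: hertz_force(d, E, R=tip_radius, nu=nu)
    popt, _ = curve_fit(model, indentation, force, p0=[1e4])
    return popt[0]

# synthetic check: a 5 kPa sample probed with a 1 µm tip, Poisson ratio 0.5
delta = np.linspace(0, 500e-9, 200)
noisy = hertz_force(delta, 5e3) + np.random.normal(0, 2e-11, delta.size)
print(fit_youngs_modulus(delta, noisy))   # approximately 5000 Pa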
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laros III, James H.; DeBonis, David; Grant, Ryan
Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
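Purely as an illustration of what a portable measurement interface might expose (the names below are hypothetical and do not reflect the actual Power API specification), a component could provide a power reading that higher layers of the software stack sample while a workload runs:

import threading
import time
from abc import ABC, abstractmethod

class PowerDomain(ABC):
    """A measurable component (node, CPU socket, memory, accelerator); hypothetical interface."""
    @abstractmethod
    def read_power_watts(self) -> float: ...

class FilePowerDomain(PowerDomain):
    """Reads instantaneous power from a vendor-specific sensor file (path and units assumed)."""
    def __init__(self, path, scale=1e-6):
        self.path, self.scale = path, scale
    def read_power_watts(self):
        with open(self.path) as f:
            return float(f.read()) * self.scale

def measure_energy(domain, fn, interval=0.1):
    """Sample power while fn() runs and return (result, estimated energy in joules)."""
    samples, stop = [], threading.Event()
    def sampler():
        while not stop.is_set():
            samples.append(domain.read_power_watts())
            time.sleep(interval)
    t = threading.Thread(target=sampler)
    t.start()
    start = time.time()
    result = fn()
    stop.set()
    t.join()
    avg_power = sum(samples) / len(samples) if samples else 0.0
    return result, avg_power * (time.time() - start)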
Bringing your tools to CyVerse Discovery Environment using Docker
Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric
2016-01-01
Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse’s Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse’s production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in the DE, but also helps users share their apps with collaborators and release them for public use. PMID:27803802
Transient analysis of a solid oxide fuel cell stack with crossflow configuration
NASA Astrophysics Data System (ADS)
Yuan, P.; Liu, S. F.
2018-05-01
This study investigates the transient response of the cell temperature and current density of a solid oxide fuel cell composed of 6 stacks with crossflow configuration. A commercial software package repeatedly solves the governing equations of each stack to obtain convergent results for the whole SOFC stack. The preliminary results indicate that the average current density of each stack is similar to the others, so the power output of the different stacks is uniform. However, the average cell temperature differs among stacks, and the central stacks have higher temperatures because their heat dissipation is more difficult. For operating control, the cell temperature difference among stacks deserves attention because it exceeds 10 °C in the analyzed case. Increasing the inlet flow rates of the fuel and air shortens the transient state, increases the average current density, and reduces the cell temperature difference among the stacks. Therefore, the inlet flow rate is an important factor for the transient performance of an SOFC stack.
The Chandra Source Catalog 2.0: Interfaces
NASA Astrophysics Data System (ADS)
D'Abrusco, Raffaele; Zografou, Panagoula; Tibbetts, Michael; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Rafael; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Van Stone, David W.
2018-01-01
Easy-to-use, powerful public interfaces to access the wealth of information contained in any modern, complex astronomical catalog are fundamental to encourage its usage. In this poster, I present the public interfaces of the second Chandra Source Catalog (CSC2). CSC2 is the most comprehensive catalog of X-ray sources detected by Chandra, thanks to the inclusion of Chandra observations public through the end of 2014 and to methodological advancements. CSC2 provides measured properties for a large number of sources that sample the X-ray sky at fainter levels than the previous versions of the CSC, thanks to the stacking of single overlapping observations within 1’ before source detection. Sources from stacks are then crossmatched, if multiple stacks cover the same area of the sky, to create a list of unique, optimal CSC2 sources. The properties of sources detected in each single stack and each single observation are also measured. The layered structure of the CSC2 catalog is mirrored in the organization of the CSC2 database, consisting of three tables containing all properties for the unique stacked sources (“Master Source”), single stack sources (“Stack Source”) and sources in any single observation (“Observation Source”). These tables contain estimates of the position, flags, extent, significances, fluxes, spectral properties and variability (and associated errors) for all classes of sources. The CSC2 also includes source region and full-field data products for all master sources, stack sources and observation sources: images, photon event lists, light curves and spectra. CSCview, the main interface to the CSC2 source properties and data products, is a GUI tool that allows users to build queries based on the values of all properties contained in CSC2 tables, query the catalog, inspect the returned table of source properties, and browse and download the associated data products. I will also introduce the suite of command-line interfaces to CSC2 that can be used as an alternative to CSCview, and will present the concept for an additional planned cone-search web-based interface. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
Jiřík, Miroslav; Bartoš, Martin; Tomášek, Petr; Malečková, Anna; Kural, Tomáš; Horáková, Jana; Lukáš, David; Suchý, Tomáš; Kochová, Petra; Hubálek Kalbáčová, Marie; Králíčková, Milena; Tonar, Zbyněk
2018-06-01
Quantification of the structure and composition of biomaterials using micro-CT requires image segmentation due to the low contrast and overlapping radioopacity of biological materials. The amount of bias introduced by segmentation procedures is generally unknown. We aim to develop software that generates three-dimensional models of fibrous and porous structures with known volumes, surfaces, lengths, and object counts in fibrous materials and to provide a software tool that calibrates quantitative micro-CT assessments. Virtual image stacks were generated using the newly developed software TeIGen, enabling the simulation of micro-CT scans of unconnected tubes, connected tubes, and porosities. A realistic noise generator was incorporated. Forty image stacks were evaluated using micro-CT, and the error between the true known and estimated data was quantified. Starting with geometric primitives, the error of the numerical estimation of surfaces and volumes was eliminated, thereby enabling the quantification of volumes and surfaces of colliding objects. Analysis of the sensitivity of the thresholding to the parameters of the generated test image sets revealed the effects of decreasing resolution and increasing noise on the accuracy of the micro-CT quantification. The size of the error increased with decreasing resolution when the voxel size exceeded 1/10 of the typical object size, which simulated the effect of the smallest details that could still be reliably quantified. Open-source software for calibrating quantitative micro-CT assessments by producing and saving virtually generated image data sets with known morphometric data was made freely available to researchers involved in morphometry of three-dimensional fibrillar and porous structures in micro-CT scans. © 2018 Wiley Periodicals, Inc.
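A minimal sketch of the underlying idea, assuming a single axis-aligned cylinder and simple Gaussian noise (TeIGen itself supports unconnected tubes, connected tubes, porosities and a realistic noise generator): generate a phantom whose analytic volume is known, so a segmentation or quantification pipeline can be checked against ground truth.

import numpy as np

def cylinder_phantom(shape=(128, 128, 128), radius=8.0, value=1.0,
                     noise_sigma=0.1, voxel_size=1.0, seed=0):
    """Voxel stack (z, y, x) containing one axis-aligned cylinder of known radius."""
    z, y, x = np.indices(shape)
    cy, cx = shape[1] / 2.0, shape[2] / 2.0
    tube = (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2       # cylinder along the z axis
    volume = tube.astype(float) * value
    rng = np.random.default_rng(seed)
    volume += rng.normal(0.0, noise_sigma, shape)             # simplistic noise model
    analytic_volume = np.pi * radius ** 2 * shape[0] * voxel_size ** 3
    return volume, analytic_volume

vol, true_volume = cylinder_phantom()
estimated_volume = (vol > 0.5).sum()     # naive half-maximum threshold, in voxel units
print(true_volume, estimated_volume)     # compare estimate against ground truth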
NASA Astrophysics Data System (ADS)
Fu, L.; West, P.; Zednik, S.; Fox, P. A.
2013-12-01
For simple portals such as vocabulary based services, which contain small amounts of data and require only hyper-textual representation, it is often overkill to adopt the whole software stack of database, middleware and front end, or to use a general Web development framework as the starting point of development. Directly combining open source software is a much more favorable approach. However, our experience with the Coastal and Marine Spatial Planning Vocabulary (CMSPV) service portal shows that there are still issues, such as system configuration and accommodating a new team member, that need to be handled carefully. In this contribution, we share our experience in the context of the CMSPV portal and focus on the tools and mechanisms we've developed to ease the configuration job and the incorporation process of new project members. We discuss the configuration issues that arise when we don't have complete control over how the software in use is configured and need to follow existing configuration styles that may not be well documented, especially when multiple pieces of such software need to work together as a combined system. As for the CMSPV portal, it is built on two pieces of open source software that are still under rapid development: a Fuseki data server and an Epimorphics Linked Data API (ELDA) front end. Both lack mature documentation and tutorials. We developed comparison and labeling tools to ease the problem of system configuration. Another problem that slowed down the project is that project members came and went during the development process, so new members needed to start with a partially configured system and incomplete documentation left by old members. We developed documentation/tutorial maintenance mechanisms based on our comparison and labeling tools to make it easier for new members to be incorporated into the project. These tools and mechanisms also benefited other projects that reused software components from the CMSPV system.
Design and validation of Segment--freely available software for cardiovascular image analysis.
Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan
2010-01-11
Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.
Sitaraman, Shivakumar; Ham, Young S.; Gharibyan, Narek; ...
2017-03-27
Here, fuel assemblies in the spent fuel pool are stored by suspending them in two vertically stacked layers at the Atucha Unit 1 nuclear power plant (Atucha-I). This introduces the unique problem of verifying the presence of fuel in either layer without physically moving the fuel assemblies. Given that the facility uses both natural uranium and slightly enriched uranium at 0.85 wt% 235U and has been in operation since 1974, a wide range of burnups and cooling times can exist in any given pool. A gross defect detection tool, the spent fuel neutron counter (SFNC), has been used at the site to verify the presence of fuel up to burnups of 8000 MWd/t. At higher discharge burnups, the existing signal processing software of the tool was found to fail due to nonlinearity of the source term with burnup.
HyperCard--A Science Teaching Tool.
ERIC Educational Resources Information Center
Parker, Carol
1992-01-01
Discussion of new technological resources available for science instruction focuses on the use of the HyperCard software for the Macintosh to design customized materials. Topics addressed include general features of HyperCard, designing HyperCard stacks, graphics, and designing buttons (i.e., links for moving through the stacks). Several sample…
The Soil Stack: An Interactive Computer Program Describing Basic Soil Science and Soil Degradation.
ERIC Educational Resources Information Center
Cattle, S. R.; And Others
1995-01-01
A computer program dealing with numerous aspects of soil degradation has a target audience of high school and university students (16-20 year olds), and is presented in a series of cards grouped together as stacks. Describes use of the software in Australia. (LZ)
Ablation of film stacks in solar cell fabrication processes
Harley, Gabriel; Kim, Taeseok; Cousins, Peter John
2013-04-02
A dielectric film stack of a solar cell is ablated using a laser. The dielectric film stack includes a layer that is absorptive in a wavelength of operation of the laser source. The laser source, which fires laser pulses at a pulse repetition rate, is configured to ablate the film stack to expose an underlying layer of material. The laser source may be configured to fire a burst of two laser pulses or a single temporally asymmetric laser pulse within a single pulse repetition to achieve complete ablation in a single step.
Charliecloud: Unprivileged containers for user-defined software stacks in HPC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Priedhorsky, Reid; Randles, Timothy C.
Supercomputing centers are seeing increasing demand for user-defined software stacks (UDSS), instead of or in addition to the stack provided by the center. These UDSS support user needs such as complex dependencies or build requirements, externally required configurations, portability, and consistency. The challenge for centers is to provide these services in a usable manner while minimizing the risks: security, support burden, missing functionality, and performance. We present Charliecloud, which uses the Linux user and mount namespaces to run industry-standard Docker containers with no privileged operations or daemons on center resources. Our simple approach avoids most security risks while maintaining access to the performance and functionality already on offer, doing so in less than 500 lines of code. Charliecloud promises to bring an industry-standard UDSS user workflow to existing, minimally altered HPC resources.
Ffuzz: Towards full system high coverage fuzz testing on binary executables.
Zhang, Bin; Ye, Jiaxi; Bi, Xing; Feng, Chao; Tang, Chaojing
2018-01-01
Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, like fuzz testing, symbolic execution and manual analysis, all have advantages and disadvantages when exercising the deeper code areas in binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug finding tool, Ffuzz, on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both the user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck both in fuzz testing and symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently.
Scalable cloud without dedicated storage
NASA Astrophysics Data System (ADS)
Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.
2015-05-01
We present a prototype of a scalable computing cloud. It is intended to be deployed on the basis of a cluster without the separate dedicated storage. The dedicated storage is replaced by the distributed software storage. In addition, all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources as well as improves fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with a relatively low initial and maintenance cost. The solution is built on the basis of the open source components like OpenStack, CEPH, etc.
40 CFR 51.164 - Stack height procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 51.164 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... source's stack height that exceeds good engineering practice or by any other dispersion technique, except... source based on a good engineering practice stack height that exceeds the height allowed by § 51.100(ii...
Opportunities for leveraging OS virtualization in high-end supercomputing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bridges, Patrick G.; Pedretti, Kevin Thomas Tauke
2010-11-01
This paper examines potential motivations for incorporating virtualization support in the system software stacks of high-end capability supercomputers. We advocate that this will increase the flexibility of these platforms significantly and enable new capabilities that are not possible with current fixed software stacks. Our results indicate that compute, virtual memory, and I/O virtualization overheads are low and can be further mitigated by utilizing well-known techniques such as large paging and VMM bypass. Furthermore, since the addition of virtualization support does not affect the performance of applications using the traditional native environment, there is essentially no disadvantage to its addition.
2012-02-01
Software as a Service (SaaS)—SaaS solutions involve conforming an organization’s...consumer of the utilized service. Service Models: Software as a Service (SaaS). The capability provided to the consumer is to use the provider’s...operating system, platforms, and software installed. In contrast, Software as a Service (SaaS) abstracts the entire stack except for a few
1983-06-01
for DEC PDP-11 systems. MAINSAIL was developed and is marketed with a set of integrated tools for program development. The syntax of the language is...stack, and to test for stack-full and stack-empty conditions. This technique is useful in enforcing data integrity and in controlling concurrent...and market MAINSAIL. The language is distinguished by its portability. The same compiler and runtime system, both written in MAINSAIL, are the basis
Stacking-sequence optimization for buckling of laminated plates by integer programming
NASA Technical Reports Server (NTRS)
Haftka, Raphael T.; Walsh, Joanne L.
1991-01-01
Integer-programming formulations for the design of symmetric and balanced laminated plates under biaxial compression are presented. Both maximization of buckling load for a given total thickness and the minimization of total thickness subject to a buckling constraint are formulated. The design variables that define the stacking sequence of the laminate are zero-one integers. It is shown that the formulation results in a linear optimization problem that can be solved on readily available software. This is in contrast to the continuous case, where the design variables are the thicknesses of layers with specified ply orientations, and the optimization problem is nonlinear. Constraints on the stacking sequence such as a limit on the number of contiguous plies of the same orientation and limits on in-plane stiffnesses are easily accommodated. Examples are presented for graphite-epoxy plates under uniaxial and biaxial compression using a commercial software package based on the branch-and-bound algorithm.
National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents
NASA Astrophysics Data System (ADS)
Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.
2014-12-01
The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from 60+ data providers in all 50 states. Free and open source software is used in this federated system where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and applications discovery and availability of the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric and library style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and USGIN site), and user-created tools and scripts. The user-friendly map-centric web-based application has been created to support finding, visualizing, mapping, and acquisition of data based on topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third party software applications such as GoogleEarth, UDig, and ArcGIS. A redistributable, free and open source software package called GINstack (USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions among others.
NASA Astrophysics Data System (ADS)
Prokhorov, V. B.; Fomenko, M. V.; Grigor'ev, I. V.
2012-06-01
Results are presented from a computer simulation of the gas flow in gas conduits entering the gas-removal shaft of a smoke stack of constant cross section from one side and from two sides, carried out using the SolidWorks and FlowVision application software packages.
Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large number of user groups through the advantages of open source and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this research, we describe the specific application and deployment of the OpenStack tool in a university computer room. The experimental results show that the OpenStack tool can be used to deploy a university computer room cloud efficiently and conveniently, with stable performance and good functional value.
The NIH BD2K center for big data in translational genomics
Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; James Kent, W; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van’t Veer, Laura; Haussler, David
2015-01-01
The world’s genomics data will never be stored in a single repository – rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype to phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world’s genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM’s performance and utility. PMID:26174866
Numerical evaluation of an innovative cup layout for open volumetric solar air receivers
NASA Astrophysics Data System (ADS)
Cagnoli, Mattia; Savoldi, Laura; Zanino, Roberto; Zaversky, Fritz
2016-05-01
This paper proposes an innovative volumetric solar absorber design to be used in high-temperature air receivers of solar power tower plants. The innovative absorber, a so-called CPC-stacked-plate configuration, applies the well-known principle of a compound parabolic concentrator (CPC) for the first time in a volumetric solar receiver, heating air to high temperatures. The proposed absorber configuration is analyzed numerically, applying first the open-source ray-tracing software Tonatiuh in order to obtain the solar flux distribution on the absorber's surfaces. Next, a Computational Fluid Dynamic (CFD) analysis of a representative single channel of the innovative receiver is performed, using the commercial CFD software ANSYS Fluent. The solution of the conjugate heat transfer problem shows that the behavior of the new absorber concept is promising; however, further optimization of the geometry will be necessary in order to exceed the performance of the classical absorber designs.
Acquisition of multiple image stacks with a confocal laser scanning microscope
NASA Astrophysics Data System (ADS)
Zuschratter, Werner; Steffen, Thomas; Braun, Katharina; Herzog, Andreas; Michaelis, Bernd; Scheich, Henning
1998-06-01
Image acquisition at high magnification is inevitably correlated with a limited view over the entire tissue section. To overcome this limitation we designed software for multiple image-stack acquisition (3D-MISA) in confocal laser scanning microscopy (CLSM). The system consists of a 4-channel Leica CLSM equipped with a high resolution z-scanning stage mounted on a motorized xy-stage. The 3D-MISA software is implemented into the microscope scanning software and uses the microscope settings for the movements of the xy-stage. It allows storage and recall of 70 xyz-positions and the automatic 3D-scanning of image arrays between selected xyz-coordinates. The number of images within one array is limited only by the amount of disk space or memory available. Although for most applications the accuracy of the xy-scanning stage is sufficient for a precise alignment of tiled views, the software provides the possibility of an adjustable overlap between two image stacks by shifting the moving steps of the xy-scanning stage. After scanning, a tiled image gallery of the extended-focus images of each channel will be displayed on a graphic monitor. In addition, a tiled image gallery of individual focal planes can be created. In summary, the 3D-MISA allows 3D-image acquisition of coherent regions in combination with high resolution of single images.
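A small hedged sketch of the tiling logic such a system needs (this is not the Leica or 3D-MISA implementation): compute xy stage positions covering a rectangular region, given the field of view and an adjustable fractional overlap between neighbouring stacks.

def tile_positions(x0, y0, x1, y1, fov=512.0, overlap=0.1):
    """Stage positions covering the rectangle (x0, y0)-(x1, y1) with tiles of size fov
    and the given fractional overlap between neighbouring tiles (all in stage units)."""
    step = fov * (1.0 - overlap)

    def axis(start, stop):
        positions, p = [], start
        while True:
            positions.append(p)
            if p + fov >= stop:
                break
            p += step
        return positions

    xs, ys = axis(x0, x1), axis(y0, y1)
    tiles = []
    for row, y in enumerate(ys):
        cols = xs if row % 2 == 0 else list(reversed(xs))   # serpentine path reduces stage travel
        tiles.extend((x, y) for x in cols)
    return tiles

print(tile_positions(0, 0, 2000, 1000))   # a 2000 x 1000 region, 512-unit fields, 10% overlap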
XPRESS: eXascale PRogramming Environment and System Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brightwell, Ron; Sterling, Thomas; Koniges, Alice
The XPRESS Project is one of four major projects of the DOE Office of Science Advanced Scientific Computing Research X-stack Program initiated in September 2012. The purpose of XPRESS is to devise an innovative system software stack to enable practical and useful exascale computing around the end of the decade, with near-term contributions to efficient and scalable operation of trans-Petaflops performance systems in the next two to three years, in both cases for DOE mission-critical applications. To this end, XPRESS directly addresses critical challenges in computing of efficiency, scalability, and programmability through introspective methods of dynamic adaptive resource management and task scheduling.
Software Quality Control at Belle II
NASA Astrophysics Data System (ADS)
Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.;
2017-10-01
Over the last seven years the software stack of the next generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of such high quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
An Inverse Modeling Plugin for HydroDesktop using the Method of Anchored Distributions (MAD)
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio, C.; Over, M. W.; Rubin, Y.
2011-12-01
The CUAHSI Hydrologic Information System (HIS) software stack is based on an open and extensible architecture that facilitates the addition of new functions and capabilities at both the server side (using HydroServer) and the client side (using HydroDesktop). The HydroDesktop client plugin architecture is used here to expose a new scripting based plugin that makes use of the R statistics software as a means for conducting inverse modeling using the Method of Anchored Distributions (MAD). MAD is a Bayesian inversion technique for conditioning computational model parameters on relevant field observations yielding probabilistic distributions of the model parameters, related to the spatial random variable of interest, by assimilating multi-type and multi-scale data. The implementation of a desktop software tool for using the MAD technique is expected to significantly lower the barrier to use of inverse modeling in education, research, and resource management. The HydroDesktop MAD plugin is being developed following a community-based, open-source approach that will help both its adoption and long term sustainability as a user tool. This presentation will briefly introduce MAD, HydroDesktop, and the MAD plugin and software development effort.
Software-assisted stacking of gene modules using GoldenBraid 2.0 DNA-assembly framework.
Vazquez-Vilar, Marta; Sarrion-Perdigones, Alejandro; Ziarsolo, Peio; Blanca, Jose; Granell, Antonio; Orzaez, Diego
2015-01-01
GoldenBraid (GB) is a modular DNA assembly technology for plant multigene engineering based on type IIS restriction enzymes. GB speeds up the assembly of transcriptional units from standard genetic parts and facilitates the stacking of several genes within the same T-DNA in a few days. GB cloning is software-assisted with a set of online tools. The GBDomesticator tool assists in the adaptation of DNA parts to the GB standard. The combination of GB-adapted parts to build new transcriptional units is assisted by the GB TU Assembler tool. Finally, the assembly of multigene modules is simulated by the GB Binary Assembler. All the software tools are available at www.gbcloning.org. Here, we describe in detail the assembly methodology to create a multigene construct with three transcriptional units for polyphenol metabolic engineering in plants.
NASA Astrophysics Data System (ADS)
Smith, Joshua Wyatt; Stewart, Graeme A.; Seuster, Rolf; Quadt, Arnulf; ATLAS Collaboration
2017-10-01
This paper reports on the port of the ATLAS software stack onto new prototype ARM64 servers. This included building the “external” packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adjustments. A few additional modifications were needed to account for the different operating system, Ubuntu instead of Scientific Linux 6 / CentOS7. Selected results from the validation of the physics outputs on these ARM 64-bit servers will be shown. CPU, memory and IO intensive benchmarks using ATLAS specific environment and infrastructure have been performed, with a particular emphasis on the performance vs. energy consumption.
Building Geospatial Web Services for Ecological Monitoring and Forecasting
NASA Astrophysics Data System (ADS)
Hiatt, S. H.; Hashimoto, H.; Melton, F. S.; Michaelis, A. R.; Milesi, C.; Nemani, R. R.; Wang, W.
2008-12-01
The Terrestrial Observation and Prediction System (TOPS) at NASA Ames Research Center is a modeling system that generates a suite of gridded data products in near real-time that are designed to enhance management decisions related to floods, droughts, forest fires, human health, as well as crop, range, and forest production. While these data products introduce great possibilities for assisting management decisions and informing further research, realization of their full potential is complicated by their sheer volume and by the need for an infrastructure for remotely browsing, visualizing, and analyzing the data. In order to address these difficulties we have built an OGC-compliant WMS and WCS server based on an open source software stack that provides standardized access to our archive of data. This server is built using the open source Java library GeoTools, which achieves efficient I/O and image rendering through Java Advanced Imaging. We developed spatio-temporal raster management capabilities using the PostGrid raster indexation engine. We provide visualization and browsing capabilities through a customized Ajax web interface derived from the kaMap project. This interface allows resource managers to quickly assess ecosystem conditions and identify significant trends and anomalies from within their web browser without the need to download source data or install special software. Our standardized web services also expose TOPS data to a range of potential clients, from web mapping applications to virtual globes and desktop GIS packages. However, support for managing the temporal dimension of our data is currently limited in existing software systems. Future work will attempt to overcome this shortcoming by building time-series visualization and analysis tools that can be integrated with existing geospatial software.
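As a hedged usage sketch, any OGC-compliant client can request a rendered map from such a WMS endpoint with a standard GetMap query; the endpoint URL and layer name below are placeholders, not the actual TOPS services.

import requests

WMS_URL = "https://example.org/wms"          # placeholder endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "gpp_8day",                    # hypothetical layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-125,32,-114,42",               # lon/lat bounding box
    "WIDTH": 512,
    "HEIGHT": 512,
    "FORMAT": "image/png",
}
resp = requests.get(WMS_URL, params=params, timeout=30)
resp.raise_for_status()
with open("map.png", "wb") as f:             # save the rendered tile
    f.write(resp.content)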
Self-service for software development projects and HPC activities
NASA Astrophysics Data System (ADS)
Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.
2014-05-01
This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our of services by means of an end-user facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source developments such as Sourceforge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.
NASA Astrophysics Data System (ADS)
Fisher, W. I.
2017-12-01
The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream, and outline how to use Cloudstream to run and access an existing desktop application in the cloud.
System for adding sulfur to a fuel cell stack system for improved fuel cell stability
Mukerjee, Subhasish; Haltiner, Jr., Karl J; Weissman, Jeffrey G
2013-08-13
A system for adding sulfur to a reformate stream feeding a fuel cell stack, having a sulfur source for providing sulfur to the reformate stream and a metering device in fluid connection with the sulfur source and the reformate stream. The metering device injects sulfur from the sulfur source to the reformate stream at a predetermined rate, thereby providing a conditioned reformate stream to the fuel cell stack. The system provides a conditioned reformate stream having a predetermined sulfur concentration that gives an acceptable balance of minimal drop in initial power with the desired maximum stability of operation over prolonged periods for the fuel cell stack.
NASA Astrophysics Data System (ADS)
Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.
2010-12-01
A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web services based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.
Use of containerisation as an alternative to full virtualisation in grid environments.
NASA Astrophysics Data System (ADS)
Long, Robin
2015-12-01
Virtualisation is a key tool on the grid. It can be used to provide varying work environments or as part of a cloud infrastructure. Virtualisation itself carries certain overheads that decrease the performance of the system, by requiring extra resources to virtualise the software and hardware stack and by wasting CPU cycles instantiating or destroying virtual machines for each job. With the rise of and improvements in containerisation, where only the software stack is kept separate and no hardware or kernel virtualisation is used, there is scope for speed improvements and efficiency increases over standard virtualisation. We compare containerisation and virtualisation, including a comparison against bare-metal machines as a benchmark.
Queue and stack sorting algorithm optimization and performance analysis
NASA Astrophysics Data System (ADS)
Qian, Mingzhu; Wang, Xiaobao
2018-04-01
Sorting algorithms are among the basic operations in all kinds of software development, and data structures courses cover sorting algorithms of all kinds in detail. The performance of the sorting algorithm is directly related to the efficiency of the software. Much excellent research continues to optimize sorting algorithms to make them as efficient as possible. Here the authors further investigate sorting algorithms that combine a queue with stacks. The algorithm mainly exploits the alternating use of queue and stack storage properties, thus avoiding the large number of exchange or move operations needed in traditional sorts. Building on existing work, improvements and optimizations are proposed, with a focus on reducing time complexity. The experimental results show that the improvements are effective; the time complexity, space complexity, and stability of the algorithm are also studied. The improved and optimized algorithm increases practicality.
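The abstract does not spell out the algorithm itself; as a hedged illustration of the general idea of sorting with only queue (FIFO) and stack (LIFO) operations and no in-place exchanges, one simple variant is the following (this is not the authors' optimized algorithm).

from collections import deque

def queue_stack_sort(values):
    """Sort using only queue and stack operations, with no element swaps."""
    queue = deque(values)
    stack = []                           # invariant: non-decreasing from bottom to top
    while queue:
        e = queue.popleft()
        while stack and stack[-1] > e:   # displaced items cycle back through the queue
            queue.append(stack.pop())
        stack.append(e)
    return stack                         # ascending order

print(queue_stack_sort([5, 3, 8, 1, 4, 1]))   # [1, 1, 3, 4, 5, 8]

Displaced elements re-enter through the queue, and each successive minimum settles permanently once dequeued, so the procedure terminates, in roughly quadratic time for this simple variant.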
User Driven Image Stacking for ODI Data and Beyond via a Highly Customizable Web Interface
NASA Astrophysics Data System (ADS)
Hayashi, S.; Gopu, A.; Young, M. D.; Kotulla, R.
2015-09-01
While some astronomical archives have begun serving standard calibrated data products, the process of producing stacked images remains a challenge left to the end-user. The benefits of astronomical image stacking are well established, and dither patterns are recommended for almost all observing targets. Some archives automatically produce stacks of limited scientific usefulness without any fine-grained user or operator configurability. In this paper, we present PPA Stack, a web based stacking framework within the ODI - Portal, Pipeline, and Archive system. PPA Stack offers a web user interface with built-in heuristics (based on pointing, filter, and other metadata information) to pre-sort images into a set of likely stacks while still allowing the user or operator complete control over the images and parameters for each of the stacks they wish to produce. The user interface, designed using AngularJS, provides multiple views of the input dataset and parameters, all of which are synchronized in real time. A backend consisting of a Python application optimized for ODI data, wrapped around the SWarp software, handles the execution of stacking workflow jobs on Indiana University's Big Red II supercomputer, and the subsequent ingestion of the combined images back into the PPA archive. PPA Stack is designed to enable seamless integration of other stacking applications in the future, so users can select the most appropriate option for their science.
Delivering Unidata Technology via the Cloud
NASA Astrophysics Data System (ADS)
Fisher, Ward; Oxelson Ganter, Jennifer
2016-04-01
Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.
Pryslak, N.E.
1974-02-26
A thermoelectric generator having a rigid coupling or "stack" between the heat source and the hot strap joining the thermoelements is described. The stack includes a member of an insulating material, such as ceramic, for electrically isolating the thermoelements from the heat source, and a pair of members of a ductile material, such as gold, one on each side of the insulating member, to absorb thermal differential expansion stresses in the stack. (Official Gazette)
NASA Astrophysics Data System (ADS)
Massin, F.; Malcolm, A. E.
2017-12-01
Knowing earthquake source mechanisms gives valuable information for earthquake response planning and hazard mitigation. Earthquake source mechanisms can be analyzed using long period waveform inversion (for moderate size sources with sufficient signal to noise ratio) and body-wave first motion polarity or amplitude ratio inversion (for micro-earthquakes with sufficient data coverage). A robust approach that gives both source mechanisms and their associated probabilities across all source scales would greatly simplify the determination of source mechanisms and allow for more consistent interpretations of the results. Following previous work on shift and stack approaches, we develop such a probabilistic source mechanism analysis, using waveforms, which does not require polarity picking. For a given source mechanism, the first period of the observed body-waves is selected for all stations, multiplied by their corresponding theoretical polarity and stacked together. (The first period is found from a manually picked travel time by measuring the central period where the signal power is concentrated, using the second moment of the power spectral density function.) As in other shift and stack approaches, our method is not based on the optimization of an objective function through an inversion. Instead, the power of the polarity-corrected stack is a proxy for the likelihood of the trial source mechanism, with the most powerful stack corresponding to the most likely source mechanism. Using synthetic data, we test our method for robustness to the data coverage, coverage gap, signal to noise ratio, travel-time picking errors and non-double couple component. We then present results for field data in a volcano-tectonic context. Our results are reliable when constrained by 15 body-wavelets, with gap below 150 degrees, signal to noise ratio over 1 and arrival time error below a fifth of the period (0.2T) of the body-wave. We demonstrate that the source scanning approach for source mechanism analysis has similar advantages to waveform inversion (full waveform data, no manual intervention, probabilistic approach) and similar applicability to polarity inversion (any source size, any instrument type).
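The scoring step described above, stacking polarity-corrected first-period wavelets and using the stack power as a likelihood proxy, can be written compactly with numpy. The sketch below is a generic illustration under assumed array shapes and uniform station weighting, not the authors' implementation:

```python
import numpy as np

def stack_power(wavelets, polarities):
    """Power of the polarity-corrected stack for one trial mechanism.

    wavelets   : (n_stations, n_samples) first-period body-wave windows
    polarities : (n_stations,) theoretical +1/-1 polarities for the trial mechanism
    """
    stacked = (polarities[:, None] * wavelets).sum(axis=0)
    return float(np.sum(stacked ** 2))

def rank_mechanisms(wavelets, trial_polarities):
    """Score every trial mechanism; the most powerful stack is the most likely one."""
    powers = np.array([stack_power(wavelets, p) for p in trial_polarities])
    return int(np.argmax(powers)), powers / powers.sum()   # best index, pseudo-likelihoods
```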
Brecko, Jonathan; Mathys, Aurore; Dekoninck, Wouter; Leponce, Maurice; VandenSpiegel, Didier; Semal, Patrick
2014-01-01
Abstract In this manuscript we present a focus stacking system, composed of commercial photographic equipment. The system is inexpensive compared to high-end commercial focus stacking solutions. We tested this system and compared the results with several different software packages (CombineZP, Auto-Montage, Helicon Focus and Zerene Stacker). We tested our final stacked picture with a picture obtained from two high-end focus stacking solutions: a Leica MZ16A with DFC500 and a Leica Z6APO with DFC290. Zerene Stacker and Helicon Focus both provided satisfactory results. However, Zerene Stacker gives the user more possibilities in terms of control of the software, batch processing and retouching. The outcome of the test on high-end solutions demonstrates that our approach performs better in several ways. The resolution of the tested extended focus pictures is much higher than those from the Leica systems. The flash lighting inside the Ikea closet creates an evenly illuminated picture, without struggling with filters, diffusers, etc. The largest benefit is the price of the set-up which is approximately € 3,000, which is 8 and 10 times less than the LeicaZ6APO and LeicaMZ16A set-up respectively. Overall, this enables institutions to purchase multiple solutions or to start digitising the type collection on a large scale even with a small budget. PMID:25589866
Managing multiple image stacks from confocal laser scanning microscopy
NASA Astrophysics Data System (ADS)
Zerbe, Joerg; Goetze, Christian H.; Zuschratter, Werner
1999-05-01
A major goal in neuroanatomy is to obtain precise information about the functional organization of neuronal assemblies and their interconnections. Therefore, the analysis of histological sections frequently requires high-resolution images in combination with an overview of the structure. To overcome this conflict we previously introduced software for the automatic acquisition of multiple image stacks (3D-MISA) in confocal laser scanning microscopy. Here, we describe Windows NT-based software for fast and easy navigation through multiple image stacks (MIS-browser), the visualization of individual channels and layers, and the selection of user-defined subregions. In addition, the MIS browser provides useful tools for the visualization and evaluation of the data volume, such as brightness and contrast corrections of individual layers and channels. Moreover, it includes a maximum intensity projection, panning and zoom in/out functions within selected channels or focal planes (x/y), and tracking along the z-axis. The import module accepts any TIFF format and reconstructs the original image arrangement after the user has defined the sequence of images in x/y and z and the number of channels. The implemented export module allows storage of user-defined subregions (new single image stacks) for further 3D reconstruction and evaluation.
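Operations such as the maximum intensity projection and the export of user-defined subregions amount to simple array manipulations on the loaded stack; a minimal numpy illustration with an arbitrary synthetic multi-channel stack is:

```python
import numpy as np

# Arbitrary synthetic multi-channel z-stack with shape (channels, z, y, x).
stack = np.random.randint(0, 255, size=(2, 30, 512, 512), dtype=np.uint8)

mip = stack[0].max(axis=0)                       # maximum intensity projection of channel 0 along z
subregion = stack[:, 10:20, 100:356, 100:356]    # user-defined subregion exported as a new stack
print(mip.shape, subregion.shape)                # (512, 512) (2, 10, 256, 256)
```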
Dislocation Ledge Sources: Dispelling the Myth of Frank-Read Source Importance
NASA Astrophysics Data System (ADS)
Murr, L. E.
2016-12-01
In the early 1960s, J.C.M. Li questioned the formation of dislocation pileups at grain boundaries, especially in high-stacking-fault free-energy fcc metals and alloys, and proposed grain boundary ledge sources for dislocations in contrast to Frank-Read sources. This article reviews these proposals and the evolution of compelling evidence for grain boundary or related interfacial ledge sources of dislocations in metals and alloys, including unambiguous observations using transmission electron microscopy. Such observations have allowed grain boundary ledge source emission profiles of dislocations to be quantified in 304 stainless steel (with a stacking-fault free energy of 23 mJ/m2) and nickel (with a stacking-fault free energy of 128 mJ/m2) as a function of engineering strain. The evidence supports the conclusion that FR dislocation sources are virtually absent in metal and alloy deformation with ledges at interfaces dominating as dislocation sources.
LLVM Infrastructure and Tools Project Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCormick, Patrick Sean
2017-11-06
This project works with the open source LLVM Compiler Infrastructure (http://llvm.org) to provide tools and capabilities that address needs and challenges faced by the ECP community (applications, libraries, and other components of the software stack). Our focus is on providing a more productive development environment that enables (i) improved compilation times and code generation for parallelism, (ii) additional features/capabilities within the design and implementations of LLVM components for improved platform/performance portability, and (iii) improved aspects related to composition of the underlying implementation details of the programming environment, capturing resource utilization, overheads, etc. -- including runtime systems that are often not easily addressed by application and library developers.
NASA Astrophysics Data System (ADS)
Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.
2014-12-01
Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and Geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single metadata catalog. The entire BiG CZ Software system is being developed on public repositories as a modular suite of open source software projects. It will be built around a new Observations Data Model Version 2.0 (ODM2) that has been developed by members of the BiG CZ project team, with community input, under separate funding.
NASA Technical Reports Server (NTRS)
Poultney, S. K.; Brumfield, M. L.; Siviter, J. S.
1975-01-01
Typical pollutant gas concentrations at the stack exits of stationary sources can be estimated to be about 500 ppm under the present emission standards. Raman lidar has a number of advantages which makes it a valuable tool for remote measurements of these stack emissions. Tests of the Langley Research Center Raman lidar at a calibration tank indicate that night measurements of SO2 concentrations and stack opacity are possible. Accuracies of 10 percent are shown to be achievable from a distance of 300 m within 30 min integration times for 500 ppm SO2 at the stack exits. All possible interferences were examined quantitatively (except for the fluorescence of aerosols in actual stack emissions) and found to have negligible effect on the measurements. An early test at an instrumented stack is strongly recommended.
NASA Astrophysics Data System (ADS)
Mironov, Mikhail; Gusev, Vitalyi; Auregan, Yves; Lotton, Pierrick; Bruneau, Michel; Piatakov, Pavel
2002-08-01
It is demonstrated that the differentially heated stack, the heart of all thermoacoustic devices, provides a source of streaming additional to those associated with Reynolds stresses in quasi-unidirectional gas flow. This source of streaming is related to temperature-induced asymmetry in the generation of vortices and turbulence near the stack ends. The asymmetry of the hydrodynamic effects in an otherwise geometrically symmetric stack is due to the temperature difference between stack ends. The proposed mechanism of streaming excitation in annular thermoacoustic devices operates even in the absence of thermo-viscous interaction of sound waves with resonator walls. copyright 2002 Acoustical Society of America.
40 CFR 51.118 - Stack height provisions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... exceeds good engineering practice or by any other dispersion technique, except as provided in § 51.118(b... based on a good engineering practice stack height that exceeds the height allowed by § 51.100(ii) (1) or... actual stack height of any source. (b) The provisions of § 51.118(a) shall not apply to (1) stack heights...
Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.
Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars
2015-07-15
Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Pattern recognition monitoring of PEM fuel cell
Meltser, M.A.
1999-08-31
The CO-concentration in the H2 feed stream to a PEM fuel cell stack is monitored by measuring current and voltage behavior patterns from an auxiliary cell attached to the end of the stack. The auxiliary cell is connected to the same oxygen and hydrogen feed manifolds that supply the stack, and discharges through a constant load. Pattern recognition software compares the current and voltage patterns from the auxiliary cell to current and voltage signature determined from a reference cell similar to the auxiliary cell and operated under controlled conditions over a wide range of CO-concentrations in the H2 fuel stream. 4 figs.
Pattern recognition monitoring of PEM fuel cell
Meltser, Mark Alexander
1999-01-01
The CO-concentration in the H2 feed stream to a PEM fuel cell stack is monitored by measuring current and voltage behavior patterns from an auxiliary cell attached to the end of the stack. The auxiliary cell is connected to the same oxygen and hydrogen feed manifolds that supply the stack, and discharges through a constant load. Pattern recognition software compares the current and voltage patterns from the auxiliary cell to current and voltage signature determined from a reference cell similar to the auxiliary cell and operated under controlled conditions over a wide range of CO-concentrations in the H2 fuel stream.
Automating NEURON Simulation Deployment in Cloud Resources.
Stockton, David B; Santamaria, Fidel
2017-01-01
Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
Barty, Anton; Kirian, Richard A.; Maia, Filipe R. N. C.; Hantke, Max; Yoon, Chun Hong; White, Thomas A.; Chapman, Henry
2014-01-01
The emerging technique of serial X-ray diffraction, in which diffraction data are collected from samples flowing across a pulsed X-ray source at repetition rates of 100 Hz or higher, has necessitated the development of new software in order to handle the large data volumes produced. Sorting of data according to different criteria and rapid filtering of events to retain only diffraction patterns of interest results in significant reductions in data volume, thereby simplifying subsequent data analysis and management tasks. Meanwhile the generation of reduced data in the form of virtual powder patterns, radial stacks, histograms and other meta data creates data set summaries for analysis and overall experiment evaluation. Rapid data reduction early in the analysis pipeline is proving to be an essential first step in serial imaging experiments, prompting the authors to make the tool described in this article available to the general community. Originally developed for experiments at X-ray free-electron lasers, the software is based on a modular facility-independent library to promote portability between different experiments and is available under version 3 or later of the GNU General Public License. PMID:24904246
Updates to FuncLab, a Matlab based GUI for handling receiver functions
NASA Astrophysics Data System (ADS)
Porritt, Robert W.; Miller, Meghan S.
2018-02-01
Receiver functions are a versatile tool commonly used in seismic imaging. Depending on how they are processed, they can be used to image discontinuity structure within the crust or mantle or they can be inverted for seismic velocity either directly or jointly with complementary datasets. However, modern studies generally require large datasets which can be challenging to handle; therefore, FuncLab was originally written as an interactive Matlab GUI to assist in handling these large datasets. This software uses a project database to allow interactive trace editing, data visualization, H-κ stacking for crustal thickness and Vp/Vs ratio, and common conversion point stacking while minimizing computational costs. Since its initial release, significant advances have been made in the implementation of web services and changes in the underlying Matlab platform have necessitated a significant revision to the software. Here, we present revisions to the software, including new features such as data downloading via irisFetch.m, receiver function calculations via processRFmatlab, on-the-fly cross-section tools, interface picking, and more. In the descriptions of the tools, we present its application to a test dataset in Michigan, Wisconsin, and neighboring areas following the passage of USArray Transportable Array. The software is made available online at https://robporritt.wordpress.com/software.
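As an illustration of what the H-κ stacking step computes, the following generic Zhu-and-Kanamori-style sketch grid-searches crustal thickness H and Vp/Vs ratio κ by summing weighted receiver-function amplitudes at the predicted Ps, PpPs and PpSs+PsPs delay times; the grids, phase weights and assumed average Vp are placeholder values, and the code is not taken from FuncLab:

```python
import numpy as np

def _amp(rf, t, dt):
    """Receiver-function amplitude at delay t (s), zero outside the trace."""
    idx = int(round(t / dt))
    return rf[idx] if 0 <= idx < len(rf) else 0.0

def hk_stack(rfs, dt, ray_params, vp=6.5,
             h_grid=np.arange(20.0, 60.0, 0.5),
             k_grid=np.arange(1.6, 2.0, 0.01),
             weights=(0.7, 0.2, 0.1)):
    """Grid search over crustal thickness H (km) and Vp/Vs ratio kappa.

    rfs        : (n_rf, n_samples) radial receiver functions, time zero at direct P
    dt         : sample interval (s)
    ray_params : (n_rf,) ray parameters (s/km)
    Returns the stack surface s(H, kappa); its maximum marks the preferred model.
    """
    w1, w2, w3 = weights
    surface = np.zeros((len(h_grid), len(k_grid)))
    for i, H in enumerate(h_grid):
        for j, kappa in enumerate(k_grid):
            vs = vp / kappa
            for rf, p in zip(rfs, ray_params):
                qs = np.sqrt(1.0 / vs**2 - p**2)   # vertical S slowness
                qp = np.sqrt(1.0 / vp**2 - p**2)   # vertical P slowness
                t_ps = H * (qs - qp)               # Ps conversion delay
                t_pp = H * (qs + qp)               # PpPs multiple delay
                t_ss = 2.0 * H * qs                # PpSs + PsPs multiple delay
                surface[i, j] += (w1 * _amp(rf, t_ps, dt)
                                  + w2 * _amp(rf, t_pp, dt)
                                  - w3 * _amp(rf, t_ss, dt))
    return surface
```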
NASA Astrophysics Data System (ADS)
Hain, Roger; Allen, Christopher E.; Anderson, Craig S.; Budynkiewicz, Jamie A.; Burke, Douglas; Chen, Judy C.; Civano, Francesca Maria; D'Abrusco, Raffaele; Doe, Stephen M.; Evans, Ian N.; Evans, Janet D.; Fabbiano, Giuseppina; Gibbs, Danny G., II; Glotfelty, Kenny J.; Graessle, Dale E.; Grier, John D.; Hall, Diane M.; Harbo, Peter N.; Houck, John C.; Lauer, Jennifer L.; Laurino, Omar; Lee, Nicholas P.; Martínez-Galarza, Juan Rafael; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph; McLaughlin, Warren; Morgan, Douglas L.; Mossman, Amy E.; Nguyen, Dan T.; Nichols, Joy S.; Nowak, Michael A.; Paxson, Charles; Plummer, David A.; Primini, Francis Anthony; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael; Van Stone, David W.; Zografou, Panagoula
2018-01-01
The Second Chandra Source Catalog (CSC2.0) combines data at multiple stages to improve detection efficiency, enhance source region identification, and match observations of the same celestial source taken with significantly different point spread functions on Chandra's detectors. The need to group data for different reasons at different times in processing results in a hierarchy of groups to which individual sources belong. Source data are initially identified as belonging to each Chandra observation ID and number (an "obsid"). Data from each obsid whose pointings are within sixty arcseconds of each other are reprojected to the same aspect reference coordinates and grouped into stacks. Detection is performed on all data in the same stack, and individual sources are identified. Finer source position and region data are determined by further processing sources whose photons may be commingled together, grouping such sources into bundles. Individual stacks which overlap to any extent are grouped into ensembles, and all stacks in the same ensemble are later processed together to identify master sources and determine their properties. We discuss the basis for the various methods of combining data for processing and precisely define how the groups are determined. We also investigate some of the issues related to grouping data and discuss what options exist and how groups have evolved from prior releases. This work has been supported by NASA under contract NAS 8-03060 to the Smithsonian Astrophysical Observatory for operation of the Chandra X-ray Center.
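The first grouping rule, in which observations whose pointings lie within sixty arcseconds of one another form a stack, is essentially a transitive clustering of aim points. A small illustrative sketch (hypothetical obsid tuples, union-find for the transitive closure; not the catalog pipeline's code) is:

```python
import numpy as np

def angular_sep_arcsec(ra1, dec1, ra2, dec2):
    """Great-circle separation in arcseconds (inputs in degrees)."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    cos_d = (np.sin(dec1) * np.sin(dec2) +
             np.cos(dec1) * np.cos(dec2) * np.cos(ra1 - ra2))
    return np.degrees(np.arccos(np.clip(cos_d, -1.0, 1.0))) * 3600.0

def group_into_stacks(pointings, tol=60.0):
    """Group obsids whose aim points lie within `tol` arcsec of each other.

    pointings: list of (obsid, ra_deg, dec_deg) tuples.  Union-find gives the
    transitive closure, so chains of nearby pointings end up in one stack.
    """
    parent = list(range(len(pointings)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    for i in range(len(pointings)):
        for j in range(i + 1, len(pointings)):
            if angular_sep_arcsec(pointings[i][1], pointings[i][2],
                                  pointings[j][1], pointings[j][2]) <= tol:
                union(i, j)
    stacks = {}
    for i, (obsid, _, _) in enumerate(pointings):
        stacks.setdefault(find(i), []).append(obsid)
    return list(stacks.values())

# Hypothetical example: two nearby pointings and one distant pointing.
print(group_into_stacks([("obs1", 10.000, 20.000),
                         ("obs2", 10.010, 20.005),
                         ("obs3", 11.000, 21.000)]))
```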
FPGA-Based Optical Cavity Phase Stabilization for Coherent Pulse Stacking
Xu, Yilun; Wilcox, Russell; Byrd, John; ...
2017-11-20
Coherent pulse stacking (CPS) is a new time-domain coherent addition technique that stacks several optical pulses into a single output pulse, enabling high pulse energy from fiber lasers. We develop a robust, scalable, and distributed digital control system with firmware and software integration for algorithms, to support the CPS application. We model CPS as a digital filter in the Z domain and implement a pulse-pattern-based cavity phase detection algorithm on a field-programmable gate array (FPGA). A two-stage (2+1 cavities) 15-pulse stacking system achieves an 11.0 peak-power enhancement factor. Each optical cavity is fed back at 1.5 kHz, and stabilized at an individually-prescribed round-trip phase with 0.7 deg and 2.1 deg rms phase errors for Stages 1 and 2, respectively. Optical cavity phase control with nanometer accuracy ensures 1.2% intensity stability of the stacked pulse over 12 h. The FPGA-based feedback control system can be scaled to large numbers of optical cavities.
TkPl_SU: An Open-source Perl Script Builder for Seismic Unix
NASA Astrophysics Data System (ADS)
Lorenzo, J. M.
2017-12-01
TkPl_SU (beta) is a graphical user interface (GUI) to select parameters for Seismic Unix (SU) modules. Seismic Unix (Stockwell, 1999) is a widely distributed free software package for seismic reflection processing and signal processing. Perl/Tk is a mature, well-documented and free object-oriented graphical user interface for Perl. In a classroom environment, shell scripting of SU modules engages students and helps focus on the theoretical limitations and strengths of signal processing. However, complex interactive processing stages, e.g., selection of optimal stacking velocities, killing bad data traces, or spectral analysis, require advanced flows beyond the scope of introductory classes. In a research setting, special functionality from other free seismic processing software such as SioSeis (UCSD-NSF) can be incorporated readily via an object-oriented style of programming. An object-oriented approach is a first step toward efficient extensible programming of multi-step processes, and a simple GUI simplifies parameter selection and decision making. Currently, in TkPl_SU, Perl 5 packages wrap 19 of the most common SU modules that are used in teaching undergraduate and first-year graduate student classes (e.g., filtering, display, velocity analysis and stacking). Perl packages (classes) can advantageously add new functionality around each module and clarify parameter names for easier usage. For example, through the use of methods, packages can isolate the user from repetitive control structures, as well as replace the names of abbreviated parameters with self-describing names. Moose, an extension of the Perl 5 object system, greatly facilitates an object-oriented style. Perl wrappers are self-documenting via Perl's plain old documentation (POD) markup.
40 CFR 61.44 - Stack sampling.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 8 2010-07-01 2010-07-01 false Stack sampling. 61.44 Section 61.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL... Firing § 61.44 Stack sampling. (a) Sources subject to § 61.42(b) shall be continuously sampled, during...
40 CFR 61.44 - Stack sampling.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 8 2011-07-01 2011-07-01 false Stack sampling. 61.44 Section 61.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL... Firing § 61.44 Stack sampling. (a) Sources subject to § 61.42(b) shall be continuously sampled, during...
Hardware Evaluation of the Horizontal Exercise Fixture with Weight Stack
NASA Technical Reports Server (NTRS)
Newby, Nate; Leach, Mark; Fincke, Renita; Sharp, Carwyn
2009-01-01
HEF with weight stack seems to be a very sturdy and reliable exercise device that should function well in a bed rest training setting. A few improvements should be made to both the hardware and software to improve usage efficiency, but largely, this evaluation has demonstrated HEF's robustness. The hardware offers loading to muscles, bones, and joints, potentially sufficient to mitigate the loss of muscle mass and bone mineral density during long-duration bed rest campaigns. With some minor modifications, the HEF with weight stack equipment provides the best currently available means of performing squat, heel raise, prone row, bench press, and hip flexion/extension exercise in a supine orientation.
A Cloud-Computing Service for Environmental Geophysics and Seismic Data Processing
NASA Astrophysics Data System (ADS)
Heilmann, B. Z.; Maggi, P.; Piras, A.; Satta, G.; Deidda, G. P.; Bonomi, E.
2012-04-01
Cloud computing is becoming established worldwide as a new high-performance computing paradigm that offers formidable possibilities to industry and science. The presented cloud-computing portal, part of the Grida3 project, provides an innovative approach to seismic data processing by combining open-source state-of-the-art processing software and cloud-computing technology, making possible the effective use of distributed computation and data management with administratively distant resources. We replaced the demanding user-side hardware and software requirements by remote access to high-performance grid-computing facilities. As a result, data processing can be done quasi in real time, controlled ubiquitously via the Internet through a user-friendly web-browser interface. Besides the obvious advantages over locally installed seismic-processing packages, the presented cloud-computing solution creates completely new possibilities for scientific education, collaboration, and presentation of reproducible results. The web-browser interface of our portal is based on the commercially supported grid portal EnginFrame, an open framework based on Java, XML, and Web Services. We selected the hosted applications with the objective of allowing the construction of typical 2D time-domain seismic-imaging workflows as used for environmental studies and, originally, for hydrocarbon exploration. For data visualization and pre-processing, we chose the free software package Seismic Un*x. We ported tools for trace balancing, amplitude gaining, muting, frequency filtering, dip filtering, deconvolution and rendering, with a customized choice of options, as services onto the cloud-computing portal. For structural imaging and velocity-model building, we developed a grid version of the Common-Reflection-Surface stack, a data-driven imaging method that requires no user interaction at run time such as manual picking in prestack volumes or velocity spectra. Due to its high level of automation, CRS stacking can benefit largely from the hardware parallelism provided by the cloud deployment. The resulting output (post-stack section, coherence, and NMO-velocity panels) is used to generate a smooth migration-velocity model. Residual static corrections are calculated as a by-product of the stack and can be applied iteratively. As a final step, a time-migrated subsurface image is obtained by a parallelized Kirchhoff time migration scheme. Processing can be done step by step or using a graphical workflow editor that can launch a series of pipelined tasks. The status of the submitted jobs is monitored by a dedicated service. All results are stored in project directories, where they can be downloaded or viewed directly in the browser. Currently, the portal has access to three research clusters with a total of 70 nodes of 4 cores each. They are shared with four other cloud-computing applications bundled within the GRIDA3 project. To demonstrate the functionality of our "seismic cloud lab", we will present results obtained for three different types of data, all taken from hydrogeophysical studies: (1) a seismic reflection data set, made of compressional waves from explosive sources, recorded in Muravera, Sardinia; (2) a shear-wave data set from Sardinia; (3) a multi-offset Ground-Penetrating-Radar data set from Larreule, France. The presented work was funded by the government of the Autonomous Region of Sardinia and by the Italian Ministry of Research and Education.
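A typical 2D time-domain workflow of the kind hosted on the portal can be expressed as a pipeline of Seismic Un*x modules. The sketch below chains gaining, band-pass filtering, NMO correction and CDP stacking with placeholder parameter values and a hypothetical input file shots.su; it assumes Seismic Unix is installed and is only an illustration of such a workflow, not the portal's implementation:

```python
import subprocess

# Placeholder filter corners, NMO velocities and times; shots.su is a hypothetical
# input file assumed to carry the usual SU trace headers (cdp, offset).
pipeline = (
    "susort cdp offset < shots.su | "
    "sugain tpow=2 | "
    "sufilter f=5,10,60,80 | "
    "sunmo vnmo=1500,1800,2200 tnmo=0.0,0.5,1.0 | "
    "sustack key=cdp > stacked.su"
)
subprocess.run(pipeline, shell=True, check=True)
```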
Anode reactive bleed and injector shift control strategy
Cai, Jun [Rochester, NY; Chowdhury, Akbar [Pittsford, NY; Lerner, Seth E [Honeoye Falls, NY; Marley, William S [Rush, NY; Savage, David R [Rochester, NY; Leary, James K [Rochester, NY
2012-01-03
A system and method for correcting a large fuel cell voltage spread for a split sub-stack fuel cell system. The system includes a hydrogen source that provides hydrogen to each split sub-stack and bleed valves for bleeding the anode side of the sub-stacks. The system also includes a voltage measuring device for measuring the voltage of each cell in the split sub-stacks. The system provides two levels for correcting a large stack voltage spread problem. The first level includes sending fresh hydrogen to the weak sub-stack well before a normal reactive bleed would occur, and the second level includes sending fresh hydrogen to the weak sub-stack and opening the bleed valve of the other sub-stack when the cell voltage spread is close to stack failure.
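The two-level strategy can be illustrated with a small decision sketch. The voltage-spread thresholds, function name and returned action strings below are hypothetical and chosen only to show the structure of the logic described above:

```python
def anode_bleed_actions(cell_volts_a, cell_volts_b,
                        level1_spread=0.15, level2_spread=0.30):
    """Two-level correction sketch for a split sub-stack voltage spread.

    cell_volts_a, cell_volts_b : lists of individual cell voltages (V) for
    sub-stacks A and B.  Thresholds and actions are illustrative only, not
    values from the patent.
    """
    all_cells = list(cell_volts_a) + list(cell_volts_b)
    spread = max(all_cells) - min(all_cells)
    weak = "A" if min(cell_volts_a) < min(cell_volts_b) else "B"
    other = "B" if weak == "A" else "A"
    actions = []
    if spread >= level2_spread:
        # Level 2: spread approaching stack failure.
        actions.append(f"send fresh hydrogen to weak sub-stack {weak}")
        actions.append(f"open anode bleed valve of sub-stack {other}")
    elif spread >= level1_spread:
        # Level 1: intervene well before a normal reactive bleed would occur.
        actions.append(f"send fresh hydrogen to weak sub-stack {weak}")
    return actions

# Example: a weak cell in sub-stack B triggers the first-level action.
print(anode_bleed_actions([0.68, 0.67, 0.69], [0.66, 0.50, 0.68]))
```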
Phosphorus and nitrogen losses from poultry litter stacks and leaching through soils
USDA-ARS?s Scientific Manuscript database
The practice of stacking poultry litter in fields prior to spreading provides important logistical benefits to farmers but is controversial due to its potential to serve as a source of nutrients to leachate and runoff. We evaluated nutrient fate under stacked poultry litter to assess differences in ...
Motion Imagery and Robotics Application (MIRA)
NASA Technical Reports Server (NTRS)
Martinez, Lindolfo; Rich, Thomas
2011-01-01
Objectives include:
I. Prototype a camera service leveraging the CCSDS integrated protocol stack (MIRA/SM&C/AMS/DTN):
a) CCSDS MIRA Service (new);
b) Spacecraft Monitor and Control (SM&C);
c) Asynchronous Messaging Service (AMS);
d) Delay/Disruption Tolerant Networking (DTN).
II. Additional MIRA objectives:
a) Demonstrate camera control through the ISS using the CCSDS protocol stack (Berlin, May 2011);
b) Verify that the CCSDS standards stack can provide end-to-end space camera services across ground and space environments;
c) Test interoperability of various CCSDS protocol standards;
d) Identify overlaps in the design and implementations of the CCSDS protocol standards;
e) Identify software incompatibilities in the CCSDS stack interfaces;
f) Provide redlines to the SM&C, AMS, and DTN working groups;
g) Enable the CCSDS MIRA service for potential use in ISS Kibo camera commanding;
h) Assist in the long-term evolution of this entire group of CCSDS standards to TRL 6 or greater.
2000W high beam quality diode laser for direct materials processing
NASA Astrophysics Data System (ADS)
Qin, Wen-bin; Liu, You-qiang; Cao, Yin-hua; Gao, Jing; Pan, Fei; Wang, Zhi-yong
2011-11-01
This article describes a high-beam-quality, kilowatt-class diode laser system for direct materials processing, using the optical design software ZEMAX® to simulate the diode laser optical path, including beam shaping, collimation, coupling and focusing. In the experiment, an 808 nm diode laser stack and a 915 nm diode laser stack, each built as a vertical stack of up to 16 bars, were used for wavelength coupling. The threshold current of a stack is 6.4 A, the operating current is 85 A and the output power is 1280 W. After collimating the diode laser beam with micro-lenses, the fast-axis BPP of the stack is less than 60 mm.mrad and the slow-axis BPP is less than 75 mm.mrad. After shaping the beam and improving the beam quality, the fast-axis BPP remains 60 mm.mrad while the slow-axis BPP falls below 19 mm.mrad. After wavelength coupling and focusing, a power of 2150 W was ultimately obtained in a focal spot of 1.5 mm * 1.2 mm with a 300 mm focal length. The resulting power density is 1.2×10^5 W/cm^2, sufficient for metal remelting, alloying, cladding and welding. The total optical coupling efficiency is 84%, and the total electrical-to-optical conversion efficiency is 50%.
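As a quick consistency check on the quoted power density, treating the focal spot as a simple 1.5 mm * 1.2 mm rectangle:

```latex
\[
I \;=\; \frac{P}{A}
  \;=\; \frac{2150\,\mathrm{W}}{0.15\,\mathrm{cm}\times 0.12\,\mathrm{cm}}
  \;=\; \frac{2150\,\mathrm{W}}{0.018\,\mathrm{cm^{2}}}
  \;\approx\; 1.2\times 10^{5}\,\mathrm{W/cm^{2}}.
\]
```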
Hanna, Matthew G; Monaco, Sara E; Cuda, Jacqueline; Xing, Juan; Ahmed, Ishtiaque; Pantanowitz, Liron
2017-09-01
Whole-slide imaging in cytology is limited when glass slides are digitized without z-stacks for focusing. Different vendors have started to provide z-stacking solutions to overcome this limitation. The Panoptiq imaging system allows users to create digital files combining low-magnification panoramic images with regions of interest (ROIs) that are imaged with high-magnification z-stacks. The aim of this study was to compare such panoramic images with conventional whole-slide images and glass slides for the tasks of screening and interpretation in cytopathology. Thirty glass slides, including 10 ThinPrep Papanicolaou tests and 20 nongynecologic cytology cases, were digitized with an Olympus BX45 integrated microscope with an attached Prosilica GT camera. ViewsIQ software was used for image acquisition and viewing. These glass slides were also scanned on an Aperio ScanScope XT at ×40 (0.25 μm/pixel) with 1 z-plane and were viewed with ImageScope software. Digital and glass slides were screened and dotted/annotated by a cytotechnologist and were subsequently reviewed by 3 cytopathologists. For panoramic images, the cytotechnologist manually created digital maps and selected representative ROIs to generate z-stacks at a higher magnification. After 3-week washout periods, panoramic images were compared with Aperio digital slides and glass slides. The Panoptiq system permitted fine focusing of thick smears and cell clusters. In comparison with glass slides, the average screening times were 5.5 and 1.8 times longer with Panoptiq and Aperio images, respectively, but this improved with user experience. There was no statistical difference in diagnostic concordance between all 3 modalities. Users' diagnostic confidence was also similar for all modalities. The Aperio whole-slide scanner with 1 z-plane scanning and the Panoptiq imaging system with z-stacking are both suitable for cytopathology screening and interpretation. However, ROI z-stacks do offer a superior mechanism for overcoming focusing problems commonly encountered with digital cytology slides. Unlike whole-slide imaging, the acquisition of representative z-stack images with the Panoptiq system requires a trained cytologist to create digital files. Cancer Cytopathol 2017;125:701-9. © 2017 American Cancer Society.
Scaling Semantic Graph Databases in Size and Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morari, Alessandro; Castellana, Vito G.; Villa, Oreste
In this paper we present SGEM, a full software system for accelerating large-scale semantic graph databases on commodity clusters. Unlike current approaches, SGEM addresses semantic graph databases by only employing graph methods at all the levels of the stack. On one hand, this allows exploiting the space efficiency of graph data structures and the inherent parallelism of graph algorithms. These features adapt well to the increasing system memory and core counts of modern commodity clusters. On the other hand, however, these systems are optimized for regular computation and batched data transfers, while graph methods usually are irregular and generate fine-grained data accesses with poor spatial and temporal locality. Our framework comprises a SPARQL to data parallel C compiler, a library of parallel graph methods and a custom, multithreaded runtime system. We introduce our stack, motivate its advantages with respect to other solutions and show how we solved the challenges posed by irregular behaviors. We present the result of our software stack on the Berlin SPARQL benchmarks with datasets up to 10 billion triples (a triple corresponds to a graph edge), demonstrating scaling in dataset size and in performance as more nodes are added to the cluster.
40 CFR 98.173 - Calculating GHG emissions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... associated requirements for Tier 4 in subpart C of this part (General Stationary Fuel Combustion Sources). (b... stack gas volumetric flow rate (scfh). %H2O = Hourly moisture percentage in the stack gas. (iii) You... Tier 4 methodology in subpart C of this part, or through the same stack as any combustion unit or...
Open Source Hardware for DIY Environmental Sensing
NASA Astrophysics Data System (ADS)
Aufdenkampe, A. K.; Hicks, S. D.; Damiano, S. G.; Montgomery, D. S.
2014-12-01
The Arduino open source electronics platform has been very popular within the DIY (Do It Yourself) community for several years, and it is now providing environmental science researchers with an inexpensive alternative to commercial data logging and transmission hardware. Here we present the designs for our latest series of custom Arduino-based dataloggers, which include wireless communication options like self-meshing radio networks and cellular phone modules. The main Arduino board uses a custom interface board to connect to various research-grade sensors to take readings of turbidity, dissolved oxygen, water depth and conductivity, soil moisture, solar radiation, and other parameters. Sensors with SDI-12 communications can be directly interfaced to the logger using our open Arduino-SDI-12 software library (https://github.com/StroudCenter/Arduino-SDI-12). Different deployment options are shown, like rugged enclosures to house the loggers and rigs for mounting the sensors in both fresh water and marine environments. After the data has been collected and transmitted by the logger, the data is received by a mySQL-PHP stack running on a web server that can be accessed from anywhere in the world. Once there, the data can be visualized on web pages or served though REST requests and Water One Flow (WOF) services. Since one of the main benefits of using open source hardware is the easy collaboration between users, we are introducing a new web platform for discussion and sharing of ideas and plans for hardware and software designs used with DIY environmental sensors and data loggers.
High-brightness diode pump sources for solid-state and fiber laser pumping across 8xx-9xx nm range
NASA Astrophysics Data System (ADS)
Diamant, Ronen; Berk, Yuri; Cohen, Shalom; Klumel, Genady; Levy, Moshe; Openhaim, Yaki; Peleg, Ophir; Yanson, Dan; Karni, Yoram
2011-06-01
Advanced solid state laser architectures place increasingly demanding requirements on high-brightness, low-cost QCW laser diode pump sources, with custom apertures both for side and end rod pumping configurations. To meet this need, a new series of scalable QCW pump sources at 808nm and 940nm was developed. The stacks, available in multiple output formats, allow for custom aperture filling by varying both the length and quantity of stacked laser bars. For these products, we developed next-generation laser bars based on improved epitaxial wafer designs delivering power densities of 20W/mm of emission aperture. With >200W of peak QCW power available from a full-length 1cm bar, we have demonstrated power scaling to over 2kW in 10-bar stacks with 55% wall plug efficiency. We also present the design and performance of several stack configurations using full-length and reduced-length (mini) bars that demonstrate the versatility of both the bar and packaging designs. We illustrate how the ROBUST HEAD packaging technology developed at SCD is capable of accommodating variable bar length, pitch and quantity for custom rod pumping geometries. The excellent all-around performance of the stacks is supported by reliability data in line with the previously reported 20 Gshot space-grade qualification of SCD's stacks.
Yu-Mei Hsu; Andrzej Bytnerowicz
2015-01-01
NO2 and SO2 are the primary pollutants produced by industrial facilities of the Athabasca Oil sand Region (AOSR), Alberta, Canada. The major emission sources are the upgrader stacks for SO2 and stacks, mine fleets and vehicles for NO2. After emitting from the sources, NO
Note: O-ring stack system for electron gun alignment.
Park, In-Yong; Cho, Boklae; Han, Cheolsu; Shin, Seungmin; Lee, Dongjun; Ahn, Sang Jung
2015-01-01
We present a reliable method for aligning an electron gun which consists of an electron source and lenses by controlling a stack of rubber O-rings in a vacuum condition. The beam direction angle is precisely tilted along two axes by adjusting the height difference of a stack of O-rings. In addition, the source position is shifted in each of three orthogonal directions. We show that the tilting angle and linear shift along the x and y axes as obtained from ten stacked O-rings are ±2.55° and ±2 mm, respectively. This study can easily be adapted to charged particle gun alignment and adjustments of the flange position in a vacuum, ensuring that its results can be useful with regard to electrical insulation between flanges with slight modifications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uzunyan, S. A.; Blazey, G.; Boi, S.
Northern Illinois University in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping powers (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.
Irregular Applications: Architectures & Algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feo, John T.; Villa, Oreste; Tumeo, Antonino
Irregular applications are characterized by irregular data structures, control and communication patterns. Novel irregular high-performance applications that deal with large data sets have recently appeared. Unfortunately, current high-performance systems and software infrastructures execute irregular algorithms poorly. Only coordinated efforts by end users, area specialists and computer scientists that consider both the architecture and the software stack may be able to provide solutions to the challenges of modern irregular applications.
Tunable terahertz radiation source
Boulaevskii, Lev; Feldmann, David M; Jia, Quanxi; Koshelev, Alexei; Moody, Nathan A
2014-01-21
Terahertz radiation source and method of producing terahertz radiation, said source comprising a junction stack, said junction stack comprising a crystalline material comprising a plurality of self-synchronized intrinsic Josephson junctions; an electrically conductive material in contact with two opposing sides of said crystalline material; and a substrate layer disposed upon at least a portion of both the crystalline material and the electrically-conductive material, wherein the crystalline material has a c-axis which is parallel to the substrate layer, and wherein the source emits at least 1 mW of power.
The NIH BD2K center for big data in translational genomics.
Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; Kent, W James; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van't Veer, Laura; Wold, Barbara; Haussler, David
2015-11-01
The world's genomics data will never be stored in a single repository - rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype to phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world's genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM's performance and utility. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Lighting system combining daylight concentrators and an artificial source
Bornstein, Jonathan G.; Friedman, Peter S.
1985-01-01
A combined lighting system for a building interior includes a stack of luminescent solar concentrators (LSC), an optical conduit made preferably of optical fibers for transmitting daylight from the LSC stack, a collimating lens set at an angle, a fixture for receiving the daylight at one end and for distributing the daylight as illumination inside the building, an artificial light source at the other end of the fixture for directing artificial light into the fixture for distribution as illumination inside the building, an automatic dimmer/brightener for the artificial light source, and a daylight sensor positioned near the LSC stack for controlling the automatic dimmer/brightener in response to the daylight sensed. The system also has a reflector positioned behind the artificial light source and a fan for exhausting heated air out of the fixture during summer and for forcing heated air into the fixture for passage into the building interior during winter.
USDA-ARS?s Scientific Manuscript database
Particle size distributions (PSD) have long been used to more accurately estimate the PM10 fraction of total particulate matter (PM) stack samples taken from agricultural sources. These PSD analyses were typically conducted using a Coulter Counter with 50 micrometer aperture tube. With recent increa...
Design of a Community-Engaged Health Informatics Platform with an Architecture of Participation.
Millery, Mari; Ramos, Wilson; Lien, Chueh; Aguirre, Alejandra N; Kukafka, Rita
2015-01-01
Community-engaged health informatics (CEHI) applies information technology and participatory approaches to improve the health of communities. Our objective was to translate the concept of CEHI into a usable and replicable informatics platform that will facilitate community-engaged practice and research. The setting is a diverse urban neighborhood in New York City. The methods included community asset mapping, stakeholder interviews, logic modeling, analysis of affordances in open-source tools, elicitation of use cases and requirements, and a survey of early adopters. Based on synthesis of data collected, GetHealthyHeigths.org (GHH) was developed using open-source LAMP stack and Drupal content management software. Drupal's organic groups module was used for novel participatory functionality, along with detailed user roles and permissions. Future work includes evaluation of GHH and its impact on agency and service networks. We plan to expand GHH with additional functionality to further support CEHI by combining informatics solutions with community engagement to improve health.
Cluster-lensing: A Python Package for Galaxy Clusters and Miscentering
NASA Astrophysics Data System (ADS)
Ford, Jes; VanderPlas, Jake
2016-12-01
We describe a new open source package for calculating properties of galaxy clusters, including Navarro, Frenk, and White halo profiles with and without the effects of cluster miscentering. This pure-Python package, cluster-lensing, provides well-documented and easy-to-use classes and functions for calculating cluster scaling relations, including mass-richness and mass-concentration relations from the literature, as well as the surface mass density Σ(R) and differential surface mass density ΔΣ(R) profiles, probed by weak lensing magnification and shear. Galaxy cluster miscentering is especially a concern for stacked weak lensing shear studies of galaxy clusters, where offsets between the assumed and the true underlying matter distribution can lead to a significant bias in the mass estimates if not accounted for. This software has been developed and released in a public GitHub repository, and is licensed under the permissive MIT license. The cluster-lensing package is archived on Zenodo. Full documentation, source code, and installation instructions are available at http://jesford.github.io/cluster-lensing/.
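To illustrate the central quantity such packages compute, the sketch below evaluates the projected NFW surface mass density Σ(R) by direct line-of-sight integration in plain NumPy. This is not the cluster-lensing API; the halo parameters are placeholders, and no miscentering or ΔΣ(R) computation is included.

    import numpy as np

    def sigma_nfw(R, rho_s, r_s, z_max=50.0, n_z=4000):
        """Projected NFW surface mass density: Sigma(R) = 2 * integral of
        rho(sqrt(R^2 + z^2)) dz along the line of sight."""
        z = np.linspace(0.0, z_max, n_z)           # line-of-sight coordinate (Mpc)
        R = np.atleast_1d(R)[:, None]              # broadcast over projected radii
        x = np.sqrt(R**2 + z**2) / r_s
        rho = rho_s / (x * (1.0 + x) ** 2)         # NFW 3-D density profile
        return 2.0 * np.trapz(rho, z, axis=1)      # Msun/Mpc^2 if rho_s is in Msun/Mpc^3

    rho_s, r_s = 1.0e15, 0.3                       # placeholder halo parameters
    R = np.logspace(-2, 1, 20)                     # projected radii (Mpc)
    print(sigma_nfw(R, rho_s, r_s)[:3])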
AstroCloud, a Cyber-Infrastructure for Astronomy Research: Cloud Computing Environments
NASA Astrophysics Data System (ADS)
Li, C.; Wang, J.; Cui, C.; He, B.; Fan, D.; Yang, Y.; Chen, J.; Zhang, H.; Yu, C.; Xiao, J.; Wang, C.; Cao, Z.; Fan, Y.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Yin, S.
2015-09-01
AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on CloudStack, an open source software, we set up the cloud computing environment for the AstroCloud project. It consists of five distributed nodes across the mainland of China. Users can use and analyse data in this cloud computing environment. Based on GlusterFS, we built a scalable cloud storage system. Each user has a private space, which can be shared among different virtual machines and desktop systems. With this environment, astronomers can easily access astronomical data collected by different telescopes and data centers, and data producers can archive their datasets safely.
ConfocalGN: A minimalistic confocal image generator
NASA Astrophysics Data System (ADS)
Dmitrieff, Serge; Nédélec, François
Validating image analysis pipelines and training machine-learning segmentation algorithms require images with known features. Synthetic images can be used for this purpose, with the advantage that large reference sets can be produced easily. It is however essential to obtain images that are as realistic as possible in terms of noise and resolution, which is challenging in the field of microscopy. We describe ConfocalGN, a user-friendly software tool that can generate synthetic microscopy stacks from a ground truth (i.e. the observed object) specified as a 3D bitmap or a list of fluorophore coordinates. This software can analyze a real microscope image stack to set the noise parameters and directly generate new images of the object with noise characteristics similar to those of the sample image. With minimal input from the user and a modular architecture, ConfocalGN is easily integrated with existing image analysis solutions.
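A rough sketch of the underlying idea, in Python rather than the ConfocalGN code itself: blur a ground-truth volume with a Gaussian point-spread function, then add Poisson shot noise and Gaussian readout noise whose levels would in practice be matched to a sample image; all parameter values here are placeholders.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def synthetic_stack(ground_truth, psf_sigma=(2.0, 1.0, 1.0), signal_mean=200.0,
                        background=10.0, read_noise=4.0, seed=0):
        """Toy confocal generator: PSF blur + Poisson shot noise + Gaussian readout noise."""
        rng = np.random.default_rng(seed)
        blurred = gaussian_filter(ground_truth.astype(float), psf_sigma)
        if blurred.max() > 0:
            blurred *= signal_mean / blurred.max()       # scale to the target signal level
        expected = blurred + background                  # mean photon count per voxel
        return rng.poisson(expected) + rng.normal(0.0, read_noise, expected.shape)

    truth = np.zeros((16, 64, 64))                       # z, y, x ground-truth bitmap
    truth[8, 28:36, 28:36] = 1.0                         # one small bright object
    stack = synthetic_stack(truth)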
PyRAD: assembly of de novo RADseq loci for phylogenetic analyses.
Eaton, Deren A R
2014-07-01
Restriction-site-associated genomic markers are a powerful tool for investigating evolutionary questions at the population level, but are limited in their utility at deeper phylogenetic scales where fewer orthologous loci are typically recovered across disparate taxa. While this limitation stems in part from mutations to restriction recognition sites that disrupt data generation, an additional source of data loss comes from the failure to identify homology during bioinformatic analyses. Clustering methods that allow for lower similarity thresholds and the inclusion of indel variation will perform better at assembling RADseq loci at the phylogenetic scale. PyRAD is a pipeline to assemble de novo RADseq loci with the aim of optimizing coverage across phylogenetic datasets. It uses a wrapper around an alignment-clustering algorithm, which allows for indel variation within and between samples, as well as for incomplete overlap among reads (e.g. paired-end). Here I compare PyRAD with the program Stacks in their performance analyzing a simulated RADseq dataset that includes indel variation. Indels disrupt clustering of homologous loci in Stacks but not in PyRAD, such that the latter recovers more shared loci across disparate taxa. I show through reanalysis of an empirical RADseq dataset that indels are a common feature of such data, even at shallow phylogenetic scales. PyRAD uses parallel processing as well as an optional hierarchical clustering method, which allows it to rapidly assemble phylogenetic datasets with hundreds of sampled individuals. Software is written in Python and freely available at http://www.dereneaton.com/software/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Wavelet extractor: A Bayesian well-tie and wavelet extraction program
NASA Astrophysics Data System (ADS)
Gunning, James; Glinsky, Michael E.
2006-06-01
We introduce a new open-source toolkit for the well-tie or wavelet extraction problem of estimating seismic wavelets from seismic data, time-to-depth information, and well-log suites. The wavelet extraction model is formulated as a Bayesian inverse problem, and the software will simultaneously estimate wavelet coefficients, other parameters associated with uncertainty in the time-to-depth mapping, positioning errors in the seismic imaging, and useful amplitude-variation-with-offset (AVO) related parameters in multi-stack extractions. It is capable of multi-well, multi-stack extractions, and uses continuous seismic data-cube interpolation to cope with the problem of arbitrary well paths. Velocity constraints in the form of checkshot data, interpreted markers, and sonic logs are integrated in a natural way. The Bayesian formulation allows computation of full posterior uncertainties of the model parameters, and the important problem of the uncertain wavelet span is addressed using a multi-model posterior developed from Bayesian model selection theory. The wavelet extraction tool is distributed as part of the Delivery seismic inversion toolkit. A simple log and seismic viewing tool is included in the distribution. The code is written in Java, and thus platform independent, but the Seismic Unix (SU) data model makes the inversion particularly suited to Unix/Linux environments. It is a natural companion piece of software to Delivery, having the capacity to produce maximum likelihood wavelet and noise estimates, but will also be of significant utility to practitioners wanting to produce wavelet estimates for other inversion codes or purposes. The generation of full parameter uncertainties is a crucial function for workers wishing to investigate questions of wavelet stability before proceeding to more advanced inversion studies.
Ocean acoustic interferometry.
Brooks, Laura A; Gerstoft, Peter
2007-06-01
Ocean acoustic interferometry refers to an approach whereby signals recorded from a line of sources are used to infer the Green's function between two receivers. An approximation of the time domain Green's function is obtained by summing, over all source positions (stacking), the cross-correlations between the receivers. Within this paper a stationary phase argument is used to describe the relationship between the stacked cross-correlations from a line of vertical sources, located in the same vertical plane as two receivers, and the Green's function between the receivers. Theory and simulations demonstrate the approach and are in agreement with those of a modal based approach presented by others. Results indicate that the stacked cross-correlations can be directly related to the shaded Green's function, so long as the modal continuum of any sediment layers is negligible.
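A minimal numerical sketch of that stacking step, under strongly simplifying assumptions (1-D geometry, a single direct arrival per source, an endfire line of uncorrelated noise sources, no attenuation): the cross-correlations of the two receiver records, summed over source positions, peak at the inter-receiver travel time.

    import numpy as np

    rng = np.random.default_rng(1)
    fs, c = 1000.0, 1500.0                    # sample rate (Hz), sound speed (m/s)
    x1, x2 = 500.0, 800.0                     # receiver positions (m)
    n = 4096
    stack = np.zeros(2 * n - 1)

    for xs in np.linspace(-2000.0, 0.0, 50):  # line of sources beyond receiver 1
        s = rng.standard_normal(n)            # broadband source signature
        d1 = int(round((x1 - xs) / c * fs))   # direct-arrival delay at receiver 1
        d2 = int(round((x2 - xs) / c * fs))   # direct-arrival delay at receiver 2
        r1, r2 = np.roll(s, d1), np.roll(s, d2)
        stack += np.correlate(r2, r1, mode="full")   # cross-correlate and stack

    lags = np.arange(-(n - 1), n) / fs
    print("peak lag %.4f s vs. direct travel time %.4f s"
          % (lags[np.argmax(stack)], (x2 - x1) / c))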
Note: O-ring stack system for electron gun alignment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, In-Yong; Cho, Boklae; Han, Cheolsu
We present a reliable method for aligning an electron gun which consists of an electron source and lenses by controlling a stack of rubber O-rings in a vacuum condition. The beam direction angle is precisely tilted along two axes by adjusting the height difference of a stack of O-rings. In addition, the source position is shifted in each of three orthogonal directions. We show that the tilting angle and linear shift along the x and y axes as obtained from ten stacked O-rings are ±2.55° and ±2 mm, respectively. This study can easily be adapted to charged particle gun alignment and adjustments of the flange position in a vacuum, ensuring that its results can be useful with regard to electrical insulation between flanges with slight modifications.
Fuel cell system with sodium borohydride as hydrogen source for unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Kim, Kyunghwan; Kim, Taegyu; Lee, Kiseong; Kwon, Sejin
In this study, we design and fabricate a fuel cell system for application as a power source in unmanned aerial vehicles (UAVs). The fuel cell system consists of a fuel cell stack, hydrogen generator, and hybrid power management system. A PEMFC stack with an output power of 100 W is prepared and tested to determine the efficient operating conditions; the stack must be operated in the dead-end mode with purge in order to ensure prolonged stack performance. A hydrogen generator is fabricated to supply gaseous hydrogen to the stack. Sodium borohydride (NaBH4) is used as the hydrogen source in the present study. A Co/Al2O3 catalyst is prepared for the hydrolysis of the alkaline NaBH4 solution at room temperature. The fabricated Co catalyst is comparable to the Ru catalyst. The UAV consumes more power in the takeoff mode than in the cruising mode. A hybrid power management system using an auxiliary battery is developed and evaluated for efficient energy management. Hybrid power from both the fuel cell and battery powers takeoff and turning flight operations, while the fuel cell supplies steady power during the cruising flight. The capabilities of the fuel-cell UAVs for long endurance flights are validated by successful flight tests.
A double-correlation tremor-location method
NASA Astrophysics Data System (ADS)
Li, Ka Lok; Sgattoni, Giulia; Sadeghisorkhani, Hamzeh; Roberts, Roland; Gudmundsson, Olafur
2017-02-01
A double-correlation method is introduced to locate tremor sources based on stacks of complex, doubly-correlated tremor records of multiple triplets of seismographs back projected to hypothetical source locations in a geographic grid. Peaks in the resulting stack of moduli are inferred source locations. The stack of the moduli is a robust measure of energy radiated from a point source or point sources even when the velocity information is imprecise. Application to real data shows how double correlation focuses the source mapping compared to the common single correlation approach. Synthetic tests demonstrate the robustness of the method and its resolution limitations, which are controlled by the station geometry, the finite frequency of the signal, the quality of the velocity information used, and the noise level. Both random noise and signal or noise correlated at time shifts that are inconsistent with the assumed velocity structure can be effectively suppressed. Assuming a surface wave velocity, we can constrain the source location even if the surface wave component does not dominate. The method can also in principle be used with body waves in 3-D, although this requires more data and seismographs placed near the source for depth resolution.
A Study of the Ethernet Throughput Performance of the Embedded System
NASA Astrophysics Data System (ADS)
Duan, Zhi-Yu; Zhao, Zhao-Wang
2007-09-01
An Ethernet acceleration solution developed for the NIOS II Embedded System in astronomical applications - Mason Express is introduced in this paper. By manually constructing the proper network protocol headers and directly driving the hardware, Mason Express bypasses the performance bottleneck of the lightweight IP stack (lwIP), and achieves up to 90 Mb/s unidirectional data throughput from the embedded system board to the data collecting computer. With the lwIP stack, the maximum data rate is about 10.57 Mb/s. Mason Express is a pure software solution requiring no hardware changes; it affects neither the uC/OS-II operating system nor the lwIP stack, and can be implemented with or without any embedded operating system. It maximally protects the intelligence investment of the users.
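As an illustration of what "manually constructing the proper network protocol headers" involves (a Python sketch, not the Mason Express C code that runs on the NIOS II; addresses and ports are placeholders), the snippet below packs an IPv4 header with its RFC 1071 checksum around a UDP datagram.

    import struct, socket

    def checksum(data: bytes) -> int:
        """RFC 1071 Internet checksum over 16-bit words."""
        if len(data) % 2:
            data += b"\x00"
        total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
        while total >> 16:
            total = (total & 0xFFFF) + (total >> 16)
        return ~total & 0xFFFF

    def build_udp_ip_packet(src_ip, dst_ip, src_port, dst_port, payload: bytes) -> bytes:
        udp_len = 8 + len(payload)
        udp = struct.pack("!4H", src_port, dst_port, udp_len, 0) + payload  # UDP checksum 0 = disabled
        ip = struct.pack("!BBHHHBBH4s4s",
                         0x45, 0, 20 + udp_len,       # version/IHL, TOS, total length
                         0x1234, 0,                   # identification, flags/fragment offset
                         64, socket.IPPROTO_UDP, 0,   # TTL, protocol, checksum placeholder
                         socket.inet_aton(src_ip), socket.inet_aton(dst_ip))
        ip = ip[:10] + struct.pack("!H", checksum(ip)) + ip[12:]
        return ip + udp

    pkt = build_udp_ip_packet("192.168.1.10", "192.168.1.20", 5000, 6000, b"hello")
    print(len(pkt), pkt.hex())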
Reconfigurable Wideband Circularly Polarized Stacked Square Patch Antenna for Cognitive Radios
NASA Technical Reports Server (NTRS)
Barbosa Kortright, Miguel A.; Waldstein, Seth W.; Simons, Rainee N.
2017-01-01
An almost square patch, a square patch and a stacked square patch with corner truncation for circular polarization (CP) are researched and developed at X-band for cognitive radios. Experimental results indicate, first, that the impedance bandwidth of a CP almost square patch fed from the edge by a 50 ohm line is 1.70% and second, that of a CP square patch fed from the ground plane side by a surface launch connector is 1.87%. Third, the impedance bandwidth of a CP stacked square patch fed by a surface launch connector is 2.22%. The measured center frequency for the CP square patch fed by a surface launch connector without and with an identical stacked patch is 8.45 and 8.1017 GHz, respectively. By stacking a patch, separated by a fixed air gap of 0.254 mm, the center frequency is observed to shift by as much as 348.3 MHz. The shift in center frequency, brought about by the reconfiguring of the physical layer antenna, can be exploited in a cognitive system since it expands the usable frequency spectrum for software reconfiguration in the presence of interference. In addition, varying the fixed air gap in the stacked antenna geometry by increments of 0.254 mm further expands the usable frequency spectrum.
A Manual Segmentation Tool for Three-Dimensional Neuron Datasets.
Magliaro, Chiara; Callara, Alejandro L; Vanello, Nicola; Ahluwalia, Arti
2017-01-01
To date, automated or semi-automated software and algorithms for segmentation of neurons from three-dimensional imaging datasets have had limited success. The gold standard for neural segmentation is considered to be the manual isolation performed by an expert. To facilitate the manual isolation of complex objects from image stacks, such as neurons in their native arrangement within the brain, a new Manual Segmentation Tool (ManSegTool) has been developed. ManSegTool allows the user to load an image stack, scroll through the images and manually draw the structures of interest stack-by-stack. Users can eliminate unwanted regions or split structures (i.e., branches from different neurons that are too close to each other but, to the experienced eye, clearly belong to a unique cell), view the object in 3D and save the results obtained. The tool can be used for testing the performance of a single-neuron segmentation algorithm or to extract complex objects, where the available automated methods still fail. Here we describe the software's main features and then show an example of how ManSegTool can be used to segment neuron images acquired using a confocal microscope. In particular, expert neuroscientists were asked to segment different neurons from which morphometric variables were subsequently extracted as a benchmark for precision. In addition, a literature-defined index for evaluating the goodness of segmentation was used as a benchmark for accuracy. Neocortical layer axons from a DIADEM challenge dataset were also segmented with ManSegTool and compared with the manual "gold-standard" generated for the competition.
Open Technology Approaches to Geospatial Interface Design
NASA Astrophysics Data System (ADS)
Crevensten, B.; Simmons, D.; Alaska Satellite Facility
2011-12-01
What problems do you not want your software developers to be solving? Choosing open technologies across the entire stack of software development, from low-level shared libraries to high-level user interaction implementations, is a way to help ensure that customized software yields innovative and valuable tools for Earth Scientists. This demonstration will review developments in web application technologies and the recurring patterns of interaction design regarding exploration and discovery of geospatial data through Vertex, ASF's Dataportal interface, a project utilizing current open web application standards and technologies including HTML5, jQueryUI, Backbone.js and the Jasmine unit testing framework.
SANSparallel: interactive homology search against Uniprot
Somervuo, Panu; Holm, Liisa
2015-01-01
Proteins evolve by mutations and natural selection. The network of sequence similarities is a rich source for mining homologous relationships that inform on protein structure and function. There are many servers available to browse the network of homology relationships but one has to wait up to a minute for results. The SANSparallel webserver provides protein sequence database searches with immediate response and professional alignment visualization by third-party software. The output is a list, pairwise alignment or stacked alignment of sequence-similar proteins from Uniprot, UniRef90/50, Swissprot or Protein Data Bank. The stacked alignments are viewed in Jalview or as sequence logos. The database search uses the suffix array neighborhood search (SANS) method, which has been re-implemented as a client-server, improved and parallelized. The method is extremely fast and as sensitive as BLAST above 50% sequence identity. Benchmarks show that the method is highly competitive compared to previously published fast database search programs: UBLAST, DIAMOND, LAST, LAMBDA, RAPSEARCH2 and BLAT. The web server can be accessed interactively or programmatically at http://ekhidna2.biocenter.helsinki.fi/cgi-bin/sans/sans.cgi. It can be used to make protein functional annotation pipelines more efficient, and it is useful in interactive exploration of the detailed evidence supporting the annotation of particular proteins of interest. PMID:25855811
NASA Astrophysics Data System (ADS)
Berk, Yuri; Karni, Yoram; Klumel, Genady; Openhaim, Yaakov; Cohen, Shalom; Yanson, Dan
2011-03-01
Advanced solid state laser architectures place increasingly demanding requirements on high-brightness, low-cost QCW laser diode pump sources, with custom apertures both for side and end rod pumping configurations. To meet this need, a new series of scalable pump sources at 808 nm and 940 nm was developed. The stacks, available in multiple output formats, allow for custom aperture filling by varying both the length and quantity of stacked laser bars. For these products, we developed next-generation laser bars based on improved epitaxial wafer designs delivering power densities of 20 W/mm of emission aperture. With >200 W of peak QCW power available from a full-length 1 cm bar, we have demonstrated power scaling to over 2 kW in 10-bar stacks with 55% wall plug efficiency. We also present the design and performance of several stack configurations using full-length and reduced-length (mini) bars that demonstrate the versatility of both the bar and packaging designs. We illustrate how the ROBUST HEAD packaging technology developed at SCD is capable of accommodating variable bar length, pitch and quantity for custom rod pumping geometries. The excellent all-around performance of the stacks is supported by reliability data in line with the previously reported 20 Gshot space-grade qualification of SCD's stacks.
A software methodology for compiling quantum programs
NASA Astrophysics Data System (ADS)
Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias
2018-04-01
Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
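The abstract does not name the authors' toolchain, so the sketch below only illustrates the layered idea in generic Python: a high-level circuit is lowered by successive passes (gate decomposition, then mapping of logical qubits onto hardware qubits). The gate set, pass structure and layout are invented for the example.

    # Pass 1 rewrites gates outside the native set {H, CNOT}; pass 2 replaces
    # logical qubit names with physical indices. Real compilers add many more
    # layers (optimization, routing, scheduling, pulse generation).
    HIGH_LEVEL = [("H", ["a"]), ("SWAP", ["a", "b"]), ("CNOT", ["b", "c"])]

    def decompose(circuit):
        out = []
        for gate, qubits in circuit:
            if gate == "SWAP":                      # SWAP = 3 alternating CNOTs
                q0, q1 = qubits
                out += [("CNOT", [q0, q1]), ("CNOT", [q1, q0]), ("CNOT", [q0, q1])]
            else:
                out.append((gate, qubits))
        return out

    def map_to_hardware(circuit, layout):
        return [(gate, [layout[q] for q in qubits]) for gate, qubits in circuit]

    layout = {"a": 0, "b": 1, "c": 2}               # trivial placement
    for gate, qubits in map_to_hardware(decompose(HIGH_LEVEL), layout):
        print(gate, qubits)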
Enhanced Capabilities of BullReporter and BullConverter : final report.
DOT National Transportation Integrated Search
2017-09-01
Bull-Converter/Reporter is a software stack for Weigh-In-Motion (WIM) data analysis and reporting tools developed by the University of Minnesota Duluth for the Minnesota Department of Transportation (MnDOT) to resolve problems associated with deploym...
WE-D-204-06: An Open Source ImageJ CatPhan Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, G
2015-06-15
Purpose: The CatPhan is a popular QA device for assessing CT image quality. There are a number of software options which perform analysis of the CatPhan. However, there is often little ability for the user to adjust the analysis if it isn't running properly, and these are all expensive options. An open source tool is an effective solution. Methods: To use the software, the user imports the CT as an image sequence in ImageJ. The user then scrolls to the slice with the lateral dots. The user then runs the plugin. If tolerance constraints are not already created, the user is prompted to enter them or to use generic tolerances. Upon completion of the analysis, the plugin calls pdfLaTeX to compile the pdf report. There is a csv version of the report as well. A log of the results from all CatPhan scans is kept as a csv file. The user can use this to baseline the machine. Results: The tool is capable of detecting the orientation of the phantom. If the CatPhan was scanned backwards, one can simply flip the stack of images horizontally and proceed with the analysis. The analysis includes Sensitometry (estimating the effective beam energy), HU values and linearity, Low Contrast Visibility (using LDPE & Polystyrene), Contrast Scale, Geometric Accuracy, Slice Thickness Accuracy, Spatial resolution (giving the MTF using the line pairs as well as the point spread function), CNR, Low Contrast Detectability (including the raw data), Uniformity (including the Cupping Effect). Conclusion: This is a robust tool that analyzes more components of the CatPhan than other software options (with the exception of ImageOwl). It produces an elegant pdf and keeps a log of analyses for long-term tracking of the system. Because it is open source, users are able to customize any component of it.
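Several of the reported metrics reduce to simple region-of-interest statistics. The sketch below (plain Python/NumPy, not the ImageJ plugin itself) shows how HU linearity and CNR could be computed from circular ROIs; the ROI centers, radii and nominal insert HU values would come from the phantom geometry and are placeholders here.

    import numpy as np

    def roi_mean_std(slice_hu, center, radius):
        """Mean and standard deviation inside a circular ROI (center in pixel coordinates)."""
        rows, cols = np.ogrid[:slice_hu.shape[0], :slice_hu.shape[1]]
        mask = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
        return slice_hu[mask].mean(), slice_hu[mask].std()

    def hu_linearity(slice_hu, roi_list, nominal_hu):
        """Slope/intercept of measured vs. nominal HU over the sensitometry inserts."""
        measured = np.array([roi_mean_std(slice_hu, c, r)[0] for c, r in roi_list])
        slope, intercept = np.polyfit(nominal_hu, measured, 1)
        return measured, slope, intercept            # slope ~ 1, intercept ~ 0 when linear

    def cnr(slice_hu, roi_signal, roi_background):
        """Contrast-to-noise ratio between an insert ROI and a background ROI."""
        m_s, _ = roi_mean_std(slice_hu, *roi_signal)
        m_b, s_b = roi_mean_std(slice_hu, *roi_background)
        return abs(m_s - m_b) / s_b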
Transient deformational properties of high temperature alloys used in solid oxide fuel cell stacks
NASA Astrophysics Data System (ADS)
Molla, Tesfaye Tadesse; Kwok, Kawai; Frandsen, Henrik Lund
2017-05-01
Stresses and the probability of failure during operation of solid oxide fuel cells (SOFCs) are affected by the deformational properties of the different components of the SOFC stack. Though the overall stress relaxes with time during steady state operation, large stresses would normally appear through transients in operation, including temporary shutdowns. These stresses are highly affected by the transient creep behavior of metallic components in the SOFC stack. This study investigates whether a variation of the so-called Chaboche's unified power law together with isotropic hardening can represent the transient behavior of Crofer 22 APU, a typical iron-chromium alloy used in SOFC stacks. The material parameters for the model are determined by measurements involving relaxation and constant strain rate experiments. The constitutive law is implemented into commercial finite element software using a user-defined material model. This is used to validate the developed constitutive law against experiments with constant strain rate, cyclic and creep loading. The predictions from the developed model are found to agree well with experimental data. It is therefore concluded that Chaboche's unified power law can be applied to describe the high temperature inelastic deformational behavior of Crofer 22 APU used for metallic interconnects in SOFC stacks.
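As a hedged illustration of the kind of constitutive update involved, the code below integrates a 1-D power-law (Norton/Chaboche-type) unified viscoplastic model with Voce isotropic hardening through a stress-relaxation history. It is a minimal sketch, not the calibrated Crofer 22 APU model or the finite-element user subroutine from the paper, and every material parameter is a placeholder.

    import numpy as np

    def relax(eps_total=0.002, E=150e3, sigma_y=20.0, K=800.0, n=6.0,
              Q=60.0, b=8.0, t_end=3600.0, dt=0.1):
        """Explicit integration of a relaxation test at constant total strain (units: MPa, s)."""
        eps_vp, R, t, out = 0.0, 0.0, 0.0, []
        while t < t_end:
            sigma = E * (eps_total - eps_vp)            # elastic stress
            over = abs(sigma) - R - sigma_y             # viscous overstress
            rate = (max(over, 0.0) / K) ** n * np.sign(sigma)
            eps_vp += rate * dt                         # viscoplastic flow
            R += b * (Q - R) * abs(rate) * dt           # Voce isotropic hardening
            out.append((t, sigma))
            t += dt
        return np.array(out)

    history = relax()
    print("initial / final stress: %.1f / %.1f MPa" % (history[0, 1], history[-1, 1]))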
NASA Astrophysics Data System (ADS)
Huang, Min-Sheng; Zhu, Ya-Xin; Li, Zhen-Huan
2014-04-01
The influence of dislocation dissociation on the evolution of Frank-Read (F-R) sources is studied using a three-dimensional discrete dislocation dynamics simulation (3D-DDD). The classical Orowan nucleation stress and recently proposed Benzerga nucleation time models for F-R sources are improved. This work shows that it is necessary to introduce the dislocation dissociation scheme into 3D-DDD simulations, especially for simulations of the micro-plasticity of small-sized materials with low stacking fault energy.
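For orientation, the classical Orowan estimate mentioned above is simply tau_FR ~ mu*b/L for a source of pinning-point separation L; the snippet evaluates it for a few generic source lengths (elastic constants are placeholders, not the parameters used in the paper).

    # Rough Orowan estimate of the Frank-Read activation stress, tau ~ mu * b / L.
    mu = 27e9        # shear modulus, Pa (order of magnitude for aluminium)
    b = 0.286e-9     # Burgers vector magnitude, m
    for L in (100e-9, 500e-9, 1e-6):                 # pinning-point separation, m
        print("L = %6.0f nm  ->  tau_FR ~ %.0f MPa" % (L * 1e9, mu * b / L / 1e6))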
Liu, Guorui; Cai, Zongwei; Zheng, Minghui; Jiang, Xiaoxu; Nie, Zhiqiang; Wang, Mei
2015-01-01
Identifying marker congeners of unintentionally produced polychlorinated naphthalenes (PCNs) from industrial thermal sources might be useful for predicting total PCN (∑2-8PCN) emissions by the determination of only indicator congeners. In this study, potential indicator congeners were identified based on the PCN data in 122 stack gas samples from over 60 plants involved in more than ten industrial thermal sources reported in our previous case studies. Linear regression analyses identified that the concentrations of CN27/30, CN52/60, and CN66/67 correlated significantly with ∑2-8PCN (R(2)=0.77, 0.80, and 0.58, respectively; n=122, p<0.05), suggesting that they might be good candidates for indicator congeners. Equations describing relationships between indicators and ∑2-8PCN were established. The linear regression analyses involving 122 samples showed that the relationships between the indicator congeners and ∑2-8PCN were not significantly affected by factors such as industry type, raw materials used, or operating conditions. Hierarchical cluster analysis and similarity calculations for the 122 stack gas samples were adopted to group the samples and evaluate their similarities and differences based on the PCN homolog distributions from different industrial thermal sources. Generally, the fractions of less chlorinated homologs, comprising di-, tri-, and tetra-homologs, were much higher than those of more chlorinated homologs for the 111 stack gas samples contained in groups 1 and 2, indicating the dominance of lower chlorinated homologs in stack gas from industrial thermal sources. Copyright © 2014 Elsevier Ltd. All rights reserved.
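A minimal sketch of the indicator-to-total regression described above, with synthetic data standing in for the 122 stack-gas measurements (which are not reproduced here):

    import numpy as np

    rng = np.random.default_rng(0)
    indicator = rng.lognormal(mean=1.0, sigma=0.8, size=122)       # e.g. CN27/30, ng/m^3 (invented)
    total_pcn = 12.0 * indicator + rng.normal(0.0, 4.0, size=122)  # hypothetical Sum(PCN)

    slope, intercept = np.polyfit(indicator, total_pcn, 1)
    pred = slope * indicator + intercept
    r2 = 1.0 - ((total_pcn - pred) ** 2).sum() / ((total_pcn - total_pcn.mean()) ** 2).sum()
    print("Sum(PCN) ~ %.2f * indicator + %.2f   (R^2 = %.2f)" % (slope, intercept, r2))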
SHARP pre-release v1.0 - Current Status and Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay S.; Rahaman, Ronald O.
The NEAMS Reactor Product Line effort aims to develop an integrated multiphysics simulation capability for the design and analysis of future generations of nuclear power plants. The Reactor Product Line code suite's multi-resolution hierarchy is being designed to ultimately span the full range of length and time scales present in relevant reactor design and safety analyses, as well as scale from desktop to petaflop computing platforms. In this report, building on a previous report issued in September 2014, we describe our continued efforts to integrate thermal/hydraulics, neutronics, and structural mechanics modeling codes to perform coupled analysis of a representative fast sodium-cooled reactor core in preparation for a unified release of the toolkit. The work reported in the current document covers the software engineering aspects of managing the entire stack of components in the SHARP toolkit and the continuous integration efforts ongoing to prepare a release candidate for interested reactor analysis users. Here we report on the continued integration effort of PROTEUS/Nek5000 and Diablo into the NEAMS framework and the software processes that enable users to utilize the capabilities without losing scientific productivity. Due to the complexity of the individual modules and their necessary/optional dependency library chain, we focus on the configuration and build aspects for the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion. Such complexity is untenable without strong software engineering processes such as source management, source control, change reviews, unit tests, integration tests and continuous test suites. Details on these processes are provided in the report as a building step for a SHARP user guide that will accompany the first release, expected by March 2016.
Commercialisation of Solid Oxide Fuel Cells - opportunities and forecasts
NASA Astrophysics Data System (ADS)
Dziurdzia, B.; Magonski, Z.; Jankowski, H.
2016-01-01
The paper presents an analysis of the commercialisation possibilities of the SOFC stack designed at AGH. The paper recalls the final design of the stack, presented earlier at IMAPS-Poland conferences, together with its recent modifications and measurements. The stack consists of planar double-sided ceramic fuel cells which are characterized by a special anode construction with embedded fuel channels. The stack features a simple construction without metallic interconnectors and frames, lowered thermal capacity and a quick start-up time. Predictions for the possible applications of the stack include portable generators for luxurious caravans, yachts, and ships at berth. The SOFC stack, operating as a clean, quiet and efficient power source, could replace on-board diesel generators. Market forecasts show that there is also some room on the market for the SOFC stack as a standalone generator in rural areas far away from the grid. The paper also presents a survey of the SOFC market in Europe, the USA, Australia and other countries.
A Self-Provisioning Mechanism in OpenStack for IoT Devices.
Solano, Antonio; Dormido, Raquel; Duro, Natividad; Sánchez, Juan Miguel
2016-08-17
The aim of this paper is to introduce a plug-and-play mechanism for an Internet of Things (IoT) device to instantiate a Software as a Service (SaaS) application in a private cloud, built up with OpenStack. The SaaS application is the digital avatar of a physical object connected to the Internet. As a proof of concept, a Vending Machine is retrofitted and connected to the Internet with an Arduino Open Hardware device. Once the self-configuration mechanism is completed, it is possible to order a product from a mobile communication device. PMID:27548166
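A hedged sketch of the provisioning step using the openstacksdk Python client; the paper does not state which OpenStack client the mechanism uses, and the cloud entry, image, flavor, network and server names below are hypothetical.

    import openstack

    # Credentials come from a clouds.yaml entry or OS_* environment variables.
    conn = openstack.connect(cloud="iot-cloud")

    image = conn.compute.find_image("saas-avatar-image")
    flavor = conn.compute.find_flavor("m1.small")
    network = conn.network.find_network("private")

    server = conn.compute.create_server(
        name="vending-machine-avatar",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    server = conn.compute.wait_for_server(server)   # block until the instance is ACTIVE
    print(server.status)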
[Analysis on Mechanism of Rainout Carried by Wet Stack of Thermal Power Plant].
Ouyang, Li-hua; Zhuang, Ye; Liu, Ke-wei; Chen, Zhen-yu; Gu, Peng
2015-06-01
Rainout from wet stacks took place in many thermal power plants with WFGD systems. Research on the causes of the rainout is important to solve the problem. The objective of this research is to analyze the mechanism of rainout. A field study was performed to collect experimental data in one thermal power plant, including the amount of desulfurization slurry carried by wet flue gas, the liquid condensate from the wet duct, and droplets from the wet stack. Source apportionment analysis was carried out based on physical and chemical data of liquid and solid samples. The results showed that the mist eliminator operated well and met the performance guarantee value. However, the total amount of desulfurization slurry in the flue gas and the sulfate concentration in the liquid condensate discharged from the wet duct/stack increased. The liquid condensate accumulated in the wet duct/stack led to liquid re-entrainment. In conclusion, the rainout in this power plant was caused by shortcomings of the wet ductwork and liquid discharge system, with droplets produced by re-entrainment being carried out of the stack by the saturated gas. The main undissolved components of the rainout were composite carbonate and aluminosilicate. Although the ash concentration in this WFGD met the regulation criteria, source apportionment analysis showed that fly ash accounted for 60% of the rainout. This percentage was the same as that of solid particles in the condensate. It is important to optimize the wet ductwork, wet stack liner, liquid collectors and drainage. Avoiding accumulation from saturated vapor thermal condensation is an effective way to solve the wet stack rainout.
Development of an automatic subsea blowout preventer stack control system using PLC based SCADA.
Cai, Baoping; Liu, Yonghong; Liu, Zengkai; Wang, Fei; Tian, Xiaojie; Zhang, Yanzhen
2012-01-01
An extremely reliable remote control system for a subsea blowout preventer stack is developed based on the off-the-shelf triple modular redundancy system. To meet the high reliability requirement, various redundancy techniques such as controller redundancy, bus redundancy and network redundancy are used to design the system hardware architecture. The control logic, human-machine interface graphical design and redundant databases are developed using off-the-shelf software. A series of experiments was performed in the laboratory to test the subsea blowout preventer stack control system. The results showed that the tested subsea blowout preventer functions could be executed successfully. For faults of programmable logic controllers, discrete input groups and analog input groups, the control system could give correct alarms in the human-machine interface. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ballinger, Marcel Y.; Larson, Timothy V.
2014-12-01
Research and development (R&D) facility emissions are difficult to characterize due to their variable processes, changing nature of research, and large number of chemicals. Positive matrix factorization (PMF) was applied to volatile organic compound (VOC) concentrations measured in the main exhaust stacks of four different R&D buildings to identify the number and composition of major contributing sources. PMF identified between 9 and 11 source-related factors contributing to stack emissions, depending on the building. Similar factors between buildings were major contributors to trichloroethylene (TCE), acetone, and ethanol emissions; other factors had similar profiles for two or more buildings but not all four. At least one factor for each building was identified that contained a broad mix of many species and constraints were used in PMF to modify the factors to resemble more closely the off-shift concentration profiles. PMF accepted the constraints with little decrease in model fit.
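PMF is closely related to non-negative matrix factorization: both factor a samples-by-species concentration matrix into non-negative source contributions and source profiles, although EPA-style PMF additionally weights each observation by its measurement uncertainty. The sketch below uses scikit-learn's NMF on invented data as an analogy only; it is not the receptor-model fit performed in the study.

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    profiles = np.array([[5.0, 0.1, 2.0, 0.0],        # e.g. a TCE-rich source (invented)
                         [0.2, 4.0, 0.5, 1.0]])       # e.g. an acetone/ethanol source (invented)
    contributions = rng.gamma(2.0, 1.0, size=(200, 2))
    X = contributions @ profiles + rng.normal(0.0, 0.05, (200, 4)).clip(min=0)

    model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
    W = model.fit_transform(X)        # source contributions per sample
    H = model.components_             # source profiles (species composition)
    print(np.round(H, 2))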
Neves, A A; Silva, E J; Roter, J M; Belladona, F G; Alves, H D; Lopes, R T; Paciornik, S; De-Deus, G A
2015-11-01
To propose an automated image processing routine based on free software to quantify root canal preparation outcomes in pairs of sound and instrumented roots after micro-CT scanning procedures. Seven mesial roots of human mandibular molars with different canal configuration systems were studied: (i) Vertucci's type 1, (ii) Vertucci's type 2, (iii) two individual canals, (iv) Vertucci's type 6, canals (v) with and (vi) without debris, and (vii) a canal with visible pulp calcification. All teeth were instrumented with the BioRaCe system and scanned in a Skyscan 1173 micro-CT before and after canal preparation. After reconstruction, the instrumented stack of images (IS) was registered against the preoperative sound stack of images (SS). Image processing included contrast equalization and noise filtering. Sound canal volumes were obtained by a minimum threshold. For the IS, a fixed conservative threshold was chosen as the best compromise between instrumented canal and dentine whilst avoiding debris, resulting in instrumented canal plus empty spaces. Arithmetic and logical operations between sound and instrumented stacks were used to identify debris. Noninstrumented dentine was calculated using a minimum threshold in the IS and subtracting from the SS and total debris. Removed dentine volume was obtained by subtracting SS from IS. Quantitative data on total debris present in the root canal space after instrumentation, noninstrumented areas and removed dentine volume were obtained for each test case, as well as three-dimensional volume renderings. After standardization of the acquisition, reconstruction and image processing of micro-CT images, a quantitative approach for the calculation of root canal biomechanical outcomes was achieved using free software. © 2014 International Endodontic Journal. Published by John Wiley & Sons Ltd.
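The stack arithmetic described above can be sketched with boolean operations on the co-registered grey-value volumes. The NumPy toy below is only an approximation of the published routine: the thresholds stand in for the minimum and conservative values chosen in the study, and "noninstrumented" is proxied here simply by canal voxels that remain open in both stacks.

    import numpy as np

    def canal_outcomes(ss, is_, canal_thr, open_thr):
        """ss, is_: co-registered pre/post-instrumentation stacks (grey values).
        Returns voxel counts for debris, removed dentine and untouched canal space."""
        sound_canal = ss < canal_thr          # canal space before preparation
        open_after = is_ < open_thr           # canal space plus empty voids after
        debris = sound_canal & ~open_after    # formerly open space now filled
        removed = ~sound_canal & open_after   # dentine cut away by the instruments
        untouched = sound_canal & open_after  # canal space open before and after
        return {"debris": int(debris.sum()),
                "removed_dentine": int(removed.sum()),
                "untouched_canal": int(untouched.sum())}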
The near-source impacts of diesel backup generators in urban environments
NASA Astrophysics Data System (ADS)
Tong, Zheming; Zhang, K. Max
2015-05-01
Distributed power generation, located close to consumers, plays an important role in the current and future power systems. However, its near-source impacts in complex urban environments are not well understood. In this paper, we focused on diesel backup generators that participate in demand response (DR) programs. We first improved the micro-environmental air quality simulations by employing a meteorology processor, AERMET, to generate site-specific boundary layer parameters for the Large Eddy Simulation (LES) modeling. The modeling structure was then incorporated into the CTAG model to evaluate the environmental impacts of diesel backup generators in near-source microenvironments. We found that the presence of either tall upwind or downwind building can deteriorate the air quality in the near-stack street canyons, largely due to the recirculation zones generated by the tall buildings, reducing the near-stack dispersion. Decreasing exhaust momentum ratio (stack exit velocity/ambient wind velocity) draws more exhaust into the recirculation zone, and reduces the effective stack height, which results in elevated near-ground concentrations inside downwind street canyons. The near-ground PM2.5 concentration for the worst scenarios could well exceed 100 μg m-3, posing potential health risk to people living and working nearby. In general, older diesel backup generators (i.e., Tier 1, 2 or older) without the up-to-date emission control may significantly increase the pollutant concentration in the near-source street canyons if participating in DR programs. Even generators that comply with Tier-4 standards could lead to PM hotspots if their stacks are next to tall buildings. Our study implies that the siting of diesel backup generators stacks should consider not only the interactions of fresh air intake and exhaust outlet for the building housing the backup generators, but also the dispersion of exhaust plumes in the surrounding environment.
NASA Astrophysics Data System (ADS)
Chong, Jihyo; Kim, Young J.; Baek, Jongho; Lee, Hanlim
2016-10-01
Major anthropogenic sources of sulphur dioxide in the troposphere include point sources such as power plants and combustion-derived industrial sources. Spatially resolved remote sensing of atmospheric trace gases is desirable for better estimation and validation of emission from those sources. It has been reported that the Imaging Differential Optical Absorption Spectroscopy (I-DOAS) technique can provide spatially resolved two-dimensional distribution measurements of atmospheric trace gases. This study presents the results of I-DOAS observations of SO2 from a large power plant. The stack plume from the Taean coal-fired power plant was remotely sensed with an I-DOAS instrument. The slant column density (SCD) of SO2 was derived by data analysis of the absorption spectra of the scattered sunlight measured by the I-DOAS over the power plant stacks. The two-dimensional distribution of SO2 SCD was obtained over the viewing window of the I-DOAS instrument. The measured SCDs were converted to mixing ratios in order to estimate the rate of SO2 emission from each stack. The maximum mixing ratio of SO2 was measured to be 28.1 ppm with a SCD value of 4.15×10¹⁷ molecules/cm². Based on the exit velocity of the plume from the stack, the emission rate of SO2 was estimated to be 22.54 g/s. Remote sensing of SO2 with an I-DOAS instrument can be very useful for independent estimation and validation of the emission rates from major point sources as well as area sources.
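A back-of-the-envelope sketch of the conversion from slant columns to an emission rate: integrate the SCD along a transect across the plume and multiply by the plume (stack exit) speed and the SO2 molecular mass. The numbers below are illustrative only, not the Taean measurements.

    import numpy as np

    scd = np.array([0.2, 0.8, 2.5, 4.0, 2.7, 0.9, 0.3]) * 1e17   # molecules/cm^2 across the plume
    dx = 5.0e2                       # spacing between viewing columns, cm (5 m)
    v_plume = 8.0e2                  # plume speed at the stack exit, cm/s (8 m/s)
    M_SO2, N_A = 64.066, 6.022e23    # g/mol, molecules/mol

    column_integral = np.trapz(scd, dx=dx)            # molecules/cm
    flux = column_integral * v_plume * M_SO2 / N_A    # g/s
    print("estimated SO2 emission rate: %.1f g/s" % flux)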
CrossTalk. The Journal of Defense Software Engineering. Volume 16, Number 11, November 2003
2003-11-01
memory area, and stack pointer. These systems are classified as preemptive or nonpreemptive depending on whether they can preempt an existing task or not...of charge. The Software Technology Support Center was established at Ogden Air Logistics Center (AFMC) by Headquarters U.S. Air Force to help Air...device. A script file could be a list of commands for a command interpreter such as a batch file [15]. A communications port consists of a queue to hold
NASA Astrophysics Data System (ADS)
Shamugam, Veeramani; Murray, I.; Leong, J. A.; Sidhu, Amandeep S.
2016-03-01
Cloud computing provides services on demand instantly, such as access to network infrastructure consisting of computing hardware, operating systems, network storage, databases and applications. Network usage and demands are growing at a very fast rate, and to meet the current requirements there is a need for automatic infrastructure scaling. Traditional networks are difficult to automate because of the distributed nature of their decision making process for switching or routing, which is collocated on the same device. Managing complex environments using traditional networks is time-consuming and expensive, especially in the case of generating virtual machines, migration and network configuration. To mitigate the challenges, network operations require efficient, flexible, agile and scalable software defined networks (SDN). This paper discusses various issues in SDN and suggests how to mitigate the network management related issues. A private cloud prototype test bed was set up to implement the SDN on the OpenStack platform to test and evaluate network performance under the various configurations.
LHCb Build and Deployment Infrastructure for run 2
NASA Astrophysics Data System (ADS)
Clemencic, M.; Couturier, B.
2015-12-01
After the successful run 1 of the LHC, the LHCb Core software team has taken advantage of the long shutdown to consolidate and improve its build and deployment infrastructure. Several of the related projects have already been presented, such as the Jenkins-based build system and the LHCb Performance and Regression testing infrastructure. Some components are completely new, like the Software Configuration Database (using the Graph DB Neo4j), or the new packaging and installation using RPM packages. Furthermore, all those parts are integrated to allow easier and quicker releases of the LHCb software stack, therefore reducing the risk of operational errors. Integration and regression tests are also now easier to implement, allowing further improvement of the software checks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borm, B.; Gärtner, F.; Khaghani, D.
2016-09-15
We demonstrate that stacking several imaging plates (IPs) constitutes an easy method to increase hard x-ray detection efficiency. Used to record x-ray radiographic images produced by an intense-laser driven hard x-ray backlighter source, the IP stacks resulted in a significant improvement of the radiograph density resolution. We attribute this to the higher quantum efficiency of the combined detectors, leading to a reduced photon noise. Electron-photon transport simulations of the interaction processes in the detector reproduce the observed contrast improvement. Increasing the detection efficiency to enhance radiographic imaging capabilities is equally effective as increasing the x-ray source yield, e.g., by a larger drive laser energy.
Dynamic provisioning of local and remote compute resources with OpenStack
NASA Astrophysics Data System (ADS)
Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.
2015-12-01
Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
Matoza, Robin S.; Chouet, Bernard A.; Dawson, Phillip B.; Shearer, Peter M.; Haney, Matthew M.; Waite, Gregory P.; Moran, Seth C.; Mikesell, T. Dylan
2015-01-01
Long-period (LP, 0.5-5 Hz) seismicity, observed at volcanoes worldwide, is a recognized signature of unrest and eruption. Cyclic LP "drumbeating" was the characteristic seismicity accompanying the sustained dome-building phase of the 2004–2008 eruption of Mount St. Helens (MSH), WA. However, together with the LP drumbeating was a near-continuous, randomly occurring series of tiny LP seismic events (LP "subevents"), which may hold important additional information on the mechanism of seismogenesis at restless volcanoes. We employ template matching, phase-weighted stacking, and full-waveform inversion to image the source mechanism of one multiplet of these LP subevents at MSH in July 2005. The signal-to-noise ratios of the individual events are too low to produce reliable waveform-inversion results, but the events are repetitive and can be stacked. We apply network-based template matching to 8 days of continuous velocity waveform data from 29 June to 7 July 2005 using a master event to detect 822 network triggers. We stack waveforms for 359 high-quality triggers at each station and component, using a combination of linear and phase-weighted stacking to produce clean stacks for use in waveform inversion. The derived source mechanism points to the volumetric oscillation (~10 m3) of a subhorizontal crack located at shallow depth (~30 m) in an area to the south of Crater Glacier in the southern portion of the breached MSH crater. A possible excitation mechanism is the sudden condensation of metastable steam from a shallow pressurized hydrothermal system as it encounters cool meteoric water in the outer parts of the edifice, perhaps supplied from snow melt.
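A compact sketch of the phase-weighted stacking step, following the instantaneous-phase coherence idea of Schimmel and Paulssen; the synthetic traces below merely stand in for the 359 aligned detections.

    import numpy as np
    from scipy.signal import hilbert

    def phase_weighted_stack(traces, nu=2.0):
        """traces: (n_traces, n_samples) aligned waveforms; returns the PWS trace."""
        analytic = hilbert(traces, axis=1)
        phasors = analytic / np.abs(analytic)             # unit phasors exp(i*phi)
        coherence = np.abs(phasors.mean(axis=0)) ** nu    # 1 = coherent, -> 0 for random phase
        return traces.mean(axis=0) * coherence            # linear stack damped where incoherent

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 500)
    signal = 0.3 * np.sin(2 * np.pi * 8 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)
    traces = signal + rng.standard_normal((359, t.size))  # weak common signal buried in noise
    pws = phase_weighted_stack(traces)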
The Pan-STARRS1 Small Area Survey 2
NASA Astrophysics Data System (ADS)
Metcalfe, N.; Farrow, D. J.; Cole, S.; Draper, P. W.; Norberg, P.; Burgett, W. S.; Chambers, K. C.; Denneau, L.; Flewelling, H.; Kaiser, N.; Kudritzki, R.; Magnier, E. A.; Morgan, J. S.; Price, P. A.; Sweeney, W.; Tonry, J. L.; Wainscoat, R. J.; Waters, C.
2013-11-01
The Panoramic Survey Telescope and Rapid Response System 1 (Pan-STARRS1) survey is acquiring multi-epoch imaging in five bands (gP1, rP1, iP1, zP1, yP1) over the entire sky north of declination -30° (the 3π survey). In 2011 July a test area of about 70 deg2 was observed to the expected final depth of the main survey. In this, the first of a series of papers targeting the galaxy count and clustering properties of the combined multi-epoch test area data, we present a detailed investigation into the depth of the survey and the reliability of the Pan-STARRS1 analysis software. We show that the Pan-STARRS1 reduction software can recover the properties of fake sources, and show good agreement between the magnitudes measured by Pan-STARRS1 and those from Sloan Digital Sky Survey. We also examine the number of false detections apparent in the Pan-STARRS1 data. Our comparisons show that the test area survey is somewhat deeper than the Sloan Digital Sky Survey in all bands, and, in particular, the z band approaches the depth of the stacked Sloan Stripe 82 data.
Managing a tier-2 computer centre with a private cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara
2014-06-01
In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying Private Cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable also for smaller sites. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, that hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.
Magnetotomography—a new method for analysing fuel cell performance and quality
NASA Astrophysics Data System (ADS)
Hauer, Karl-Heinz; Potthast, Roland; Wüster, Thorsten; Stolten, Detlef
Magnetotomography is a new method for the measurement and analysis of the current density distribution of fuel cells. The method is based on the measurement of the magnetic flux surrounding the fuel cell stack caused by the current inside the stack. As it is non-invasive, magnetotomography overcomes the shortcomings of traditional methods for the determination of current density in fuel cells [J. Stumper, S.A. Campell, D.P. Wilkinson, M.C. Johnson, M. Davis, In situ methods for the determination of current distributions in PEM fuel cells, Electrochem. Acta 43 (1998) 3773; S.J.C. Cleghorn, C.R. Derouin, M.S. Wilson, S. Gottesfeld, A printed circuit board approach to measuring current distribution in a fuel cell, J. Appl. Electrochem. 28 (1998) 663; Ch. Wieser, A. Helmbold, E. Gülzow, A new technique for two-dimensional current distribution measurements in electro-chemical cells, J. Appl. Electrochem. 30 (2000) 803; Grinzinger, Methoden zur Ortsaufgelösten Strommessung in Polymer Elektrolyt Brennstoffzellen, Diploma thesis, TU-München, 2003; Y.-G. Yoon, W.-Y. Lee, T.-H. Yang, G.-G. Park, C.-S. Kim, Current distribution in a single cell of PEMFC, J. Power Sources 118 (2003) 193-199; M.M. Mench, C.Y. Wang, An in situ method for determination of current distribution in PEM fuel cells applied to a direct methanol fuel cell, J. Electrochem. Soc. 150 (2003) A79-A85; S. Schönbauer, T. Kaz, H. Sander, E. Gülzow, Segmented bipolar plate for the determination of current distribution in polymer electrolyte fuel cells, in: Proceedings of the Second European PEMFC Forum, vol. 1, Lucerne/Switzerland, 2003, pp. 231-237; G. Bender, S.W. Mahlon, T.A. Zawodzinski, Further refinements in the segmented cell approach to diagnosing performance in polymer electrolyte fuel cells, J. Power Sources 123 (2003) 163-171]. After several years of research a complete prototype system is now available for research on single cells and stacks. This paper describes the basic system (fundamentals, hardware and software) as well as the state of development until December 2003. Initial findings on a full-size single cell will be presented together with an outlook on the planned next steps.
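The forward model underlying magnetotomography is the Biot-Savart law, which maps the (unknown) current distribution inside the stack to the flux density measured outside; the sketch below evaluates that forward map for a discretized current path (a toy straight conductor), which the tomographic reconstruction then has to invert.

    import numpy as np

    MU0 = 4e-7 * np.pi

    def b_field(points, seg_centers, seg_vectors, currents):
        """B = mu0/(4*pi) * sum_i I_i * (dl_i x r_i) / |r_i|^3 over straight segments."""
        B = np.zeros((len(points), 3))
        for c, dl, I in zip(seg_centers, seg_vectors, currents):
            r = points - c
            r3 = np.linalg.norm(r, axis=1, keepdims=True) ** 3
            B += MU0 / (4.0 * np.pi) * I * np.cross(dl, r) / r3
        return B

    # toy "stack current": one straight z-directed wire carrying 100 A, split into segments
    z = np.linspace(-0.1, 0.1, 21)
    centers = np.column_stack([np.zeros(20), np.zeros(20), 0.5 * (z[:-1] + z[1:])])
    vectors = np.column_stack([np.zeros(20), np.zeros(20), np.diff(z)])
    field = b_field(np.array([[0.05, 0.0, 0.0]]), centers, vectors, np.full(20, 100.0))
    print(field)   # of order mu0*I/(2*pi*d) ~ 4e-4 T (slightly less for a finite wire)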
Retrieval of Body-Wave Reflections Using Ambient Noise Interferometry Using a Small-Scale Experiment
NASA Astrophysics Data System (ADS)
Dantas, Odmaksuel Anísio Bezerra; do Nascimento, Aderson Farias; Schimmel, Martin
2018-02-01
We report the retrieval of body-wave reflections from noise records using a small-scale experiment over a mature oil field. The reflections are obtained by cross-correlation and stacking of the data. We used the stacked correlograms to create virtual source-to-receiver common shot gathers and are able to obtain body-wave reflections. Surface waves that obliterate the body waves in our noise correlations were attenuated following a standard procedure from active source seismics. Furthermore, different strategies were employed to cross-correlate and stack the data: classical geometrically normalized cross-correlation (CCGN), phase cross-correlation (PCC), linear stacking and phase-weighted stacking (PWS). PCC and PWS are based on the instantaneous phase coherence of analytic signals. The four approaches are independent and reveal the reflections; nevertheless, the combination of PWS and CCGN provided the best results. Our analysis is based on 2145 cross-correlations of 600 s data segments. We also compare the resulting virtual shot gathers with an active 2D seismic line near the passive experiment. It is shown that our ambient noise analysis reproduces reflections which are present in the active seismic data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solano, M.; Chang, H.; VanDyke, J.
1996-12-31
This paper describes the implementation and results of portable, production-scale 3D pre-stack Kirchhoff depth migration software. Full volume pre-stack imaging was applied to a six million trace (46.9 Gigabyte) data set from a subsalt play in the Garden Banks area in the Gulf of Mexico. The velocity model building and updating were carried out using image depth gathers and an image-driven strategy. After three velocity iterations, depth migrated sections revealed drilling targets that were not visible in the conventional 3D post-stack time migrated data set. As expected from the implementation of the migration algorithm, it was found that amplitudes are well preserved and anomalies associated with known reservoirs conform to petrophysical predictions. Image gathers for velocity analysis and the final depth migrated volume were generated on an 1824 node Intel Paragon at Sandia National Laboratories. The code has been successfully ported to a CRAY T3D and to Unix workstation Parallel Virtual Machine (PVM) environments.
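A toy constant-velocity 2-D diffraction-stack (Kirchhoff) migration of a single shot gather, to illustrate the imaging principle only; the production code described above is a full 3-D pre-stack implementation with travel-time tables, anti-aliasing and amplitude weights that this sketch omits.

    import numpy as np

    def kirchhoff_migrate(data, rec_x, src_x, dt, dz, nz, v):
        """data: (n_rec, n_t) shot gather; rec_x: receiver x positions (m); src_x: source x (m).
        Each image point accumulates the amplitudes at the source-to-point-to-receiver travel time."""
        n_rec, n_t = data.shape
        image = np.zeros((nz, len(rec_x)))
        depths = (np.arange(nz) + 1) * dz
        for ix, x in enumerate(rec_x):                       # image-point lateral positions
            for iz, depth in enumerate(depths):
                t_src = np.hypot(x - src_x, depth) / v       # source-to-image-point time
                t_rec = np.hypot(x - rec_x, depth) / v       # image-point-to-receiver times
                it = np.rint((t_src + t_rec) / dt).astype(int)
                valid = it < n_t
                image[iz, ix] = data[np.arange(n_rec)[valid], it[valid]].sum()
        return image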
[Porting Radiotherapy Software of Varian to Cloud Platform].
Zou, Lian; Zhang, Weisha; Liu, Xiangxiang; Xie, Zhao; Xie, Yaoqin
2017-09-30
To develop a low-cost private cloud platform for radiotherapy software. First, a private cloud platform based on OpenStack and virtual GPU hardware was built. Then, all the Varian radiotherapy software modules were installed on virtual machines on the private cloud platform, and the corresponding function configuration was completed. Finally, the software on the cloud could be accessed through a virtual desktop client. The function test results of the cloud workstation show that a cloud workstation is equivalent to an isolated physical workstation, and any client on the LAN can use the cloud workstation smoothly. The cloud platform transplantation in this study is economical and practical. The project not only improves the utilization rate of radiotherapy software, but also makes it possible for cloud computing technology to expand its applications to the field of radiation oncology.
Dynamic provisioning of a HEP computing infrastructure on a shared hybrid HPC system
NASA Astrophysics Data System (ADS)
Meier, Konrad; Fleig, Georg; Hauth, Thomas; Janczyk, Michael; Quast, Günter; von Suchodoletz, Dirk; Wiebelt, Bernd
2016-10-01
Experiments in high-energy physics (HEP) rely on elaborate hardware, software and computing systems to sustain the high data rates necessary to study rare physics processes. The Institut für Experimentelle Kernphysik (EKP) at KIT is a member of the CMS and Belle II experiments, located at the LHC and the SuperKEKB accelerators, respectively. These detectors share the requirement that enormous amounts of measurement data must be processed and analyzed, and that a comparable amount of simulated events is required to compare experimental results with theoretical predictions. Classical HEP computing centers are dedicated sites which support multiple experiments and have the required software pre-installed. Nowadays, funding agencies encourage research groups to participate in shared HPC cluster models, where scientists from different domains use the same hardware to increase synergies. This shared usage proves to be challenging for HEP groups, due to their specialized software setup which includes a custom OS (often Scientific Linux), libraries and applications. To overcome this hurdle, the EKP and the data center team of the University of Freiburg have developed a system to enable the HEP use case on a shared HPC cluster. To achieve this, an OpenStack-based virtualization layer is installed on top of a bare-metal cluster. While other user groups can run their batch jobs via the Moab workload manager directly on bare metal, HEP users can request virtual machines with a specialized machine image which contains a dedicated operating system and software stack. In contrast to similar installations, no static partitioning of the cluster into a physical and a virtualized segment is required in this hybrid setup. As a unique feature, the placement of the virtual machines on the cluster nodes is scheduled by Moab, and the job lifetime is coupled to the lifetime of the virtual machine. This allows for a seamless integration with the jobs sent by other user groups and honors the fair-share policies of the cluster. The developed thin integration layer between OpenStack and Moab can be adapted to other batch servers and virtualization systems, making the concept also applicable to other cluster operators. This contribution reports on the concept and implementation of an OpenStack-virtualized cluster used for HEP workflows. While the full cluster will be installed in spring 2016, a test-bed setup with 800 cores has been used to study the overall system performance, and dedicated HEP jobs were run in a virtualized environment over many weeks. Furthermore, the dynamic integration of the virtualized worker nodes, depending on the workload at the institute's computing system, will be described.
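As an illustration of the user-facing side of such a setup, the following is a minimal sketch of requesting a virtualized worker node through the Python openstacksdk client. The cloud name, image, flavor, and network names are hypothetical placeholders, and the Moab-side scheduling and lifetime coupling described above are not reproduced here.

```python
# Minimal sketch: request a HEP worker VM from an OpenStack layer using
# openstacksdk. All resource names below are hypothetical placeholders.
import openstack

conn = openstack.connect(cloud="hpc-openstack")        # credentials from clouds.yaml

image = conn.compute.find_image("sl6-hep-worker")      # dedicated OS + software stack
flavor = conn.compute.find_flavor("m1.xlarge")
network = conn.network.find_network("hep-private")

server = conn.compute.create_server(
    name="hep-worker-001",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)          # block until ACTIVE
print(server.name, server.status)
```

In the setup described in the abstract, a call of this kind would be issued on behalf of the user by the thin OpenStack/Moab integration layer rather than interactively.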
NASA Astrophysics Data System (ADS)
Lutz, Yves; Poyet, Jean-Michel; Metzger, Nicolas
2013-10-01
Laser diode stacks are interesting laser sources for active-imaging illuminators. They allow the accumulation of large amounts of energy in multi-pulse mode, which is well suited for long-range image recording. Even when laser diode stacks are equipped with fast-axis collimation (FAC) and slow-axis collimation (SAC) microlenses, their beam parameter product (BPP) is not compatible with direct use in highly efficient and compact illuminators. This is particularly true when narrow divergences are required, as for long-range applications. To overcome these difficulties, we conducted investigations along three different routes. A first near-infrared illuminator based on conductively cooled mini-bars was designed, realized, and successfully tested during outdoor experiments. This custom-specified stack was then replaced in a second step by an off-the-shelf FAC + SAC micro-lensed stack whose brightness was increased by polarization overlapping. The third method, also based on a commercial laser diode stack, uses a non-imaging optical shaping principle resulting in a virtually restacked laser source with enhanced beam parameters. This low-cost, efficient beam shaping method with low alignment sensitivity yields a compact, high-performance laser diode illuminator for long-range active-imaging applications. The three methods are presented and compared in this paper.
Leske, David A; Hatt, Sarah R; Liebermann, Laura; Holmes, Jonathan M
2016-02-01
We compare two methods of analysis for Rasch scoring pre- to postintervention data: Rasch lookup table versus de novo stacked Rasch analysis using the Adult Strabismus-20 (AS-20). One hundred forty-seven subjects completed the AS-20 questionnaire prior to surgery and 6 weeks postoperatively. Subjects were classified 6 weeks postoperatively as "success," "partial success," or "failure" based on angle and diplopia status. Postoperative change in AS-20 scores was compared for all four AS-20 domains (self-perception, interactions, reading function, and general function) overall and by success status using two methods: (1) applying historical Rasch threshold measures from lookup tables and (2) performing a stacked de novo Rasch analysis. Change was assessed by analyzing effect size, improvement exceeding 95% limits of agreement (LOA), and score distributions. Effect sizes were similar for all AS-20 domains whether obtained from lookup tables or stacked analysis. Similar proportions exceeded 95% LOAs using lookup tables versus stacked analysis. Improvement in median score was observed for all AS-20 domains using lookup tables and stacked analysis (P < 0.0001 for all comparisons). The Rasch-scored AS-20 is a responsive and valid instrument designed to measure strabismus-specific health-related quality of life. When analyzing pre- to postoperative change in AS-20 scores, Rasch lookup tables and de novo stacked Rasch analysis yield essentially the same results. We describe a practical application of lookup tables, allowing the clinician or researcher to score the Rasch-calibrated AS-20 questionnaire without specialized software.
Leske, David A.; Hatt, Sarah R.; Liebermann, Laura; Holmes, Jonathan M.
2016-01-01
Purpose We compare two methods of analysis for Rasch scoring pre- to postintervention data: Rasch lookup table versus de novo stacked Rasch analysis using the Adult Strabismus-20 (AS-20). Methods One hundred forty-seven subjects completed the AS-20 questionnaire prior to surgery and 6 weeks postoperatively. Subjects were classified 6 weeks postoperatively as “success,” “partial success,” or “failure” based on angle and diplopia status. Postoperative change in AS-20 scores was compared for all four AS-20 domains (self-perception, interactions, reading function, and general function) overall and by success status using two methods: (1) applying historical Rasch threshold measures from lookup tables and (2) performing a stacked de novo Rasch analysis. Change was assessed by analyzing effect size, improvement exceeding 95% limits of agreement (LOA), and score distributions. Results Effect sizes were similar for all AS-20 domains whether obtained from lookup tables or stacked analysis. Similar proportions exceeded 95% LOAs using lookup tables versus stacked analysis. Improvement in median score was observed for all AS-20 domains using lookup tables and stacked analysis (P < 0.0001 for all comparisons). Conclusions The Rasch-scored AS-20 is a responsive and valid instrument designed to measure strabismus-specific health-related quality of life. When analyzing pre- to postoperative change in AS-20 scores, Rasch lookup tables and de novo stacked Rasch analysis yield essentially the same results. Translational Relevance We describe a practical application of lookup tables, allowing the clinician or researcher to score the Rasch-calibrated AS-20 questionnaire without specialized software. PMID:26933524
Mertens, Jan E.J.; Roie, Martijn Van; Merckx, Jonas; Dekoninck, Wouter
2017-01-01
Digitization of specimen collections has become a key priority of many natural history museums. The camera systems built for this purpose are expensive, providing a barrier in institutes with limited funding and therefore hampering progress. An assessment is made of whether a low-cost compact camera with image stacking functionality can help expedite the digitization process in large museums or provide smaller institutes and amateur entomologists with the means to digitize their collections. Images from a professional setup were compared with those from the Olympus Stylus TG-4 Tough, a low-cost compact camera with internal focus stacking functions. Parameters considered include image quality, digitization speed, price, and ease of use. The compact camera's image quality, although inferior to the professional setup, is exceptional considering its fourfold lower price point. Producing the image slices in the compact camera is a matter of seconds, and when optimal image quality is less of a priority, the internal stacking function omits the need for dedicated stacking software altogether, further decreasing the cost and speeding up the process. In general, it is found that, aware of its limitations, this compact camera is capable of digitizing entomological collections with sufficient quality. As technology advances, more institutes and amateur entomologists will be able to easily and affordably catalogue their specimens. PMID:29134038
40 CFR 63.9882 - What parts of my plant does this subpart cover?
Code of Federal Regulations, 2010 CFR
2010-07-01
... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...
40 CFR 63.9882 - What parts of my plant does this subpart cover?
Code of Federal Regulations, 2012 CFR
2012-07-01
... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...
40 CFR 63.9882 - What parts of my plant does this subpart cover?
Code of Federal Regulations, 2014 CFR
2014-07-01
... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...
40 CFR 63.9882 - What parts of my plant does this subpart cover?
Code of Federal Regulations, 2011 CFR
2011-07-01
... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...
40 CFR 63.9882 - What parts of my plant does this subpart cover?
Code of Federal Regulations, 2013 CFR
2013-07-01
... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...
Electrical Generation for More-Electric Aircraft Using Solid Oxide Fuel Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whyatt, Greg A.; Chick, Lawrence A.
This report examines the potential for Solid Oxide Fuel Cells (SOFC) to provide electrical generation on board commercial aircraft. Unlike a turbine-based auxiliary power unit (APU), a solid oxide fuel cell power unit (SOFCPU) would be more efficient than using the main engine generators to generate electricity and would operate continuously during flight. The focus of this study is on more-electric aircraft, which minimize bleed air extraction from the engines and instead use electrical power obtained from generators driven by the main engines to satisfy all major loads. The increased electrical generation increases the potential fuel savings obtainable through more efficient electrical generation using a SOFCPU. However, the weight added to the aircraft by the SOFCPU impacts the main engine fuel consumption, which reduces the potential fuel savings. To investigate these relationships the Boeing 787-8 was used as a case study. The potential performance of the SOFCPU was determined by coupling flowsheet modeling using ChemCAD software with a stack performance algorithm. For a given stack operating condition (cell voltage, anode utilization, stack pressure, target cell exit temperature), ChemCAD software was used to determine the cathode air rate to provide stack thermal balance, the heat exchanger duties, the gross power output for a given fuel rate, the parasitic power for the anode recycle blower, and the net power obtained from (or required by) the compressor/expander. The SOFC is based on the Gen4 Delphi planar SOFC with assumed modifications to tailor it to this application. The size of the stack needed to satisfy the specified condition was assessed using an empirically based algorithm. The algorithm predicts stack power density based on the pressure, inlet temperature, cell voltage, and anode and cathode inlet flows and compositions. The algorithm was developed by enhancing a model for a well-established material set operating at atmospheric pressure to reflect the effect of elevated pressure and to represent the expected enhancement obtained using a promising cell material set which has been tested in button cells but not yet used to produce full-scale stacks. The predictions for the effect of pressure on stack performance were based on literature. As part of this study, additional data were obtained on button cells at elevated pressure to confirm the validity of the predictions. The impact of adding weight to the 787-8 fuel consumption was determined as a function of flight distance using a PianoX model. A conceptual design for a SOFC power system for the Boeing 787 is developed and the weight estimated. The results indicate that the power density of the stacks must increase by at least a factor of 2 to begin saving fuel on the 787 aircraft. However, the conceptual design of the power system may still be useful for other applications which are less weight sensitive.
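The report's empirical power-density algorithm is not reproduced in the abstract; the sketch below only illustrates, with invented placeholder numbers, how an assumed stack power density and specific mass translate into the active area and mass needed for a target electrical load, and why doubling the power density roughly halves the added mass.

```python
# Back-of-the-envelope sizing loop, NOT the report's algorithm.
# All numbers are placeholders chosen for demonstration only.
def required_stack(net_power_kw, power_density_w_cm2, specific_mass_kg_m2):
    area_cm2 = net_power_kw * 1e3 / power_density_w_cm2   # total active cell area
    area_m2 = area_cm2 / 1e4
    mass_kg = area_m2 * specific_mass_kg_m2               # cells + interconnects (assumed)
    return area_m2, mass_kg

for pd in (0.4, 0.8):  # doubling the power density halves area and mass
    area, mass = required_stack(net_power_kw=450.0,
                                power_density_w_cm2=pd,
                                specific_mass_kg_m2=25.0)
    print(f"{pd:.1f} W/cm^2 -> {area:.1f} m^2 active area, {mass:.0f} kg")
```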
Investigating the Application of Moving Target Defenses to Network Security
2013-08-01
developing an MTD testbed using OpenStack [14] to show that our MTD design can actually work. Building an MTD system in a cloud infrastructure will be...Information Intelligence Research. New York, USA: ACM, 2013. [14] OpenStack, "OpenStack: The Folsom release," http://www.openstack.org/software
5th Annual Earth System Grid Federation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
The purpose of the Fifth Annual Earth System Grid Federation (ESGF) Face-to-Face (F2F) Conference was to present the most recent information on the state of ESGF’s software stack and to identify and address the data needs and gaps for the climate and weather communities that ESGF supports.
40 CFR 98.173 - Calculating GHG emissions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... associated requirements for Tier 4 in subpart C of this part (General Stationary Fuel Combustion Sources). (b... basis (% CO2). Q = Hourly stack gas volumetric flow rate (scfh). %H2O = Hourly moisture percentage in... vented through the same stack as any combustion unit or process equipment that reports CO2 emissions...
40 CFR 98.173 - Calculating GHG emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... associated requirements for Tier 4 in subpart C of this part (General Stationary Fuel Combustion Sources). (b..., dry basis (% CO2). Q = Hourly stack gas volumetric flow rate (scfh). %H2O = Hourly moisture percentage... reduction furnace are vented through the same stack as any combustion unit or process equipment that reports...
40 CFR 98.173 - Calculating GHG emissions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... associated requirements for Tier 4 in subpart C of this part (General Stationary Fuel Combustion Sources). (b... basis (% CO2). Q = Hourly stack gas volumetric flow rate (scfh). %H2O = Hourly moisture percentage in... vented through the same stack as any combustion unit or process equipment that reports CO2 emissions...
Near Source Modeling: Building Downwash and Roadside Barriers
Knowing the fate of effluent from an industrial stack is important for assessing its impact on human health. AERMOD is one of several Gaussian plume models containing algorithms to evaluate the effect of buildings on the movement of the effluent from a stack. The goal of this s...
40 CFR 63.11466 - What are the performance test requirements for new and existing sources?
Code of Federal Regulations, 2010 CFR
2010-07-01
... (Appendix A-1) to select sampling port locations and the number of traverse points in each stack or duct... of the stack gas. (iii) Method 3, 3A, or 3B (Appendix A-2) to determine the dry molecular weight of...
A New, Scalable and Low Cost Multi-Channel Monitoring System for Polymer Electrolyte Fuel Cells.
Calderón, Antonio José; González, Isaías; Calderón, Manuel; Segura, Francisca; Andújar, José Manuel
2016-03-09
In this work a new, scalable, and low-cost multi-channel monitoring system for Polymer Electrolyte Fuel Cells (PEFCs) has been designed, constructed, and experimentally validated. The developed monitoring system performs non-intrusive voltage measurement of each individual cell of a PEFC stack and is scalable in the sense that it is capable of carrying out measurements in stacks from 1 to 120 cells (from watts to kilowatts). The developed system comprises two main subsystems: hardware devoted to data acquisition (DAQ) and software devoted to real-time monitoring. The DAQ subsystem is based on the low-cost open-source platform Arduino, and the real-time monitoring subsystem has been developed using the high-level graphical language NI LabVIEW. Such an integration can be considered a novelty in the scientific literature on PEFC monitoring systems. An original amplifying and multiplexing board has been designed to increase the number of available Arduino input ports. Data storage and real-time monitoring are performed through an easy-to-use interface. Graphical and numerical visualization allows continuous tracking of cell voltages. Scalability, flexibility, ease of use, versatility, and low cost are the main features of the proposed approach. The system is described and experimental results are presented. These results demonstrate its suitability for monitoring the voltage of a PEFC at cell level.
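The authors' monitoring front end is built in NI LabVIEW; as a rough analogue, the following Python sketch shows a host-side reader for an Arduino-based cell-voltage DAQ using pyserial. The serial port, baud rate, cell count, and the comma-separated frame format are assumptions for illustration, not the published protocol.

```python
# Host-side reader for an Arduino-based cell-voltage DAQ (analogous sketch).
# Port, baud rate and the "v1,v2,...,vN\n" frame format are assumptions.
import serial

PORT, BAUD, N_CELLS = "/dev/ttyACM0", 115200, 48

with serial.Serial(PORT, BAUD, timeout=2.0) as ser:
    for _ in range(10):                                   # read ten frames
        line = ser.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        volts = [float(v) for v in line.split(",")][:N_CELLS]
        low = min(volts) if volts else float("nan")
        print(f"{len(volts)} cells, min cell voltage = {low:.3f} V")
```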
SANSparallel: interactive homology search against Uniprot.
Somervuo, Panu; Holm, Liisa
2015-07-01
Proteins evolve by mutations and natural selection. The network of sequence similarities is a rich source for mining homologous relationships that inform on protein structure and function. There are many servers available to browse the network of homology relationships but one has to wait up to a minute for results. The SANSparallel webserver provides protein sequence database searches with immediate response and professional alignment visualization by third-party software. The output is a list, pairwise alignment or stacked alignment of sequence-similar proteins from Uniprot, UniRef90/50, Swissprot or Protein Data Bank. The stacked alignments are viewed in Jalview or as sequence logos. The database search uses the suffix array neighborhood search (SANS) method, which has been re-implemented as a client-server, improved and parallelized. The method is extremely fast and as sensitive as BLAST above 50% sequence identity. Benchmarks show that the method is highly competitive compared to previously published fast database search programs: UBLAST, DIAMOND, LAST, LAMBDA, RAPSEARCH2 and BLAT. The web server can be accessed interactively or programmatically at http://ekhidna2.biocenter.helsinki.fi/cgi-bin/sans/sans.cgi. It can be used to make protein functional annotation pipelines more efficient, and it is useful in interactive exploration of the detailed evidence supporting the annotation of particular proteins of interest. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
A New, Scalable and Low Cost Multi-Channel Monitoring System for Polymer Electrolyte Fuel Cells
Calderón, Antonio José; González, Isaías; Calderón, Manuel; Segura, Francisca; Andújar, José Manuel
2016-01-01
In this work a new, scalable, and low-cost multi-channel monitoring system for Polymer Electrolyte Fuel Cells (PEFCs) has been designed, constructed, and experimentally validated. The developed monitoring system performs non-intrusive voltage measurement of each individual cell of a PEFC stack and is scalable in the sense that it is capable of carrying out measurements in stacks from 1 to 120 cells (from watts to kilowatts). The developed system comprises two main subsystems: hardware devoted to data acquisition (DAQ) and software devoted to real-time monitoring. The DAQ subsystem is based on the low-cost open-source platform Arduino, and the real-time monitoring subsystem has been developed using the high-level graphical language NI LabVIEW. Such an integration can be considered a novelty in the scientific literature on PEFC monitoring systems. An original amplifying and multiplexing board has been designed to increase the number of available Arduino input ports. Data storage and real-time monitoring are performed through an easy-to-use interface. Graphical and numerical visualization allows continuous tracking of cell voltages. Scalability, flexibility, ease of use, versatility, and low cost are the main features of the proposed approach. The system is described and experimental results are presented. These results demonstrate its suitability for monitoring the voltage of a PEFC at cell level. PMID:27005630
Natural gas availability and ambient air quality in the Baton Rouge/New Orleans industrial complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fieler, E.R.; Harrison, D.P.
1978-02-26
Three scenarios were modeled for the Baton Rouge/New Orleans area for 1985: one assumes the substitution of residual oil (0.7% sulfur) for gas to decrease gas-burning stationary sources from 80 to 8% and the use of properly designed stacks for large emitters; the second makes identical gas supply assumptions but adds proper stack dispersion for medium as well as large emitters; and the third is based on 16% gas-burning stationary sources. The Climatological Dispersion Model was used to translate (1974) emission rates into ambient air concentrations. Growth rates for residential, commercial, and transportation sources, but not industry, were considered. The results show that proper policies, which would require not only tall stacks for large oil-burning units (and for intermediate units also in areas of high industrial concentration), but also the careful location of new plants, would permit continued industrial expansion without severe air pollution problems.
Joint Inversion of Source Location and Source Mechanism of Induced Microseismics
NASA Astrophysics Data System (ADS)
Liang, C.
2014-12-01
Seismic source mechanism is a useful property that constrains the source physics and the stress and strain distribution at regional, local, and micro scales. In this study we jointly invert source mechanisms and locations for microseismic events induced by fluid fracturing treatments in the oil and gas industry. For events that are large enough to show clear waveforms, quite a few techniques can be applied to invert the source mechanism, including waveform inversion, first-polarity inversion, and many other methods and variants based on them. However, for events that are too small to identify in seismic traces, such as the microseismic events induced by fluid fracturing in the oil and gas industry, a source scanning algorithm (SSA) with waveform stacking is usually applied. At the same time, a joint inversion of location and source mechanism is possible, but at the cost of a high computational budget. The algorithm is thereby called the Source Location and Mechanism Scanning Algorithm (SLMSA). In this case, for a given velocity structure, all possible combinations of source locations (X, Y, and Z) and source mechanisms (strike, dip, and rake) are used to compute travel times and waveform polarities. After correcting normal-moveout times and polarities and stacking all waveforms, the (X, Y, Z, strike, dip, rake) combination that gives the strongest stacked waveform is identified as the solution. To address the high computational cost, CPU-GPU programming is applied. Numerical datasets are used to test the algorithm. The SLMSA has also been applied to a fluid fracturing dataset and reveals several advantages over the location-only method: (1) for shear sources, the location-only program can hardly locate events because positively and negatively polarized traces cancel out, but the SLMSA method can successfully pick up those events; (2) microseismic locations alone may not be enough to indicate the directionality of micro-fractures, and the statistics of source mechanisms can certainly provide more knowledge of the orientation of fractures; (3) in our practice, the joint inversion method almost always yields more events than the location-only method, and for those events that are also picked by the SSA method, the stacking power of the SLMSA is always higher than that obtained with the SSA.
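A minimal NumPy sketch of the scanning idea described above is given below: for each candidate (location, mechanism) combination, traces are shifted by predicted travel times, flipped by predicted polarities, stacked, and scored by peak stack amplitude. The travel-time and polarity predictor is left as a user-supplied stub, and no attempt is made to reproduce the authors' GPU implementation.

```python
import numpy as np

def stack_power(traces, dt, tshift, polarity):
    """Shift each trace by its predicted travel time, flip it by its
    predicted polarity, stack, and return the peak absolute amplitude."""
    n, m = traces.shape
    out = np.zeros(m)
    for k in range(n):
        s = int(round(tshift[k] / dt))
        out += polarity[k] * np.roll(traces[k], -s)   # align trace k to origin time
    return np.max(np.abs(out)) / n

def slmsa_scan(traces, dt, candidates, predict):
    """Brute-force location-and-mechanism scan (SLMSA-style sketch).
    `candidates` iterates over (x, y, z, strike, dip, rake) tuples and
    `predict(c)` must return (travel_times, polarities) per receiver."""
    best, best_power = None, -np.inf
    for c in candidates:
        tshift, pol = predict(c)
        p = stack_power(traces, dt, tshift, pol)
        if p > best_power:
            best, best_power = c, p
    return best, best_power
```

The cost grows with the product of the location and mechanism grids, which is why the authors offload this scan to GPUs.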
GPR data processing computer software for the PC
Lucius, Jeffrey E.; Powers, Michael H.
2002-01-01
The computer software described in this report is designed for processing ground penetrating radar (GPR) data on Intel-compatible personal computers running the MS-DOS operating system or MS Windows 3.x/95/98/ME/2000. The earliest versions of these programs were written starting in 1990. At that time, commercially available GPR software did not meet the processing and display requirements of the USGS. Over the years, the programs were refined and new features and programs were added. The collection of computer programs presented here can perform all basic processing of GPR data, including velocity analysis and generation of CMP stacked sections and data volumes, as well as create publication quality data images.
X-ray obscured AGN in the GOODS-N
NASA Astrophysics Data System (ADS)
Georgantopoulos, I.; Akylas, A.; Rovilos, E.; Xilouris, M.
2010-07-01
We explore the X-ray properties of the Dust Obscured Galaxies (DOGs), i.e. sources with f24μm/fR > 1000. This population has been proposed to contain a significant fraction of Compton-thick sources at high redshift. In particular we study the X-ray spectra of the 14 DOGs detected in the CDFN 2 Ms exposure. Their stacked spectrum is flat, with Γ = 1 ± 0.1, very similar to the stacked spectrum of the undetected DOGs (Γ = 0.8 ± 0.2). However, many of our X-ray detected DOGs present only moderate absorption with column densities 1022
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Brian W.; Hemmert, K. Scott; Underwood, Keith Douglas
Achieving the next three orders of magnitude performance increase to move from petascale to exascale computing will require significant advancements in several fundamental areas. Recent studies have outlined many of the challenges in hardware and software that will need to be addressed. In this paper, we examine these challenges with respect to high-performance networking. We describe the repercussions of anticipated changes to computing and networking hardware and discuss the impact that alternative parallel programming models will have on the network software stack. We also present some ideas on possible approaches that address some of these challenges.
Computer software configuration description, 241-AY and 241-AZ tank farm MICON automation system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winkelman, W.D.
This document describes the configuration process, choices and conventions used during the configuration activities, and issues involved in making changes to the configuration. Includes the master listings of the Tag definitions, which should be revised to authorize any changes. Revision 2 incorporates minor changes to ensure the document setpoints accurately reflect limits (including exhaust stack flow of 800 scfm) established in OSD-T-151-00019. The MICON DCS software controls and monitors the instrumentation and equipment associated with plant systems and processes.
NASA Astrophysics Data System (ADS)
Walton, James S.; Hodgson, Peter; Hallamasek, Karen; Palmer, Jake
2003-07-01
4DVideo is creating a general-purpose capability for capturing and analyzing kinematic data from video sequences in near real-time. The core element of this capability is a software package designed for the PC platform. The software ("4DCapture") is designed to capture and manipulate customized AVI files that can contain a variety of synchronized data streams -- including audio, video, centroid locations -- and signals acquired from more traditional sources (such as accelerometers and strain gauges). The code includes simultaneous capture or playback of multiple video streams, and linear editing of the images (together with the ancillary data embedded in the files). Corresponding landmarks seen from two or more views are matched automatically, and photogrammetric algorithms permit multiple landmarks to be tracked in two and three dimensions -- with or without lens calibrations. Trajectory data can be processed within the main application, or they can be exported to a spreadsheet where they can be processed or passed along to a more sophisticated, stand-alone data analysis application. Previous attempts to develop such applications for high-speed imaging have been limited in their scope or by the complexity of the application itself. 4DVideo has devised a friendly ("FlowStack") user interface that assists the end user in capturing and treating image sequences in a natural progression. 4DCapture employs the AVI 2.0 standard and DirectX technology, which effectively eliminates the file size limitations found in older applications. In early tests, 4DVideo has streamed three RS-170 video sources to disk for more than an hour without loss of data. At this time, the software can acquire video sequences in three ways: (1) directly, from up to three hard-wired cameras supplying RS-170 (monochrome) signals; (2) directly, from a single camera or video recorder supplying an NTSC (color) signal; and (3) by importing existing video streams in the AVI 1.0 or AVI 2.0 formats. The latter is particularly useful for high-speed applications where the raw images are often captured and stored by the camera before being downloaded. Provision has been made to synchronize data acquired from any combination of these video sources using audio and visual "tags." Additional "front ends," designed for digital cameras, are anticipated.
Cloud flexibility using DIRAC interware
NASA Astrophysics Data System (ADS)
Fernandez Albor, Víctor; Seco Miguelez, Marcos; Fernandez Pena, Tomas; Mendez Muñoz, Victor; Saborido Silva, Juan Jose; Graciani Diaz, Ricardo
2014-06-01
Communities at different locations are running their computing jobs on dedicated infrastructures without the need to worry about software, hardware, or even the site where their programs are going to be executed. Nevertheless, this usually implies that they are restricted to certain types or versions of an operating system, because either their software needs a specific version of a system library or a specific platform is required by the collaboration to which they belong. In this scenario, if a data center wants to serve software to incompatible communities, it has to split its physical resources among those communities. This splitting inevitably leads to an underuse of resources, because the data centers are bound to have periods where one or more of their subclusters are idle. It is in this situation that cloud computing provides the flexibility and reduction in computational cost that data centers are searching for. This paper describes a set of realistic tests that we ran on one such implementation. The tests comprise software from three different HEP communities (Auger, LHCb and QCD phenomenologists) and the Parsec Benchmark Suite running on one or more of three Linux flavors (SL5, Ubuntu 10.04 and Fedora 13). The implemented infrastructure has, at the cloud level, CloudStack, which manages the virtual machines (VMs) and the hosts on which they run, and, at the user level, the DIRAC framework along with a VM extension that submits, monitors, and keeps track of the user jobs and also requests CloudStack to start or stop the necessary VMs. In this infrastructure, the community software is distributed via CernVM-FS, which has been proven to be a reliable and scalable software distribution system. With the resulting infrastructure, users are allowed to send their jobs transparently to the data center. The main purpose of this system is the creation of a flexible, multiplatform cluster with a scalable method of software distribution for several VOs. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user.
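For context, job submission through the DIRAC client API, as a user of such an infrastructure would see it, looks roughly like the sketch below; the payload script name, job name, and CPU-time limit are placeholders, and the CloudStack-backed VM provisioning described above happens behind the scenes and is not shown.

```python
# Minimal DIRAC client-side job submission sketch. The framework decides
# where the job runs; in the setup above it requests VMs from CloudStack
# transparently. Script name and CPU time are placeholders.
from DIRAC.Core.Base.Script import parseCommandLine
parseCommandLine()

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName("parsec-benchmark-test")
job.setExecutable("run_parsec.sh")        # hypothetical payload script
job.setCPUTime(3600)

result = Dirac().submitJob(job)
print("Submission result:", result)
```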
40 CFR 98.173 - Calculating GHG emissions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... associated requirements for Tier 4 in subpart C of this part (General Stationary Fuel Combustion Sources). (b... basis (% CO2). Q = Hourly stack gas volumetric flow rate (scfh). %H2O = Hourly moisture percentage in... furnace are vented through the same stack as any combustion unit or process equipment that reports CO2...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kent Simmons, J.A.; Knap, A.H.
1991-04-01
The computer model Industrial Source Complex Short Term (ISCST) was used to study the stack emissions from a refuse incinerator proposed for the island of Bermuda. The model predicts that the highest ground-level pollutant concentrations will occur near Prospect, 800 m to 1,000 m due south of the stack. The authors installed a portable laboratory and instruments at Prospect to begin making baseline air quality measurements. By comparing the model's estimates of the incinerator contribution with the background levels measured at the site, they predicted that stack emissions would not cause an increase in TSP or SO2. The incinerator will be a significant source of HCl to Bermuda's air, with ambient levels approaching air quality guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pan, Bo; Shibutani, Yoji, E-mail: sibutani@mech.eng.osaka-u.ac.jp; Zhang, Xu
2015-07-07
Recent research has explained that the steeply increasing yield strength in metals depends on decreasing sample size. In this work, we derive a statistical physical model of the yield strength of finite single-crystal micro-pillars that depends on single-ended dislocation pile-up inside the micro-pillars. We show that this size effect can be explained almost completely by considering the stochastic lengths of the dislocation source and the dislocation pile-up length in the single-crystal micro-pillars. The Hall–Petch-type relation holds even in a microscale single crystal, which is characterized by its dislocation source lengths. Our quantitative conclusions suggest that the number of dislocation sources and pile-ups are significant factors for the size effect. They also indicate that starvation of dislocation sources is another reason for the size effect. Moreover, we investigated the explicit relationship between the stacking fault energy and the dislocation "pile-up" effect inside the sample: materials with low stacking fault energy exhibit an obvious dislocation pile-up effect. Our proposed physical model predicts a sample strength that agrees well with experimental data, and our model can give a more precise prediction than the current single-arm source model, especially for materials with low stacking fault energy.
Integration of XRootD into the cloud infrastructure for ALICE data analysis
NASA Astrophysics Data System (ADS)
Kompaniets, Mikhail; Shadura, Oksana; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey
2015-12-01
Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, the cloud makes it possible to deploy software using the CERN Virtual Machine (CernVM) and the CernVM File System (CVMFS), to run different (including outdated) versions of software for long-term data preservation, and to dynamically allocate resources for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible usage for local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD-based storage element (SE). One of the key features of the solution is that Ceph is used as a backend for the Cinder Block Storage service of OpenStack and, at the same time, as a storage backend for XRootD, with redundancy and availability of data preserved by the Ceph settings. For faster and easier OpenStack deployment, the Packstack solution, which is based on the Puppet configuration management system, was applied. Ceph installation and configuration operations are structured, converted to Puppet manifests describing node configurations, and integrated into Packstack. This solution can be easily deployed, maintained, and used even by small groups with limited computing resources and small organizations, which often lack IT support. The proposed infrastructure has been tested on two different clouds (SPbSU & BITP) and integrates successfully with the ALICE data analysis model.
A Tour of the Stacks--HyperCard for Libraries.
ERIC Educational Resources Information Center
Ertel, Monica; Oros, Jane
1989-01-01
Description of HyperCard, a software package that runs on Macintosh microcomputers, focuses on its use in the Apple Computer, Inc., Library as a user guide to the library. Examples of screen displays are given, and a list of resources is included to help use and understand HyperCard more completely. (LRW)
Generic Fortran Containers (GFC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liakh, Dmitry
2016-09-01
The Fortran language does not provide a standard library that implements generic containers, like linked lists, trees, dictionaries, etc. The GFC software provides an implementation of generic Fortran containers natively written in Fortran 2003/2008 language. The following containers are either already implemented or planned: Stack (done), Linked list (done), Tree (done), Dictionary (done), Queue (planned), Priority queue (planned).
Monte Carlo Simulation to Estimate Likelihood of Direct Lightning Strikes
NASA Technical Reports Server (NTRS)
Mata, Carlos; Medelius, Pedro
2008-01-01
A software tool has been designed to quantify the lightning exposure of the launch vehicle stack at the pads under different configurations. In order to predict lightning strikes to generic structures, this model uses leaders whose origins (in the x-y plane) are obtained from a 2D random, normal distribution.
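A toy Monte Carlo in the same spirit is sketched below: leader origins are sampled from a 2D normal distribution in the x-y plane, and a strike is counted when an origin falls within an assumed attachment radius of the stack. The spread, radius, and trial count are invented for illustration and are not the values used by the NASA tool.

```python
# Toy Monte Carlo estimate of direct-strike likelihood (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000
sigma_m = 500.0            # spread of leader origins in x-y (assumed)
attach_radius_m = 60.0     # effective attachment radius of the stack (assumed)

xy = rng.normal(0.0, sigma_m, size=(n_trials, 2))       # sampled leader origins
hits = np.hypot(xy[:, 0], xy[:, 1]) < attach_radius_m    # origins close enough to attach
print(f"Estimated strike probability per leader: {hits.mean():.4f}")
```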
Traileka Glacier X-Stack. Final Scientific/Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borkar, Shekhar
2015-09-01
The XStack Traleika Glacier (XSTG) project was a three-year research award for exploring a revolutionary exascale-class machine software framework. The XSTG program, including Intel, UC San Diego, Pacific Northwest National Lab, UIUC, Rice University, Reservoir Labs, ET International, and U. Delaware, had major accomplishments, insights, and products resulting from this three-year effort.
The Emergence of Open-Source Software in China
ERIC Educational Resources Information Center
Pan, Guohua; Bonk, Curtis J.
2007-01-01
The open-source software movement is gaining increasing momentum in China. Of the limited number of open-source software products in China, "Red Flag Linux" stands out most strikingly, commanding a 30 percent share of the Chinese software market. Unlike the spontaneity of the open-source movement in North America, open-source software development in…
Integrating the Apache Big Data Stack with HPC for Big Data
NASA Astrophysics Data System (ADS)
Fox, G. C.; Qiu, J.; Jha, S.
2014-12-01
There is perhaps a broad consensus as to important issues in practical parallel computing as applied to large-scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers, and best practice for application development. However, the same is not so true for data-intensive computing, even though commercial clouds devote much more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify characteristics of data-intensive applications and to deduce needed runtimes and architectures. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC with ABDS, the Apache Big Data Stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL -- Scalable Parallel Interoperable Data Analytics Library -- built on system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas including Polar Science.
Real-Time Simulation of Ares I Launch Vehicle
NASA Technical Reports Server (NTRS)
Tobbe, Patrick; Matras, Alex; Wilson, Heath; Alday, Nathan; Walker, David; Betts, Kevin; Hughes, Ryan; Turbe, Michael
2009-01-01
The Ares Real-Time Environment for Modeling, Integration, and Simulation (ARTEMIS) has been developed for use by the Ares I launch vehicle System Integration Laboratory (SIL) at the Marshall Space Flight Center (MSFC). The primary purpose of the Ares SIL is to test the vehicle avionics hardware and software in a hardware-in-the-loop (HWIL) environment to certify that the integrated system is prepared for flight. ARTEMIS has been designed to be the real-time software backbone to stimulate all required Ares components through high-fidelity simulation. ARTEMIS has been designed to take full advantage of the advances in underlying computational power now available to support HWIL testing. A modular real-time design relying on a fully distributed computing architecture has been achieved. Two fundamental requirements drove ARTEMIS to pursue the use of high-fidelity simulation models in a real-time environment. First, ARTEMIS must be used to test a man-rated integrated avionics hardware and software system, thus requiring a wide variety of nominal and off-nominal simulation capabilities to certify system robustness. The second driving requirement - derived from a nationwide review of current state-of-the-art HWIL facilities - was that preserving digital model fidelity significantly reduced overall vehicle lifecycle cost by reducing testing time for certification runs and increasing flight tempo through an expanded operational envelope. These two driving requirements necessitated the use of high-fidelity models throughout the ARTEMIS simulation. The nature of the Ares mission profile imposed a variety of additional requirements on the ARTEMIS simulation. The Ares I vehicle is composed of multiple elements, including the First Stage Solid Rocket Booster (SRB), the Upper Stage powered by the J- 2X engine, the Orion Crew Exploration Vehicle (CEV) which houses the crew, the Launch Abort System (LAS), and various secondary elements that separate from the vehicle. At launch, the integrated vehicle stack is composed of these stages, and throughout the mission, various elements separate from the integrated stack and tumble back towards the earth. ARTEMIS must be capable of simulating the integrated stack through the flight as well as propagating each individual element after separation. In addition, abort sequences can lead to other unique configurations of the integrated stack as the timing and sequence of the stage separations are altered.
Traleika Glacier X-Stack Extension Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fryman, Joshua
The XStack Extension Project continued along the direction of the XStack program in exploring the software tools and frameworks to support a task-based community runtime towards the goal of exascale programming. The momentum built as part of the XStack project, with the development of the task-based Open Community Runtime (OCR) and related tools, was carried through during the XStack Extension with the focus areas of easing application development, improving performance, and supporting more features. The infrastructure set up for community-driven open-source development continued to be used towards these areas, with continued co-development of the runtime and applications. A variety of OCR programming environments were studied, as described in the sections Revolutionary Programming Environments and Applications, to assist with application development on OCR, and we developed the OCR Translator, a ROSE-based source-to-source compiler that parses high-level annotations in an MPI program to generate equivalent OCR code. Figure 2 compares the number of OCR objects needed to generate the 2D stencil workload using the translator against manual approaches based on an SPMD library or native coding. The rate of increase with the translator, with an increase in the number of ranks, is consistent with the other approaches. This is explored further in the section OCR Translator.
Optimization Design of Bipolar Plate Flow Field in PEM Stack
NASA Astrophysics Data System (ADS)
Wen, Ming; He, Kanghao; Li, Peilong; Yang, Lei; Deng, Li; Jiang, Fei; Yao, Yong
2017-12-01
A new design of the bipolar plate flow field in a proton exchange membrane (PEM) stack was presented to achieve high transfer efficiency of the two-phase flow. Two different flow fields were studied using numerical simulations and the performance of the flow fields is presented. The hydrodynamic properties of the two types, including the pressure drop between inlet and outlet and the Reynolds number, were compared based on the Navier-Stokes equations. Computer-aided optimization software was used for the design of experiments on the preferable flow field. The design of experiments (DOE) for the favorable concept was carried out to study the hydrodynamic properties when changing the design parameters of the bipolar plate.
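For orientation, the two hydrodynamic quantities compared in the study can be estimated for a straight rectangular channel as in the sketch below; the channel dimensions, flow rate, fluid properties, and the laminar circular-duct pressure-drop approximation via the hydraulic diameter are illustrative assumptions, not the paper's design values.

```python
# Rough channel Reynolds number and laminar pressure drop for a straight
# rectangular channel. All inputs are illustrative placeholders.
def channel_metrics(width_m, depth_m, length_m, q_m3s, rho=1000.0, mu=1.0e-3):
    area = width_m * depth_m
    d_h = 2 * width_m * depth_m / (width_m + depth_m)   # hydraulic diameter
    v = q_m3s / area                                     # mean velocity
    re = rho * v * d_h / mu                              # Reynolds number
    dp = 32 * mu * length_m * v / d_h**2                 # laminar, circular-duct approximation
    return re, dp

re, dp = channel_metrics(width_m=1e-3, depth_m=0.5e-3, length_m=0.2, q_m3s=1e-7)
print(f"Re = {re:.1f}, pressure drop = {dp:.0f} Pa")
```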
Coluccelli, Nicola
2010-08-01
The modeling of a real laser diode stack with the Zemax ray-tracing software operating in nonsequential mode is reported. The implementation of the model is presented together with the geometric and optical parameters to be adjusted to calibrate the model and to match the simulated irradiance profiles with the experimental profiles. The calibration of the model is based on a near-field and a far-field measurement. The validation of the model has been accomplished by comparing the simulated and experimental transverse irradiance profiles at different positions along the caustic formed by a lens. Spot sizes and waist location are predicted with a maximum error below 6%.
NASA Technical Reports Server (NTRS)
Parikh, Paresh; Engelund, Walter; Armand, Sasan; Bittner, Robert
2004-01-01
A computational fluid dynamic (CFD) study is performed on the Hyper-X (X-43A) launch vehicle stack configuration in support of aerodynamic database generation in the transonic to hypersonic flow regime. The main aim of the study is the evaluation of a CFD method that can be used to support aerodynamic database development for similar future configurations. The CFD method uses the NASA Langley Research Center developed TetrUSS software, which is based on tetrahedral, unstructured grids. The Navier-Stokes computational method is first evaluated against a set of wind tunnel test data to gain confidence in the code's application to hypersonic Mach number flows. The evaluation includes comparison of the longitudinal stability derivatives on the complete stack configuration (which includes the X-43A/Hyper-X Research Vehicle, the launch vehicle, and an adapter connecting the two), detailed surface pressure distributions at selected locations on the stack body, and component (rudder, elevons) forces and moments. The CFD method is further used to predict the stack aerodynamic performance at flow conditions where no experimental data are available, as well as component loads for mechanical design and aero-elastic analyses. An excellent match between the computed and test data over a range of flow conditions provides a computational tool that may be used for future similar hypersonic configurations with confidence.
Exploring the Earth Using Deep Learning Techniques
NASA Astrophysics Data System (ADS)
Larraondo, P. R.; Evans, B. J. K.; Antony, J.
2016-12-01
Research using deep neural networks has significantly matured in recent times, and there is now a surge of interest in applying such methods to Earth systems science and the geosciences. When combined with Big Data, we believe there are opportunities for significantly transforming a number of areas relevant to researchers and policy makers. In particular, by using a combination of data from a range of satellite Earth observations as well as computer simulations from climate models and reanalysis, we can gain new insights into the information that is locked within the data. Global geospatial datasets describe a wide range of physical and chemical parameters, which are mostly available on regular grids covering large spatial and temporal extents. This makes them perfect candidates for deep learning methods. So far, these techniques have been successfully applied to image analysis through the use of convolutional neural networks. However, this is only one field of interest, and there is potential for many more use cases to be explored. Deep learning algorithms require fast access to large amounts of data in the form of tensors and make intensive use of CPU in order to train their models. The Australian National Computational Infrastructure (NCI) has recently augmented its Raijin 1.2 PFlop supercomputer with hardware accelerators. Together with NCI's 3000-core high-performance OpenStack cloud, these computational systems have direct access to NCI's 10+ PBytes of datasets and associated Big Data software technologies (see http://geonetwork.nci.org.au/ and http://nci.org.au/systems-services/national-facility/nerdip/). Effective use of these computing infrastructures requires that both the data and software are organised in a way that readily supports the deep learning software ecosystem. Deep learning software, such as the open source TensorFlow library, has allowed us to demonstrate the possibility of generating geospatial models by combining information from our different data sources. This opens the door to an exciting new way of generating products and extracting features that have previously been labour intensive. In this paper, we will explore some of these geospatial use cases and share some of the lessons learned from this experience.
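As a small illustration of the kind of model meant here, the sketch below builds a tiny convolutional network over gridded geospatial fields with TensorFlow's Keras API; the grid size, number of stacked input variables, and the synthetic regression target are placeholders rather than anything from NCI's datasets.

```python
# Tiny convolutional model over gridded geospatial fields (illustrative only).
import numpy as np
import tensorflow as tf

H, W, C = 64, 64, 5                       # lat, lon, stacked input variables (assumed)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(H, W, C)),
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1),             # e.g. a scalar statistic of the field
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(128, H, W, C).astype("float32")      # synthetic tiles
y = x.mean(axis=(1, 2, 3)).reshape(-1, 1)                # toy regression target
model.fit(x, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(x[:2]).shape)
```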
40 CFR 60.1300 - What test methods must I use to stack test?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 6 2011-07-01 2011-07-01 false What test methods must I use to stack test? 60.1300 Section 60.1300 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Small Municipal Waste Combustion Units for...
40 CFR 60.1790 - What test methods must I use to stack test?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 6 2011-07-01 2011-07-01 false What test methods must I use to stack test? 60.1790 Section 60.1790 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Small Municipal Waste...
40 CFR 60.1790 - What test methods must I use to stack test?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 7 2012-07-01 2012-07-01 false What test methods must I use to stack test? 60.1790 Section 60.1790 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Small Municipal Waste...
40 CFR 60.1300 - What test methods must I use to stack test?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 7 2012-07-01 2012-07-01 false What test methods must I use to stack test? 60.1300 Section 60.1300 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Small Municipal Waste Combustion Units for...
40 CFR 60.1790 - What test methods must I use to stack test?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 7 2014-07-01 2014-07-01 false What test methods must I use to stack test? 60.1790 Section 60.1790 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Small Municipal Waste...
40 CFR 60.1300 - What test methods must I use to stack test?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 7 2014-07-01 2014-07-01 false What test methods must I use to stack test? 60.1300 Section 60.1300 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Small Municipal Waste Combustion Units for...
40 CFR 60.1790 - What test methods must I use to stack test?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 7 2013-07-01 2013-07-01 false What test methods must I use to stack test? 60.1790 Section 60.1790 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Small Municipal Waste...
40 CFR 60.1300 - What test methods must I use to stack test?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 7 2013-07-01 2013-07-01 false What test methods must I use to stack test? 60.1300 Section 60.1300 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Small Municipal Waste Combustion Units for...
40 CFR Table 5 of Subpart Aaaa of... - Requirements for Stack Tests
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 7 2014-07-01 2014-07-01 false Requirements for Stack Tests 5 Table 5 of Subpart AAAA of Part 60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Small Municipal Waste Combustion Units for...
40 CFR Table 5 of Subpart Aaaa of... - Requirements for Stack Tests
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 7 2013-07-01 2013-07-01 false Requirements for Stack Tests 5 Table 5 of Subpart AAAA of Part 60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Standards of Performance for Small Municipal Waste Combustion Units for...
40 CFR 49.129 - Rule for limiting emissions of sulfur dioxide.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine, nonroad vehicle, open burning, process source, reference method, refuse, residual fuel oil, solid fuel, stack, standard conditions...
40 CFR 49.129 - Rule for limiting emissions of sulfur dioxide.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine, nonroad vehicle, open burning, process source, reference method, refuse, residual fuel oil, solid fuel, stack, standard conditions...
SIMULATIONS OF TRANSVERSE STACKING IN THE NSLS-II BOOSTER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fliller III, R.; Shaftan, T.
2011-03-28
The NSLS-II injection system consists of a 200 MeV linac and a 3 GeV booster. The linac needs to deliver 15 nC in 80-150 bunches to the booster every minute to achieve current stability goals in the storage ring. This is a very stringent requirement that has not been demonstrated at an operating light source. We have developed a scheme to transversely stack two bunch trains in the NSLS-II booster in order to alleviate the charge requirements on the linac. This scheme has been outlined previously. In this paper we show particle tracking simulations of the stacking scheme. We show simulations of the booster ramp with a stacked beam for a variety of lattice errors and injected beam parameters. In all cases the performance of the proposed stacking method is sufficient to reduce the required charge from the linac. For this reason the injection system of the NSLS-II booster is being designed to include this feature. The NSLS-II injection system consists of a 200 MeV linac and a 3 GeV booster. The injectors must provide 7.5 nC in bunch trains 80-150 bunches long every minute for top-off operation of the storage ring. Top-off then requires that the linac deliver 15 nC of charge once losses in the injector chain are taken into consideration. This is a very stringent requirement that has not been demonstrated at an operating light source. For this reason we have developed a method to transversely stack two bunch trains in the booster while maintaining the charge transport efficiency. This stacking scheme has been discussed previously. In this paper we show the simulations of the booster ramp with a single bunch train in the booster. Then we give a brief overview of the stacking scheme. Following that, we show the results of stacking two bunch trains in the booster with varying beam emittances and train separations. The behavior of the beam through the ramp is examined, showing that it is possible to stack two bunch trains in the booster.
40 CFR 49.125 - Rule for limiting the emissions of particulate matter.
Code of Federal Regulations, 2011 CFR
2011-07-01
... pollution sources? (1) Particulate matter emissions from a combustion source stack (except for wood-fired..., British thermal unit (Btu), coal, combustion source, distillate fuel oil, emission, fuel, fuel oil, gaseous fuel, heat input, incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine...
Mining dynamic noteworthy functions in software execution sequences.
Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong
2017-01-01
As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving the software quality and its attack-defending ability. Most existing analyses and evaluations of important entities, such as code-based static structure analysis, do not reflect the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, according to software decompiling and tracking of stack changes, execution traces composed of a series of function addresses were acquired. These traces were then modeled as execution sequences and simplified to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS using a pattern extraction (PE) algorithm. After that, the evaluation indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, these functions were sorted by their noteworthiness. The results were compared and contrasted with those of two traditional complex-network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.
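The abstract does not give the inner-importance and inter-importance formulas, so the following is only a schematic sketch of the earlier steps it describes (trace simplification and pattern counting); the traces are invented and a plain frequency count stands in for the published DNFM metrics.

# Schematic sketch only: collapse repeated entries in function-address traces
# and count short recurring patterns. A simple frequency count stands in for
# the paper's inner-/inter-importance metrics, which the abstract does not give.
from collections import Counter
from itertools import groupby

def simplify(trace):
    # Collapse consecutive repeats, e.g. A A B B A -> A B A
    return [addr for addr, _ in groupby(trace)]

def count_patterns(traces, length=2):
    # Count contiguous patterns of a fixed length across all simplified traces
    counts = Counter()
    for trace in traces:
        s = simplify(trace)
        for i in range(len(s) - length + 1):
            counts[tuple(s[i:i + length])] += 1
    return counts

traces = [["f1", "f1", "f2", "f3", "f2"], ["f2", "f3", "f3", "f2", "f1"]]
for pattern, freq in count_patterns(traces).most_common(3):
    print(pattern, freq)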
Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle
NASA Astrophysics Data System (ADS)
Vinay, S.; Downs, R. R.
2012-12-01
Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence have been identified that exist at various stages of the data lifecycle. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and improve understanding of software dependency risks for scientific data and how they can be reduced during the data lifecycle.
NASA Astrophysics Data System (ADS)
Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom
2013-04-01
The nuclear industry emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the IMS along with possible treaty violations. Another civil source of radioxenon emissions that contributes to the global background is radiopharmaceutical production. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems to support 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks as well as the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high resolution gamma spectra were collected every 15 minutes using an HPGe detector to continuously monitor a bypass feed from the stack or CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon approximately 200 km from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma-spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity together with other spectrum parameters were saved into the Linssi database. This database contains a large amount of radionuclide information which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful for identifying the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity ratios showed distinct differences between the closed CANDU primary coolant system and radiopharmaceutical production releases. According to the concept proposed by Kalinowski and Pistner (2006), the relationship between different isotopic activity ratios based on three or four radioxenon isotopes was plotted in a log-log diagram for source characterisation (civil vs. nuclear test). The multiple isotopic activity ratios were distributed in three distinct areas: HC atmospheric monitoring ratios extended to the far left; the CANDU primary coolant system ratios lay in the middle; and 99Mo stack monitoring ratios for ANSTO and CRL were located on the right. The closed CANDU primary coolant has the lowest logarithmic mean ratio, which represents nuclear power reactor operation. The HC atmospheric monitoring exhibited a broad range of ratios spreading over several orders of magnitude. In contrast, the ANSTO and CRL stack emissions showed the smallest range of ratios, but the results indicate at least two processes involved in the 99Mo production. Overall, most measurements were found to be shifted towards the reactor domain. The hypothesis is that this is due to an accumulation of the isotope 131mXe in the stack or atmospheric background, as it has the longest half-life, and extra 131mXe emissions from the decay of 131I. The contribution of older 131mXe to a fresh release shifts the ratio of 133mXe/131mXe to the left.
It was also very interesting to note that there were some situations where isotopic ratios from 99Mo production emissions fell within the nuclear test domain. This is due to operational variability, such as shorter target irradiation times. Martin B. Kalinowski and Christoph Pistner, (2006), Isotopic signature of atmospheric xenon released from light water reactors, Journal of Environmental Radioactivity, 88, 215-235.
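As a rough illustration of the ratio-based screening described above, the sketch below plots two isotopic activity ratios on a log-log diagram. The activities are invented placeholders, and the Kalinowski-Pistner discrimination line itself is not reproduced.

# Illustrative sketch only: log-log plot of radioxenon activity ratios for
# source characterisation. All activity values are hypothetical placeholders.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical decay-corrected activities (Bq): columns Xe-135, Xe-133, Xe-133m, Xe-131m
samples = np.array([
    [2.0e3, 5.0e4, 1.0e3, 4.0e2],     # e.g. a stack sample
    [1.0e1, 8.0e2, 6.0e0, 9.0e0],     # e.g. a coolant sample
    [5.0e-1, 3.0e1, 2.0e-1, 8.0e-1],  # e.g. an atmospheric sample
])
xe135, xe133, xe133m, xe131m = samples.T

ratio_x = xe133m / xe131m   # shifts left as older Xe-131m accumulates
ratio_y = xe135 / xe133

plt.loglog(ratio_x, ratio_y, "o")
plt.xlabel("133mXe / 131mXe activity ratio")
plt.ylabel("135Xe / 133Xe activity ratio")
plt.title("Isotopic ratio diagram (illustrative data)")
plt.show()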
NASA Astrophysics Data System (ADS)
Löwe, Peter; Barmuta, Jan; Klump, Jens; Neumann, Janna; Plank, Margret
2014-05-01
The communication of advances in research to the general public for both education and decision making is an important aspect of scientific work. An even more crucial task is to gain recognition within the scientific community, which is judged by impact factor and citation counts. Recently, the latter concepts have been extended from textual publications to include data and software publications. This paper presents a case study for science communication and data citation. For this, tectonic models, Free and Open Source Software (FOSS), best practices for data citation and a multimedia online-portal for scientific content are combined. This approach creates mutual benefits for the stakeholders: Target audiences receive information on the latest research results, while the use of Digital Object Identifiers (DOI) increases the recognition and citation of underlying scientific data. This creates favourable conditions for every researcher as DOI names ensure citeability and long-term availability of scientific research. In the developed application, the FOSS tool for tectonic modelling GPlates is used to visualise and manipulate plate-tectonic reconstructions and associated data through geological time. These capabilities are augmented by the Science on a Halfsphere project (SoaH) with a robust and intuitive visualisation hardware environment. The tectonic models used for science communication are provided by the AGH University of Science and Technology. They focus on the Silurian to Early Carboniferous evolution of Central Europe (Bohemian Massif) and were interpreted for the area of the Geopark Bergstraße Odenwald based on the GPlates/SoaH hardware and software stack. As scientific story-telling is volatile by nature, recordings are a natural means of preservation for further use, reference and analysis. For this, the upcoming portal for audiovisual media of the German National Library of Science and Technology TIB is expected to become a critical service infrastructure. It allows complex search queries, including metadata such as DOI and media fragment identifiers (MFI), thereby linking data citation and science communication.
Goodarzi, Fariborz; Sanei, Hamed; Labonté, Marcel; Duncan, William F
2002-06-01
The spatial distribution and deposition of lead and zinc emitted from the Trail smelter, British Columbia, Canada, was studied by strategically locating moss bags in the area surrounding the smelter and monitoring the deposition of elements every three months. A combined diffusion/distribution model was applied to estimate the relative contribution of stack-emitted material and material emitted from the secondary sources (e.g., wind-blown dust from ore/slag storage piles, uncovered transportation/trucking of ore, and historical dust). The results indicate that secondary sources are the major contributor of lead and zinc deposited within a short distance from the smelter. Gradually, the stack emissions become the main source of Pb and Zn at greater distances from the smelter. Typical material originating from each source was characterized by SEM/EDX, which indicated a marked difference in their morphology and chemical composition.
NASA Technical Reports Server (NTRS)
Pang, Jackson; Liddicoat, Albert; Ralston, Jesse; Pingree, Paula
2006-01-01
The current implementation of the Telecommunications Protocol Processing Subsystem Using Reconfigurable Interoperable Gate Arrays (TRIGA) is equipped with CFDP protocol and CCSDS Telemetry and Telecommand framing schemes to replace the CPU intensive software counterpart implementation for reliable deep space communication. We present the hardware/software co-design methodology used to accomplish high data rate throughput. The hardware CFDP protocol stack implementation is then compared against the two recent flight implementations. The results from our experiments show that TRIGA offers more than 3 orders of magnitude throughput improvement with less than one-tenth of the power consumption.
Software Selection: A Primer on Source and Evaluation.
ERIC Educational Resources Information Center
Burston, Jack
2003-01-01
Provides guidance on making decisions regarding the selection of foreign language instructional software. Identifies sources of foreign language software, indicates sources of foreign language software reviews, and outlines essential procedures of software evaluation. (Author/VWL)
40 CFR 49.129 - Rule for limiting emissions of sulfur dioxide.
Code of Federal Regulations, 2010 CFR
2010-07-01
... emissions from a combustion source stack must not exceed an average of 500 parts per million by volume, on a..., air pollution source, ambient air, British thermal unit (Btu), coal, combustion source, continuous..., incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine, nonroad vehicle, open burning...
40 CFR 49.129 - Rule for limiting emissions of sulfur dioxide.
Code of Federal Regulations, 2011 CFR
2011-07-01
... emissions from a combustion source stack must not exceed an average of 500 parts per million by volume, on a..., air pollution source, ambient air, British thermal unit (Btu), coal, combustion source, continuous..., incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine, nonroad vehicle, open burning...
40 CFR 49.125 - Rule for limiting the emissions of particulate matter.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., gaseous fuel, heat input, incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine..., residual fuel oil, solid fuel, stack, standard conditions, stationary source, uncombined water, used oil...
40 CFR 49.125 - Rule for limiting the emissions of particulate matter.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., gaseous fuel, heat input, incinerator, marine vessel, mobile sources, motor vehicle, nonroad engine..., residual fuel oil, solid fuel, stack, standard conditions, stationary source, uncombined water, used oil...
SO 2 concentrations near tall stacks
NASA Astrophysics Data System (ADS)
Lott, Robert A.
A study was conducted to investigate plume dispersion during convective (stability class A) conditions. The purpose of the study was to determine if high concentrations occur near sources (1.2-1.8 km) with tall stacks and to identify the plume behavior during these episodes. The study was conducted at the Tennessee Valley Authority's Paradise Steam Plant. The highest concentrations were measured near the source during wind shear capping conditions, which normally correspond to stability class B or C conditions. The measured data are compared with results obtained using a convective boundary layer model and a steady-state Gaussian model.
NASA Astrophysics Data System (ADS)
Kim, Do-Bin; Kwon, Dae Woong; Kim, Seunghyun; Lee, Sang-Ho; Park, Byung-Gook
2018-02-01
To obtain a high channel boosting potential and reduce program disturbance in channel-stacked NAND flash memory with layer selection by multilevel (LSM) operation, a new program scheme using a boosted common source line (CSL) is proposed. The proposed scheme is achieved by applying a proper bias to each layer through its own CSL. Technology computer-aided design (TCAD) simulations are performed to verify the validity of the new method in LSM. The simulations reveal that the program disturbance characteristics are effectively improved by the proposed scheme.
40 CFR Table 5 of Subpart Aaaa to... - Requirements for Stack Tests
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Requirements for Stack Tests 5 Table 5 of Subpart AAAA to Part 60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Pt. 60, Subpt. AAAA, Table 5 Table 5 of Subpart AAAA to Part 60—Requirement...
40 CFR Table 5 of Subpart Aaaa to... - Requirements for Stack Tests
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 6 2011-07-01 2011-07-01 false Requirements for Stack Tests 5 Table 5 of Subpart AAAA to Part 60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Pt. 60, Subpt. AAAA, Table 5 Table 5 of Subpart AAAA to Part 60—Requirement...
40 CFR 52.875 - Original identification of plan section.
Code of Federal Regulations, 2014 CFR
2014-07-01
... applicable to stationary sources subject to prevention of significant deterioration (PSD) permit requirements... interim stack height policy for each PSD permit issued until such time as EPA revises its general stack... submitted rule revisions to K.A.R. 28-19-17, the PSD rule; to K.A.R. 28-19-19, the CEM rule; and to K.A.R...
40 CFR 52.875 - Original identification of plan section.
Code of Federal Regulations, 2012 CFR
2012-07-01
... applicable to stationary sources subject to prevention of significant deterioration (PSD) permit requirements... interim stack height policy for each PSD permit issued until such time as EPA revises its general stack... submitted rule revisions to K.A.R. 28-19-17, the PSD rule; to K.A.R. 28-19-19, the CEM rule; and to K.A.R...
40 CFR 52.875 - Original identification of plan section.
Code of Federal Regulations, 2013 CFR
2013-07-01
... applicable to stationary sources subject to prevention of significant deterioration (PSD) permit requirements... interim stack height policy for each PSD permit issued until such time as EPA revises its general stack... submitted rule revisions to K.A.R. 28-19-17, the PSD rule; to K.A.R. 28-19-19, the CEM rule; and to K.A.R...
40 CFR 52.875 - Original identification of plan section.
Code of Federal Regulations, 2011 CFR
2011-07-01
... applicable to stationary sources subject to prevention of significant deterioration (PSD) permit requirements... interim stack height policy for each PSD permit issued until such time as EPA revises its general stack... submitted rule revisions to K.A.R. 28-19-17, the PSD rule; to K.A.R. 28-19-19, the CEM rule; and to K.A.R...
40 CFR Table 5 of Subpart Aaaa to... - Requirements for Stack Tests
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 7 2012-07-01 2012-07-01 false Requirements for Stack Tests 5 Table 5 of Subpart AAAA to Part 60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY SOURCES Pt. 60, Subpt. AAAA, Table 5 Table 5 of Subpart AAAA to Part 60—Requirement...
Wu, Chien-Hung; Chang, Kow-Ming; Chen, Yi-Ming; Huang, Bo-Wen; Zhang, Yu-Xin; Wang, Shui-Jinn; Hsu, Jui-Mei
2018-03-01
Atmospheric pressure plasma-enhanced chemical vapor deposition (AP-PECVD) was employed for the fabrication of indium gallium zinc oxide thin-film transistors (IGZO TFTs) with highly transparent gallium zinc oxide (GZO) source/drain electrodes. The influence of post-deposition annealing (PDA) temperature on the GZO source/drain and device performance was studied. The device annealed at 300 °C demonstrated excellent electrical characteristics, with an on/off current ratio of 2.13 × 10⁸, a saturation mobility of 10 cm²/(V·s), and a low subthreshold swing of 0.2 V/dec. The LaAlO3/ZrO2 gate stack of the AP-IGZO TFTs with highly transparent and conductive AP-GZO source/drain electrodes shows excellent gate control at a low operating voltage.
NASA Astrophysics Data System (ADS)
Roten, D.; Hogue, S.; Spell, P.; Marland, E.; Marland, G.
2017-12-01
There is an increasing role for high-resolution CO2 emissions inventories across multiple arenas. The breadth of the applicability of high-resolution data is apparent from their use in atmospheric CO2 modeling, their potential for validation of space-based atmospheric CO2 remote-sensing, and the development of climate change policy. This work focuses on increasing our understanding of the uncertainty in these inventories and the implications for their downstream use. The industrial point sources of emissions (power generating stations, cement manufacturing plants, paper mills, etc.) used in the creation of these inventories often have robust emissions characteristics, beyond just their geographic location. Physical parameters of the emission sources such as number of exhaust stacks, stack heights, stack diameters, exhaust temperatures, and exhaust velocities, as well as temporal variability and climatic influences, can be important in characterizing emissions. Emissions from large point sources can behave much differently than emissions from areal sources such as automobiles. For many applications geographic location is not an adequate characterization of emissions. This work demonstrates the sensitivities of atmospheric models to the physical parameters of large point sources and provides a methodology for quantifying parameter impacts at multiple locations across the United States. The sensitivities highlight the importance of location and timing and help to highlight potential aspects that can guide efforts to reduce uncertainty in emissions inventories and increase the utility of the models.
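Because the abstract stresses how stack exit parameters (diameter, exhaust temperature and velocity) shape point-source behaviour, the sketch below shows one standard way such parameters enter a model: the Briggs final plume rise used in many regulatory Gaussian models. The stack values are hypothetical and only the neutral/unstable buoyant case is shown.

# Illustrative sketch: Briggs final plume rise for a buoyant point source,
# showing how stack exit parameters feed into the effective release height.
# Constants follow the standard Briggs treatment; all inputs are hypothetical.
def briggs_plume_rise(v_s, d_s, T_s, T_a, u):
    # v_s: exit velocity (m/s), d_s: stack diameter (m),
    # T_s: stack gas temperature (K), T_a: ambient temperature (K),
    # u: wind speed at stack height (m/s). Returns final rise in metres.
    g = 9.81
    F_b = g * v_s * d_s ** 2 * (T_s - T_a) / (4.0 * T_s)  # buoyancy flux, m^4/s^3
    if F_b >= 55.0:
        return 38.71 * F_b ** 0.6 / u
    return 21.425 * F_b ** 0.75 / u

# Hypothetical stack: 30 m/s exit velocity, 4 m diameter, 420 K gas, 290 K air, 5 m/s wind
dh = briggs_plume_rise(30.0, 4.0, 420.0, 290.0, 5.0)
print(f"plume rise = {dh:.1f} m, effective height = {150.0 + dh:.1f} m")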
Real-time Stack Monitoring at the BaTek Medical Isotope Production Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
McIntyre, Justin I.; Agusbudiman, A.; Cameron, Ian M.
2016-04-01
Radioxenon emissions from radiopharmaceutical production are a major source of background concentrations affecting the radioxenon detection systems of the International Monitoring System (IMS). Collection of real-time emissions data from production facilities makes it possible to screen out some medical isotope signatures from the IMS radioxenon data sets. This paper describes an effort to obtain and analyze real-time stack emissions data with the design, construction and installation of a small stack monitoring system developed by a joint CTBTO-IDC, BATAN, and PNNL team at the BaTek medical isotope production facility near Jakarta, Indonesia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brant Peery; Sam Alessi; Randy Lee
2014-06-01
There is a need for a spatial decision support application that allows users to create customized metrics for comparing proposed locations of a new solar installation. This document discusses how PVMapper was designed to overcome the customization problem through the development of loosely coupled spatial and decision components in a JavaScript plugin architecture. This allows the user to easily add functionality and data to the system. The paper also explains how PVMapper provides the user with a dynamic and customizable decision tool that enables them to visually modify the formulas that are used in the decision algorithms that convert data to comparable metrics. The technologies that make up the presentation and calculation software stack are outlined. This document also explains the architecture that allows the tool to grow through custom plugins created by the software users. Some discussion is given on the difficulties encountered while designing the system.
NASA Astrophysics Data System (ADS)
Karsten, Roman; Flittner, Klaus; Haus, Henry; Schlaak, Helmut F.
2013-04-01
This paper describes the development of an active isolation mat for the cancellation of vibrations on sensitive devices with a mass of up to 500 grams. Disturbing vertical vibrations are attenuated actively, while horizontal vibrations are damped passively. The dimensions of the investigated mat are 140 × 140 × 20 mm. The mat contains 5 dielectric elastomer stack actuators (DESA). The design and optimization of the active isolation mat are carried out with ANSYS FEM software. The best performance is achieved by a DESA with an air cushion mounted on its circumference. The air encased within the mounting increases the static stiffness and reduces the dynamic stiffness. Experimental results show that vibrations with amplitudes up to 200 μm can be actively eliminated.
Advancements in high-power diode laser stacks for defense applications
NASA Astrophysics Data System (ADS)
Pandey, Rajiv; Merchen, David; Stapleton, Dean; Patterson, Steve; Kissel, Heiko; Fassbender, Wilhlem; Biesenbach, Jens
2012-06-01
This paper reports on the latest advancements in vertical high-power diode laser stacks using micro-channel coolers, which deliver the most compact footprint, power scalability and highest power per bar of any diode laser package. We present electro-optical (E-O) data on water-cooled stacks with wavelengths ranging from 7xx nm to 9xx nm and power levels of up to 5.8 kW, delivered at 200 W/bar in CW mode with a power-conversion efficiency of >60% and both-axis collimation on a bar-to-bar pitch of 1.78 mm. Also presented is E-O data on a compact, conductively cooled, hard-soldered stack package based on conventional CuW and AlN materials, with a bar-to-bar pitch of 1.8 mm, delivering an average power per bar of >15 W at up to 25% duty cycle with 10 ms pulses at 45 °C. The water-cooled stacks can be used as pump sources for diode-pumped alkali lasers (DPALs) or for more traditional diode-pumped solid-state lasers (DPSSLs), which are power/brightness scaled for directed-energy weapons applications, and the conductively cooled stacks as illuminators.
NASA Astrophysics Data System (ADS)
Robaiah, M.; Rusop, M.; Abdullah, S.; Khusaimi, Z.; Azhan, H.; Fadzlinatul, M. Y.; Salifairus, M. J.; Asli, N. A.
2018-05-01
Palm oil was used as the carbon source to synthesize carbon nanotubes (CNTs) on silicon substrates by thermal chemical vapor deposition (CVD). The silicon substrates were arranged using two techniques, a stacked technique (SS) and a non-stacked technique (NSS). The CNTs were grown for a constant time of 30 minutes at synthesis temperatures of 750 °C, 850 °C and 950 °C, and were characterized using micro-Raman spectroscopy and field emission scanning electron microscopy (FESEM). It was found that the density, growth rate, diameter and length of the CNTs produced were affected by the synthesis temperature. Moreover, slight structural differences were observed between the CNTs obtained with the SS and NSS techniques. The synthesis temperature of 750 °C was considered the most suitable for CNT production due to the low ID/IG ratio (0.89 for the stacked and 0.90 for the non-stacked technique). Possible explanations for the different morphologies of the produced CNTs are also discussed.
Software Systems for High-performance Quantum Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S; Britt, Keith A
Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.
Technique for Early Reliability Prediction of Software Components Using Behaviour Models
Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad
2016-01-01
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
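The abstract names a stack-based algorithm over a component probabilistic dependency graph (CPDG) without giving its details, so the sketch below is only one plausible reading: an explicit-stack traversal that sums, over the paths of a small acyclic graph, the product of transition probabilities and component reliabilities. The node names, numbers, and the omission of loop handling are all assumptions, not the published method.

# Illustrative sketch only (not the published algorithm): explicit-stack
# traversal of a small, acyclic probabilistic dependency graph. The paper's
# loop entry/exit handling is not reproduced; values are hypothetical.
reliability = {"start": 0.999, "A": 0.98, "B": 0.95, "end": 1.0}
transitions = {                      # node -> [(next_node, transition probability), ...]
    "start": [("A", 0.7), ("B", 0.3)],
    "A": [("end", 1.0)],
    "B": [("A", 0.4), ("end", 0.6)],
}

def system_reliability(start="start", goal="end"):
    total = 0.0
    stack = [(start, reliability[start])]    # (node, reliability of the path so far)
    while stack:
        node, path_rel = stack.pop()
        if node == goal:
            total += path_rel
            continue
        for nxt, p in transitions.get(node, []):
            stack.append((nxt, path_rel * p * reliability[nxt]))
    return total

print(f"estimated system reliability: {system_reliability():.4f}")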
ERIC Educational Resources Information Center
Liu, Tsung-Yu
2016-01-01
This study investigates how educational games impact students' academic performance and multimedia flow experiences in a computer science course. A curriculum consisting of five basic learning units (the stack, queue, sort, tree traversal, and binary search tree) was taught to 110 university students during one semester. Two groups…
Peregrine Transition from CentOS6 to CentOS7 | High-Performance Computing |
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael Pernice
2010-09-01
INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.
2014 Runtime Systems Summit. Runtime Systems Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarkar, Vivek; Budimlic, Zoran; Kulkani, Milind
2016-09-19
This report summarizes runtime system challenges for exascale computing that follow from the fundamental challenges for exascale systems that have been well studied in past reports, e.g., [6, 33, 34, 32, 24]. Some of the key exascale challenges that pertain to runtime systems include parallelism, energy efficiency, memory hierarchies, data movement, heterogeneous processors and memories, resilience, performance variability, dynamic resource allocation, performance portability, and interoperability with legacy code. In addition to summarizing these challenges, the report also outlines different approaches to addressing these significant challenges that have been pursued by research projects in the DOE-sponsored X-Stack and OS/R programs. Since there is often confusion as to what exactly the term "runtime system" refers to in the software stack, we include a section on taxonomy to clarify the terminology used by participants in these research projects. In addition, we include a section on deployment opportunities for vendors and government labs to build on the research results from these projects. Finally, this report is also intended to provide a framework for discussing future research and development investments for exascale runtime systems, and for clarifying the role of runtime systems in exascale software.
User's guide for RAM. Volume II. Data preparation and listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turner, D.B.; Novak, J.H.
1978-11-01
The information presented in this user's guide is directed to air pollution scientists having an interest in applying air quality simulation models. RAM is a method of estimating short-term dispersion using the Gaussian steady-state model. These algorithms can be used for estimating air quality concentrations of relatively nonreactive pollutants for averaging times from an hour to a day from point and area sources. The algorithms are applicable for locations with level or gently rolling terrain where a single wind vector for each hour is a good approximation to the flow over the source area considered. Calculations are performed for each hour. Hourly meteorological data required are wind direction, wind speed, temperature, stability class, and mixing height. Emission information required of point sources consists of source coordinates, emission rate, physical height, stack diameter, stack gas exit velocity, and stack gas temperature. Emission information required of area sources consists of southwest corner coordinates, source side length, total area emission rate and effective area source-height. Computation time is kept to a minimum by the manner in which concentrations from area sources are estimated using a narrow plume hypothesis and using the area source squares as given rather than breaking down all sources into an area of uniform elements. Options are available to the user to allow use of three different types of receptor locations: (1) those whose coordinates are input by the user, (2) those whose coordinates are determined by the model and are downwind of significant point and area sources where maxima are likely to occur, and (3) those whose coordinates are determined by the model to give good area coverage of a specific portion of the region. Computation time is also decreased by keeping the number of receptors to a minimum. Volume II presents RAM example outputs, typical run streams, variable glossaries, and Fortran source codes.
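For readers unfamiliar with the underlying formula, the fragment below is a minimal sketch of the Gaussian steady-state point-source relation on which models like RAM are built. It is not the RAM code, and the dispersion-coefficient curves are rough single-stability-class placeholders.

# Minimal sketch of the Gaussian steady-state point-source formula (not RAM itself).
# sigma_y / sigma_z curves are placeholders for one stability class.
import math

def ground_level_concentration(Q, u, H, x, y):
    # Q: emission rate (g/s), u: wind speed (m/s), H: effective stack height (m),
    # x: downwind distance (m), y: crosswind offset (m). Returns g/m^3 at ground level.
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5
    crosswind = math.exp(-0.5 * (y / sigma_y) ** 2)
    vertical = 2.0 * math.exp(-0.5 * (H / sigma_z) ** 2)   # factor 2: ground reflection
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * crosswind * vertical

# Hypothetical source: 100 g/s, 4 m/s wind, 120 m effective height, receptor 2 km downwind
print(ground_level_concentration(Q=100.0, u=4.0, H=120.0, x=2000.0, y=0.0))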
NASA Astrophysics Data System (ADS)
Yetman, G.; Downs, R. R.
2011-12-01
Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and addresses internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.
NASA Astrophysics Data System (ADS)
Sunil, V.; Venkata siva, G.; Yoganjaneyulu, G.; Ravikumar, V. V.
2017-08-01
Fuel cells, which combine hydrogen and oxygen to produce electricity with water as a harmless by-product, are a candidate answer for an emission-free power source in the future. A proton exchange membrane (PEM) fuel cell is ideal for automotive applications. A single cell cannot supply the essential power for any application; hence PEM fuel cell stacks are used. The effects of different operating parameters, namely type of convection, type of draught, hydrogen flow rate, hydrogen inlet pressure, ambient temperature and humidity, hydrogen humidity, and cell orientation, on the performance of an air-breathing PEM fuel cell stack were analyzed using a computerized fuel cell test station. The fuel cell stack was then subjected to different load conditions. It was found that the stack performs very poorly at full capacity (it runs for only 30 minutes, but runs for 3 hours at 50% capacity). Hence, a detailed study was undertaken to maximize the duration of the stack's performance at peak load.
SOFC Microstructures (PFIB-SEM and synthetic) from JPS 2018
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsu, Tim; Epting, William K; Mahbub, Rubayyat
This is the microstructural data used in the publication "Mesoscale characterization of local property distributions in heterogeneous electrodes" by Tim Hsu, William K. Epting, Rubayyat Mahbub, et al., published in the Journal of Power Sources in 2018 (DOI 10.1016/j.jpowsour.2018.03.025). Included are a commercial cathode and anode active layer (Materials and Systems Research, Inc., Salt Lake City, UT) imaged by Xe plasma FIB-SEM (FEI, Hillsboro, OR), and four synthetic microstructures of varying particle size distribution widths generated by DREAM3D (BlueQuartz Software, Springboro, OH). For the MSRI electrodes, both the original greyscale and the segmented versions are provided. Each .zip file contains a "stack" of .tif image files in the Z dimension, and an .info ASCII text file containing useful information like voxel sizes and phase IDs. More details can be found in the pertinent publication at http://dx.doi.org/10.1016/j.jpowsour.2018.03.025.
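For convenience, here is a minimal, hypothetical sketch of assembling one of the published .tif image stacks into a 3-D array; the directory name is invented, and the ASCII .info metadata file is not parsed here.

# Illustrative sketch: load a Z-ordered stack of .tif slices into a 3-D array.
# The directory name is hypothetical; voxel sizes live in the separate .info file.
import glob
import numpy as np
import tifffile

slice_paths = sorted(glob.glob("MSRI_cathode/*.tif"))          # one image per Z slice
volume = np.stack([tifffile.imread(p) for p in slice_paths])   # shape: (Z, Y, X)
print(volume.shape, volume.dtype)

# Example: fraction of voxels carrying a given phase ID in a segmented stack
phase_id = 1
print((volume == phase_id).mean())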
Mining dynamic noteworthy functions in software execution sequences
Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong
2017-01-01
As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving the software quality and its attack-defending ability. Most existing analyses and evaluations of important entities, such as code-based static structure analysis, do not reflect the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, according to software decompiling and tracking of stack changes, execution traces composed of a series of function addresses were acquired. These traces were then modeled as execution sequences and simplified to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS using a pattern extraction (PE) algorithm. After that, the evaluation indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, these functions were sorted by their noteworthiness. The results were compared and contrasted with those of two traditional complex-network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely. PMID:28278276
MODELING PHOTOCHEMISTRY AND AEROSOL FORMATION IN POINT SOURCE PLUMES WITH THE CMAQ PLUME-IN-GRID
Emissions of nitrogen oxides and sulfur oxides from the tall stacks of major point sources are important precursors of a variety of photochemical oxidants and secondary aerosol species. Plumes released from point sources exhibit rather limited dimensions and their growth is gradu...
A flexible continuous-variable QKD system using off-the-shelf components
NASA Astrophysics Data System (ADS)
Comandar, Lucian C.; Brunner, Hans H.; Bettelli, Stefano; Fung, Fred; Karinou, Fotini; Hillerkuss, David; Mikroulis, Spiros; Wang, Dawei; Kuschnerov, Maxim; Xie, Changsong; Poppe, Andreas; Peev, Momtchil
2017-10-01
We present the development of a robust and versatile CV-QKD architecture based on commercially available optical and electronic components. The system uses a pilot tone for phase synchronization with a local oscillator, as well as local feedback loops to mitigate frequency and polarization drifts. Transmit and receive-side digital signal processing is performed fully in software, allowing for rapid protocol reconfiguration. The quantum link is complemented with a software stack for secure-key processing, key storage and encrypted communication. All these features allow for the system to be at the same time a prototype for a future commercial product and a research platform.
Behind Linus's Law: Investigating Peer Review Processes in Open Source
ERIC Educational Resources Information Center
Wang, Jing
2013-01-01
Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…
NASA Astrophysics Data System (ADS)
Sugaya, Takeyoshi; Tayagaki, Takeshi; Aihara, Taketo; Makita, Kikuo; Oshima, Ryuji; Mizuno, Hidenori; Nagato, Yuki; Nakamoto, Takashi; Okano, Yoshinobu
2018-05-01
We report high-quality dual-junction GaAs solar cells grown using solid-source molecular beam epitaxy and their application to smart stacked III–V//Si quadruple-junction solar cells with a two-terminal configuration for the first time. A high open-circuit voltage of 2.94 V was obtained in an InGaP/GaAs/GaAs triple-junction top cell that was stacked on a Si bottom cell. The short-circuit current density of a smart stacked InGaP/GaAs/GaAs//Si solar cell was in good agreement with that estimated from external quantum efficiency measurements. An efficiency of 18.5% with a high open-circuit voltage of 3.3 V was obtained in InGaP/GaAs/GaAs//Si two-terminal solar cells.
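The abstract notes that the measured short-circuit current density agreed with the value estimated from external quantum efficiency. The sketch below shows the standard estimate J_sc = q ∫ EQE(λ) Φ(λ) dλ, using placeholder EQE and irradiance curves rather than the paper's data.

# Illustrative sketch: estimating J_sc from EQE and a reference spectrum.
# The flat EQE and irradiance below are coarse placeholders, not measured data.
import numpy as np

q = 1.602e-19          # C
h = 6.626e-34          # J s
c = 2.998e8            # m/s

wavelength_nm = np.linspace(350, 900, 200)
eqe = np.full_like(wavelength_nm, 0.85)        # placeholder 85% EQE
irradiance = np.full_like(wavelength_nm, 1.4)  # placeholder W m^-2 nm^-1

photon_flux = irradiance / (h * c / (wavelength_nm * 1e-9))  # photons m^-2 s^-1 nm^-1
jsc = q * np.trapz(eqe * photon_flux, wavelength_nm)         # A/m^2
print(f"J_sc ~ {jsc / 10:.1f} mA/cm^2")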
Characterization of diode-laser stacks for high-energy-class solid state lasers
NASA Astrophysics Data System (ADS)
Pilar, Jan; Sikocinski, Pawel; Pranowicz, Alina; Divoky, Martin; Crump, P.; Staske, R.; Lucianetti, Antonio; Mocek, Tomas
2014-03-01
In this work, we present a comparative study of high-power diode stacks produced by the world's leading manufacturers, such as DILAS, Jenoptik, and Quantel. The diode-laser stacks are characterized by a central wavelength around 939 nm, a duty cycle of 1%, and a maximum repetition rate of 10 Hz. The characterization includes peak power, electrical-to-optical efficiency, central wavelength and full width at half maximum (FWHM) as a function of diode current and cooling temperature. A cross-check of measurements performed at HiLASE-IoP and the Ferdinand-Braun-Institut (FBH) shows very good agreement between the results. Our study also reveals the presence of discontinuities in the spectra of two diode stacks. We consider the results presented here a valuable tool for optimizing pump sources for ultra-high average power lasers, including laser fusion facilities.
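As an illustration of the characterisation quantities listed (central wavelength, FWHM, electrical-to-optical efficiency), the following sketch derives them from a synthetic spectrum and an assumed electrical operating point; none of the numbers are measurements from the cited stacks.

# Illustrative sketch: centroid wavelength, FWHM and E-O efficiency from
# placeholder data. All values are assumed, not measured.
import numpy as np

wavelength = np.linspace(935, 943, 801)                       # nm
spectrum = np.exp(-0.5 * ((wavelength - 939.0) / 1.3) ** 2)   # placeholder spectral shape

centroid = np.sum(wavelength * spectrum) / np.sum(spectrum)   # central wavelength (nm)

half_max = spectrum.max() / 2
above = wavelength[spectrum >= half_max]
fwhm = above.max() - above.min()                              # nm

optical_power = 5800.0          # W, assumed stack output
voltage, current = 60.0, 160.0  # V, A, assumed drive conditions
eo_efficiency = optical_power / (voltage * current)

print(f"centroid {centroid:.2f} nm, FWHM {fwhm:.2f} nm, E-O efficiency {eo_efficiency:.2f}")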
WebLogo: A Sequence Logo Generator
Crooks, Gavin E.; Hon, Gary; Chandonia, John-Marc; Brenner, Steven E.
2004-01-01
WebLogo generates sequence logos, graphical representations of the patterns within a multiple sequence alignment. Sequence logos provide a richer and more precise description of sequence similarity than consensus sequences and can rapidly reveal significant features of the alignment otherwise difficult to perceive. Each logo consists of stacks of letters, one stack for each position in the sequence. The overall height of each stack indicates the sequence conservation at that position (measured in bits), whereas the height of symbols within the stack reflects the relative frequency of the corresponding amino or nucleic acid at that position. WebLogo has been enhanced recently with additional features and options, to provide a convenient and highly configurable sequence logo generator. A command line interface and the complete, open WebLogo source code are available for local installation and customization. PMID:15173120
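To make the height definitions concrete, the short sketch below computes, for a toy DNA alignment, the per-position stack height as information content in bits and the per-letter heights as frequency times that information content; the small-sample correction applied by WebLogo is omitted.

# Sketch of the quantities a sequence logo displays: stack height in bits and
# letter heights as frequency * information content (small-sample correction omitted).
import math
from collections import Counter

alignment = ["ATGC", "ATGA", "ATGC", "TTGC"]   # toy alignment, one character per position
alphabet_bits = math.log2(4)                    # 2 bits for DNA, log2(20) for protein

for pos in range(len(alignment[0])):
    column = [seq[pos] for seq in alignment]
    freqs = {b: n / len(column) for b, n in Counter(column).items()}
    entropy = -sum(f * math.log2(f) for f in freqs.values())
    stack_height = alphabet_bits - entropy      # bits of conservation at this position
    letter_heights = {b: f * stack_height for b, f in freqs.items()}
    print(pos + 1, round(stack_height, 2), letter_heights)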
ESO imaging survey: optical deep public survey
NASA Astrophysics Data System (ADS)
Mignano, A.; Miralles, J.-M.; da Costa, L.; Olsen, L. F.; Prandoni, I.; Arnouts, S.; Benoist, C.; Madejsky, R.; Slijkhuis, R.; Zaggia, S.
2007-02-01
This paper presents new five-passband (UBVRI) optical wide-field imaging data accumulated as part of the DEEP Public Survey (DPS) carried out as a public survey by the ESO Imaging Survey (EIS) project. Out of the 3 square degrees originally proposed, the survey covers 2.75 square degrees in at least one band (normally R), and 1.00 square degree in all five passbands. The median seeing, as measured in the final stacked images, is 0.97 arcsec, ranging from 0.75 arcsec to 2.0 arcsec. The median limiting magnitudes (AB system, 2 arcsec aperture, 5σ detection limit) are U_AB = 25.65, B_AB = 25.54, V_AB = 25.18, R_AB = 24.8 and I_AB = 24.12 mag, consistent with those proposed in the original survey design. The paper describes the observations and data reduction using the EIS Data Reduction System and its associated EIS/MVM library. The quality of the individual images was inspected, bad images were discarded and the remaining images were used to produce final image stacks in each passband, from which sources have been extracted. Finally, the scientific quality of these final images and associated catalogs was assessed qualitatively by visual inspection and quantitatively by comparison of statistical measures derived from these data with those of other authors as well as model predictions, and from direct comparison with the results obtained from the reduction of the same dataset using an independent (hands-on) software system. To illustrate one application of this survey, the results of a preliminary effort to identify sub-mJy radio sources are reported. To the limiting magnitude reached in the R and I passbands the success rate ranges from 66 to 81% (depending on the fields). These data are publicly available at CDS. Based on observations carried out at the European Southern Observatory, La Silla, Chile under program Nos. 164.O-0561, 169.A-0725, and 267.A-5729. Appendices A, B and C are only available in electronic form at http://www.aanda.org
Unveiling high redshift structures with Planck
NASA Astrophysics Data System (ADS)
Welikala, Niraj
2012-07-01
The Planck satellite, with its large wavelength coverage and all-sky survey, has a unique potential for systematically detecting the brightest and rarest submillimetre sources on the sky. We present an original method based on a combination of Planck and IRAS data which we use to select the most luminous submillimetre high-redshift (z>1-2) cold sources over the sky. The majority of these sources are either individual, strongly lensed galaxies, or represent the combined emission of several submillimetre galaxies within the large beam of Planck. The latter includes, in particular, rapidly growing galaxy groups and clusters. We demonstrate our selection method on the first 5 confirmations, which include a newly discovered over-density of 5 submillimetre-bright sources that has been confirmed with Herschel/SPIRE observations and followed up with ground-based observations including VLT/XSHOOTER spectroscopy. Using Planck, we also unveil the nature of 107 high-redshift dusty, lensed submillimetre galaxies that have been previously observed over 940 square degrees by the South Pole Telescope (SPT). We stack these galaxies in the Planck maps, obtaining mean SEDs for both the bright (SPT flux F_{220 GHz} > 20 mJy) and faint (F_{220 GHz} < 20 mJy) galaxy populations. These SEDs and the derived mean redshifts suggest that the bright and faint sources belong to the same population of submillimetre galaxies. Stacking the lensed submillimetre galaxies in Planck also enables us to probe the z~1 environments around the foreground lenses and we obtain estimates of their clustering. Finally, we use the stacks to extrapolate SPT source counts to the Planck HFI frequencies, thereby estimating the contribution of the SPT sources at 220 GHz to the galaxy number counts at 353 and 545 GHz.
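For readers unfamiliar with the stacking technique used here, the sketch below shows the basic idea on synthetic data: cut out small postage stamps of a map at known source positions and average them, so that the mean signal of a faint population emerges while the noise averages down. The map, positions, and cutout size are placeholders.

# Illustrative sketch of stacking: average map cutouts at known source positions.
# The map and positions are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
sky_map = rng.normal(0.0, 1.0, size=(2000, 2000))      # noise-dominated map
positions = rng.integers(50, 1950, size=(100, 2))      # (y, x) pixel positions of sources
half = 10                                              # cutout half-width in pixels

cutouts = [sky_map[y - half:y + half + 1, x - half:x + half + 1] for y, x in positions]
stacked = np.mean(cutouts, axis=0)                     # mean cutout; noise drops as 1/sqrt(N)
print(stacked.shape, stacked.std())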
``Carbon Credits'' for Resource-Bounded Computations Using Amortised Analysis
NASA Astrophysics Data System (ADS)
Jost, Steffen; Loidl, Hans-Wolfgang; Hammond, Kevin; Scaife, Norman; Hofmann, Martin
Bounding resource usage is important for a number of areas, notably real-time embedded systems and safety-critical systems. In this paper, we present a fully automatic static type-based analysis for inferring upper bounds on resource usage for programs involving general algebraic datatypes and full recursion. Our method can easily be used to bound any countable resource, without needing to revisit proofs. We apply the analysis to the important metrics of worst-case execution time, stack- and heap-space usage. Our results from several realistic embedded control applications demonstrate good matches between our inferred bounds and measured worst-case costs for heap and stack usage. For time usage we infer good bounds for one application. Where we obtain less tight bounds, this is due to the use of software floating-point libraries.
Energy harvesting through a backpack employing a mechanically amplified piezoelectric stack
NASA Astrophysics Data System (ADS)
Feenstra, Joel; Granstrom, Jon; Sodano, Henry
2008-04-01
Over the past few decades, the use of portable and wearable electronics has grown steadily. These devices are becoming increasingly more powerful; however, the gains that have been made in device performance have resulted in the need for significantly higher power to operate the electronics. This issue has been further complicated by the stagnant growth of battery technology over the past decade. In order to increase the life of these electronics, researchers have begun investigating methods of generating energy from ambient sources such that the life of the electronics can be prolonged. Recent developments in the field have led to the design of a number of mechanisms that can be used to generate electrical energy from a variety of sources including thermal, solar, strain, and inertia. Many of these energy sources are available for use with humans, but their use must be carefully considered such that parasitic effects that could disrupt the user's gait or endurance are avoided. This study develops a novel energy harvesting backpack that can generate electrical energy from the differential forces between the wearer and the pack. The goal of this system is to make the energy harvesting device transparent to the wearer such that his or her endurance and dexterity are not compromised. This will be accomplished by replacing the strap buckle with a mechanically amplified piezoelectric stack actuator. Piezoelectric stack actuators have found little use in energy harvesting applications due to their high stiffness, which makes straining the material difficult. This issue will be alleviated using a mechanically amplified stack, which allows the relatively low forces generated by the pack to be transformed into high forces on the piezoelectric stack. This paper will develop a theoretical model of the piezoelectric buckle and perform experimental testing to validate the model accuracy and energy harvesting performance.
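A back-of-the-envelope sketch of the conversion path described (strap force, mechanical amplification, multilayer piezoelectric stack) follows; the layer count, d33 coefficient, capacitance, amplification factor, and forces are assumed values, not parameters of the backpack prototype.

# Rough sketch of the electrical energy available per load cycle from a
# multilayer piezoelectric stack. All parameter values are assumptions.
n_layers = 300
d33 = 650e-12              # C/N, typical soft PZT (assumed)
layer_capacitance = 30e-9  # F per layer (assumed)
amplification = 8.0        # mechanical force amplification of the buckle (assumed)
strap_force = 40.0         # N differential force from the pack (assumed)

force_on_stack = amplification * strap_force
charge = n_layers * d33 * force_on_stack          # layers add charge in parallel
capacitance = n_layers * layer_capacitance
energy_per_cycle = charge ** 2 / (2 * capacitance)  # J
print(f"{energy_per_cycle * 1e6:.1f} microjoules per cycle")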
The Emergence of Open-Source Software in North America
ERIC Educational Resources Information Center
Pan, Guohua; Bonk, Curtis J.
2007-01-01
Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…
NASA Astrophysics Data System (ADS)
Jermyn, Michael; Ghadyani, Hamid; Mastanduno, Michael A.; Turner, Wes; Davis, Scott C.; Dehghani, Hamid; Pogue, Brian W.
2013-08-01
Multimodal approaches that combine near-infrared (NIR) and conventional imaging modalities have been shown to improve optical parameter estimation dramatically and thus represent a prevailing trend in NIR imaging. These approaches typically involve applying anatomical templates from magnetic resonance imaging/computed tomography/ultrasound images to guide the recovery of optical parameters. However, merging these data sets using current technology requires multiple software packages, substantial expertise, and a significant time commitment, and often results in unacceptably poor mesh quality for optical image reconstruction, a reality that represents a significant roadblock for translational research of multimodal NIR imaging. This work addresses these challenges directly by introducing automated Digital Imaging and Communications in Medicine (DICOM) image stack segmentation and a new one-click three-dimensional mesh generator optimized for multimodal NIR imaging, and by combining these capabilities into a single software package (available for free download) with a streamlined workflow. Image processing time and mesh quality benchmarks were examined for four common multimodal NIR use-cases (breast, brain, pancreas, and small animal) and were compared to a commercial image processing package. Applying these tools resulted in a fivefold decrease in image processing time and a 62% improvement in minimum mesh quality, in the absence of extra mesh postprocessing. These capabilities represent a significant step toward enabling translational multimodal NIR research for both expert and nonexpert users in an open-source platform.
Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing
NASA Astrophysics Data System (ADS)
Tang, Jingyin; Matyas, Corene J.
2018-02-01
Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library "arc4nix" to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run in the native Python environment. It uses functional programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server, and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that the arcpy-based analysis scales linearly in a distributed environment. Arc4nix is open-source software.
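As an illustration of the decoupled client-server idea described above, a minimal sketch might serialize an arcpy-style call on the client and execute it on a worker; the names below are hypothetical and do not reflect arc4nix's actual API.

```python
# Hypothetical sketch of a decoupled client-server geoprocessing call (names are
# illustrative, not arc4nix's real interface): the client serializes a call,
# and a remote worker with ArcGIS/arcpy installed would execute it.
import json

def make_call(tool, **params):
    """Client side: describe an arcpy-style call as a JSON message."""
    return json.dumps({"tool": tool, "params": params})

def execute_call(message, registry):
    """Server side: look up the requested tool and run it with the parameters."""
    call = json.loads(message)
    return registry[call["tool"]](**call["params"])

# Stand-in for a geoprocessing function that would normally be dispatched to arcpy.
def slope(in_raster, out_raster):
    return f"slope({in_raster}) -> {out_raster}"

if __name__ == "__main__":
    msg = make_call("Slope", in_raster="dem.tif", out_raster="slope.tif")
    print(execute_call(msg, {"Slope": slope}))
```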
NASA Astrophysics Data System (ADS)
Hodgkins, Alex Liam; Diez, Victor; Hegner, Benedikt
2012-12-01
The Software Process & Infrastructure (SPI) project provides a build infrastructure for regular integration testing and release of the LCG Applications Area software stack. In the past, regular builds have been provided using a system which has been constantly growing to include more features like server-client communication, long-term build history and a summary web interface using present-day web technologies. However, the ad-hoc style of software development resulted in a setup that is hard to monitor, inflexible and difficult to expand. The new version of the infrastructure is based on the Django Python framework, which allows for a structured and modular design, facilitating later additions. Transparency in the workflows and ease of monitoring has been one of the priorities in the design. Formerly missing functionality like on-demand builds or release triggering will support the transition to a more agile development process.
NEXUS - Resilient Intelligent Middleware
NASA Astrophysics Data System (ADS)
Kaveh, N.; Hercock, R. Ghanea
Service-oriented computing, a composition of distributed-object, component-based, and Web-based computing concepts, is becoming the widespread choice for developing dynamic heterogeneous software assets available as services across a network. One of the major strengths of service-oriented technologies is the high abstraction layer and large granularity level at which software assets are viewed compared to traditional object-oriented technologies. Collaboration through encapsulated and separately defined service interfaces creates a service-oriented environment, whereby multiple services can be linked together through their interfaces to compose a functional system. This approach enables better integration of legacy and non-legacy services, via wrapper interfaces, and allows for service composition at a more abstract level, especially in cases such as vertical market stacks. The heterogeneous nature of service-oriented technologies and the granularity of their software components make them a suitable computing model in the pervasive domain.
Scalable and fail-safe deployment of the ATLAS Distributed Data Management system Rucio
NASA Astrophysics Data System (ADS)
Lassnig, M.; Vigne, R.; Beermann, T.; Barisits, M.; Garonne, V.; Serfon, C.
2015-12-01
This contribution details the deployment of Rucio, the ATLAS Distributed Data Management system. The main complication is that Rucio interacts with a wide variety of external services, and connects globally distributed data centres under different technological and administrative control, at an unprecedented data volume. It is therefore not possible to create a duplicate instance of Rucio for testing or integration. Every software upgrade or configuration change is thus potentially disruptive and requires fail-safe software and automatic error recovery. Rucio uses a three-layer scaling and mitigation strategy based on quasi-realtime monitoring. This strategy mainly employs independent stateless services, automatic failover, and service migration. The technologies used for deployment and mitigation include OpenStack, Puppet, Graphite, HAProxy and Apache. In this contribution, the interplay between these components, their deployment, software mitigation, and the monitoring strategy are discussed.
RECENT ADVANCES IN HIGH TEMPERATURE ELECTROLYSIS AT IDAHO NATIONAL LABORATORY: STACK TESTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, X.; J. E. O'Brien; R. C. O'Brien
2012-07-01
High temperature steam electrolysis is a promising technology for efficient sustainable large-scale hydrogen production. Solid oxide electrolysis cells (SOECs) are able to utilize high temperature heat and electric power from advanced high-temperature nuclear reactors or renewable sources to generate carbon-free hydrogen at large scale. However, long term durability of SOECs needs to be improved significantly before commercialization of this technology. A degradation rate of 1%/khr or lower is proposed as a threshold value for commercialization of this technology. Solid oxide electrolysis stack tests have been conducted at Idaho National Laboratory to demonstrate recent improvements in long-term durability of SOECs. Electrolyte-supported and electrode-supported SOEC stacks were provided by Ceramatec Inc., Materials and Systems Research Inc. (MSRI), and Saint Gobain Advanced Materials (St. Gobain), respectively, for these tests. Long-term durability tests were generally operated for a duration of 1000 hours or more. Stack tests based on technology developed at Ceramatec and MSRI have shown significant improvement in durability in the electrolysis mode. Long-term degradation rates of 3.2%/khr and 4.6%/khr were observed for MSRI and Ceramatec stacks, respectively. One recent Ceramatec stack even showed negative degradation (performance improvement) over 1900 hours of operation. A three-cell short stack provided by St. Gobain, however, showed rapid degradation in the electrolysis mode. Improvements on electrode materials, interconnect coatings, and electrolyte-electrode interface microstructures contribute to better durability of SOEC stacks.
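As a worked example of the degradation metric quoted above, a rate in %/khr can be computed from the relative change of a monitored stack quantity (e.g., voltage or area-specific resistance at constant current) over the test duration; the sketch below is illustrative and does not restate the reports' exact definition.

```python
# A worked example of expressing degradation in %/khr (assuming the metric is the
# relative change of a monitored quantity such as stack ASR or voltage at constant
# current; the reports' exact definition is not restated here).
def degradation_rate_pct_per_khr(value_start, value_end, hours):
    return (value_end - value_start) / value_start / (hours / 1000.0) * 100.0

if __name__ == "__main__":
    # Hypothetical 1000-hour test: the monitored quantity rises from 1.00 to 1.046.
    print(round(degradation_rate_pct_per_khr(1.00, 1.046, 1000), 1))  # 4.6 %/khr
```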
Schroedinger’s code: Source code availability and transparency in astrophysics
NASA Astrophysics Data System (ADS)
Ryan, PW; Allen, Alice; Teuben, Peter
2018-01-01
Astronomers use software for their research, but how many of the codes they use are available as source code? We examined a sample of 166 papers from 2015 for clearly identified software use, then searched for source code for the software packages mentioned in these research papers. We categorized the software to indicate whether source code is available for download and whether there are restrictions to accessing it, and if source code was not available, whether some other form of the software, such as a binary, was. Over 40% of the source code for the software used in our sample was not available for download. As URLs have often been used as proxy citations for software, we also extracted URLs from one journal's 2015 research articles, removed those from certain long-term, reliable domains, and tested the remainder to determine what percentage of these URLs were still accessible in September and October 2017.
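A link check of the kind described above could be sketched as follows; this is an illustrative script using only the Python standard library, not the authors' actual code, and the URLs shown are placeholders.

```python
# Minimal sketch of a URL accessibility check: issue a HEAD request for each URL
# and record whether it resolves to a successful response.
import urllib.request

def url_accessible(url, timeout=10):
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (OSError, ValueError):   # DNS failures, HTTP errors, malformed URLs
        return False

if __name__ == "__main__":
    urls = ["https://www.example.com/", "http://example.invalid/gone"]  # placeholders
    checked = {u: url_accessible(u) for u in urls}
    print(f"{sum(checked.values())}/{len(urls)} URLs still accessible")
```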
Free for All: Open Source Software
ERIC Educational Resources Information Center
Schneider, Karen
2008-01-01
Open source software has become a catchword in libraryland. Yet many remain unclear about open source's benefits--or even what it is. So what is open source software (OSS)? It's software that is free in every sense of the word: free to download, free to use, and free to view or modify. Most OSS is distributed on the Web and one doesn't need to…
40 CFR 61.33 - Stack sampling.
Code of Federal Regulations, 2012 CFR
2012-07-01
... of the effective date in the case of an existing source or a new source which has an initial startup date preceding the effective date; or (2) Within 90 days of startup in the case of a new source which did not have an initial startup date preceding the effective date. (b) The Administrator shall be...
40 CFR 61.33 - Stack sampling.
Code of Federal Regulations, 2013 CFR
2013-07-01
... of the effective date in the case of an existing source or a new source which has an initial startup date preceding the effective date; or (2) Within 90 days of startup in the case of a new source which did not have an initial startup date preceding the effective date. (b) The Administrator shall be...
Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed, and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were selected based on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS approach. The experimental results showed that GNUmed and OpenEMR achieved better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
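A minimal sketch of the TOPSIS step of such a multi-criteria ranking is shown below; the decision matrix, weights, and criteria are hypothetical and are not the study's actual evaluation data.

```python
# Illustrative TOPSIS ranking of software alternatives (a sketch of the method
# class used in the study, not the authors' exact criteria or weights).
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] is True if larger is better."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)            # vector normalization
    v = norm * np.asarray(weights, dtype=float)     # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)             # closeness to ideal; higher is better

if __name__ == "__main__":
    # Hypothetical scores for three EMR packages on three criteria.
    scores = [[8, 7, 5], [6, 9, 4], [7, 6, 8]]
    closeness = topsis(scores, weights=[0.5, 0.3, 0.2], benefit=[True, True, True])
    print(closeness)  # rank alternatives by descending closeness
```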
Unitized regenerative fuel cell system
NASA Technical Reports Server (NTRS)
Burke, Kenneth A. (Inventor)
2008-01-01
A Unitized Regenerative Fuel Cell system uses heat pipes to convey waste heat from the fuel cell stack to the reactant storage tanks. The storage tanks act as heat sinks/sources and as passive radiators of the waste heat from the fuel cell stack. During charge up, i.e., the electrolytic process, gases are conveyed to the reactant storage tanks by way of tubes that include dryers. Reactant gases moving through the dryers give up energy to the cold tanks, causing water vapor carried with the gases to condense and freeze on the internal surfaces of the dryer. During operation in its fuel cell mode, the heat pipes convey waste heat from the fuel cell stack to the respective reactant storage tanks, thereby heating them such that the reactant gases, as they pass through the respective dryers on their way to the fuel cell stack, retrieve the water previously removed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Dong; Heidelberger, Philip; Sugawara, Yutaka
An apparatus and method for extending the scalability and improving the partitionability of networks that contain all-to-all links for transporting packet traffic from a source endpoint to a destination endpoint with low per-endpoint (per-server) cost and a small number of hops. An all-to-all wiring in the baseline topology is decomposed into smaller all-to-all components in which each smaller all-to-all connection is replaced with a star topology by using global switches. Stacking multiple copies of the star topology baseline network creates a multi-planed switching topology for transporting packet traffic. A point-to-point unified stacking method using global switch wiring methods connects multiple planes of a baseline topology by using the global switches to create a large network size with a low number of hops, i.e., low network latency. A grouped unified stacking method increases the scalability (network size) of a stacked topology.
High-frequency self-aligned graphene transistors with transferred gate stacks.
Cheng, Rui; Bai, Jingwei; Liao, Lei; Zhou, Hailong; Chen, Yu; Liu, Lixin; Lin, Yung-Chen; Jiang, Shan; Huang, Yu; Duan, Xiangfeng
2012-07-17
Graphene has attracted enormous attention for radio-frequency transistor applications because of its exceptionally high carrier mobility, high carrier saturation velocity, and large critical current density. Herein we report a new approach for the scalable fabrication of high-performance graphene transistors with transferred gate stacks. Specifically, arrays of gate stacks are first patterned on a sacrificial substrate and then transferred onto arbitrary substrates with graphene on top. A self-aligned process, enabled by the unique structure of the transferred gate stacks, is then used to precisely position the source and drain electrodes with minimized access resistance or parasitic capacitance. This process has therefore enabled scalable fabrication of self-aligned graphene transistors with unprecedented performance, including a record-high cutoff frequency up to 427 GHz. Our study defines a unique pathway to large-scale fabrication of high-performance graphene transistors, and holds significant potential for future application of graphene-based devices in ultra-high-frequency circuits.
Development of Thread-compatible Open Source Stack
NASA Astrophysics Data System (ADS)
Zimmermann, Lukas; Mars, Nidhal; Schappacher, Manuel; Sikora, Axel
2017-07-01
The Thread protocol is a recent development based on 6LoWPAN (IPv6 over IEEE 802.15.4), but with extensions toward a more media-independent approach that additionally promises true interoperability. To evaluate and analyse the operation of a Thread network, a given open source 6LoWPAN stack for embedded devices (emb::6) has been extended in order to comply with the Thread specification. The implementation covers Mesh Link Establishment (MLE) and network layer functionality as well as the 6LoWPAN mesh-under routing mechanism based on MAC short addresses. The development has been verified on a virtualization platform and allows dynamic establishment of network topologies based on Thread's partitioning algorithm.
MISTY PICTURE EVENT, Test Execution Report
1987-11-30
testbed at overpressures ranging from 10 psi (83 kPa) to 3.4 psi (23 kPa). A series of experiments were positioned near the Thermal Radiation Sources...to include scheduling, construction, photography, and recording systems. (2) Formulate and direct the safety and security plans for the test series and...ANFO stacked charges multiburst test at Planet Ranch, AZ in 1978, e. MILL RACE (MISTY CASTLE Series I) - 600 ton ANFO surface stacked charge at WSMR in
Developing open-source codes for electromagnetic geophysics using industry support
NASA Astrophysics Data System (ADS)
Key, K.
2017-12-01
Funding for open-source software development in academia often takes the form of grants and fellowships awarded by government bodies and foundations where there is no conflict-of-interest between the funding entity and the free dissemination of the open-source software products. Conversely, funding for open-source projects in the geophysics industry presents challenges to conventional business models where proprietary licensing offers value that is not present in open-source software. Such proprietary constraints make it easier to convince companies to fund academic software development under exclusive software distribution agreements. A major challenge for obtaining commercial funding for open-source projects is to offer a value proposition that overcomes the criticism that such funding is a give-away to the competition. This work draws upon a decade of experience developing open-source electromagnetic geophysics software for the oil, gas and minerals exploration industry, and examines various approaches that have been effective for sustaining industry sponsorship.
Stacked waveguide reactors with gradient embedded scatterers for high-capacity water cleaning
Ahsan, Syed Saad; Gumus, Abdurrahman; Erickson, David
2015-11-04
We present a compact water-cleaning reactor with stacked layers of waveguides containing gradient patterns of optical scatterers that enable uniform light distribution and augmented water-cleaning rates. Previous photocatalytic reactors using immersion, external, or distributive lamps suffer from poor light distribution that impedes scalability. Here, we use an external UV source to direct photons into stacked waveguide reactors, where we scatter the photons uniformly over the length of the waveguide onto thin films of TiO2 catalysts. Finally, we show a 4.5-fold improvement in activity over uniform scatterer designs, demonstrate degradation of 67% of the organic dye, and characterize the degradation rate constant.
Improved Durability of SOEC Stacks for High Temperature Electrolysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
James E. O'Brien; Robert C. O'Brien; Xiaoyu Zhang
2013-01-01
High temperature steam electrolysis is a promising technology for efficient and sustainable large-scale hydrogen production. Solid oxide electrolysis cells (SOECs) are able to utilize high temperature heat and electric power from advanced high-temperature nuclear reactors or renewable sources to generate carbon-free hydrogen at large scale. However, long term durability of SOECs needs to be improved significantly before commercialization of this technology can be realized. A degradation rate of 1%/khr or lower is proposed as a threshold value for commercialization of this technology. Solid oxide electrolysis stack tests have been conducted at Idaho National Laboratory to demonstrate recent improvements in long-term durability of SOECs. Electrolyte-supported and electrode-supported SOEC stacks were provided by Ceramatec Inc. and Materials and Systems Research Inc. (MSRI), respectively, for these tests. Long-term durability tests were generally operated for a duration of 1000 hours or more. Stack tests based on technologies developed at Ceramatec and MSRI have shown significant improvement in durability in the electrolysis mode. Long-term degradation rates of 3.2%/khr and 4.6%/khr were observed for MSRI and Ceramatec stacks, respectively. One recent Ceramatec stack even showed negative degradation (performance improvement) over 1900 hours of operation. Optimization of electrode materials, interconnect coatings, and electrolyte-electrode interface microstructures contributes to better durability of SOEC stacks.
Fenrich, Keith K; Zhao, Ethan Y; Wei, Yuan; Garg, Anirudh; Rose, P Ken
2014-04-15
Isolating specific cellular and tissue compartments from 3D image stacks for quantitative distribution analysis is crucial for understanding cellular and tissue physiology under normal and pathological conditions. Current approaches are limited because they are designed to map the distributions of synapses onto the dendrites of stained neurons and/or require specific proprietary software packages for their implementation. To overcome these obstacles, we developed algorithms to Grow and Shrink Volumes of Interest (GSVI) to isolate specific cellular and tissue compartments from 3D image stacks for quantitative analysis and incorporated these algorithms into a user-friendly computer program that is open source and downloadable at no cost. The GSVI algorithm was used to isolate perivascular regions in the cortex of live animals and cell membrane regions of stained spinal motoneurons in histological sections. We tracked the real-time, intravital biodistribution of injected fluorophores with sub-cellular resolution from the vascular lumen to the perivascular and parenchymal space following a vascular microlesion, and mapped the precise distributions of membrane-associated KCC2 and gephyrin immunolabeling in dendritic and somatic regions of spinal motoneurons. Compared to existing approaches, the GSVI approach is specifically designed for isolating perivascular regions and membrane-associated regions for quantitative analysis, is user-friendly, and free. The GSVI algorithm is useful to quantify regional differences of stained biomarkers (e.g., cell membrane-associated channels) in relation to cell functions, and the effects of therapeutic strategies on the redistributions of biomolecules, drugs, and cells in diseased or injured tissues. Copyright © 2014 Elsevier B.V. All rights reserved.
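The GSVI idea of growing and shrinking segmented volumes can be illustrated with standard morphological operations; the sketch below (using scipy.ndimage) is not the published algorithm, only a toy example of isolating a shell-like compartment around a segmented structure in a 3D stack.

```python
# Not the GSVI implementation itself: a minimal sketch of growing/shrinking a 3D
# mask to isolate a shell-like compartment (e.g., a membrane-associated or
# perivascular region) for subsequent quantification.
import numpy as np
from scipy import ndimage

def shell_region(mask, grow=2, shrink=1):
    """Voxels within `grow` steps of the mask but outside the mask shrunk by `shrink`."""
    grown = ndimage.binary_dilation(mask, iterations=grow)
    shrunk = ndimage.binary_erosion(mask, iterations=shrink)
    return grown & ~shrunk

if __name__ == "__main__":
    stack = np.zeros((20, 40, 40), dtype=bool)
    stack[8:12, 15:25, 15:25] = True            # toy segmented structure
    shell = shell_region(stack)
    print("shell voxels:", int(shell.sum()))    # quantify stained signal here next
```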
Information fusion via isocortex-based Area 37 modeling
NASA Astrophysics Data System (ADS)
Peterson, James K.
2004-08-01
A simplified model of information processing in the brain can be constructed using primary sensory input from two modalities (auditory and visual) and recurrent connections to the limbic subsystem. Information fusion would then occur in Area 37 of the temporal cortex. The creation of meta concepts from the low order primary inputs is managed by models of isocortex processing. Isocortex algorithms are used to model parietal (auditory), occipital (visual), temporal (polymodal fusion) cortex and the limbic system. Each of these four modules is constructed out of five cortical stacks in which each stack consists of three vertically oriented six layer isocortex models. The input to output training of each cortical model uses the OCOS (on center - off surround) and FFP (folded feedback pathway) circuitry of (Grossberg, 1) which is inherently a recurrent network type of learning characterized by the identification of perceptual groups. Models of this sort are thus closely related to cognitive models as it is difficult to divorce the sensory processing subsystems from the higher level processing in the associative cortex. The overall software architecture presented is biologically based and is presented as a potential architectural prototype for the development of novel sensory fusion strategies. The algorithms are motivated to some degree by specific data from projects on musical composition and autonomous fine art painting programs, but only in the sense that these projects use two specific types of auditory and visual cortex data. Hence, the architectures are presented for an artificial information processing system which utilizes two disparate sensory sources. The exact nature of the two primary sensory input streams is irrelevant.
NASA Astrophysics Data System (ADS)
Di Domenico, Giovanni; Zavattini, Guido; Cesca, Nicola; Auricchio, Natalia; Andritschke, Robert; Schopper, Florian; Kanbach, Gottfried
2007-02-01
We investigated with Monte Carlo simulations, using the EGSnrcMP code, the capabilities of a small animal PET scanner based on four stacks of double-sided silicon strip detectors. Each stack consists of 40 silicon detectors with dimensions of 60×60×1 mm³ and 128 orthogonal strips on each side. Two coordinates of the interaction are given by the strips, whereas the third coordinate is given by the detector number in the stack. The stacks are arranged to form a box of 5×5×6 cm³ with minor sides opened; the box represents the minimal FOV of the scanner. The performance parameters of the SiliPET scanner have been estimated, giving a (positron range limited) spatial resolution of 0.52 mm FWHM and an absolute sensitivity of 5.1% at the center of the system. Preliminary results of a proof-of-principle measurement done with the MEGA advanced Compton imager using a ≈1 mm diameter ²²Na source showed a focal ray tracing FWHM of 1 mm.
Ciobanu, O
2009-01-01
The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open-source software was used to prepare digitized 2D images of tissue sections and to create 3D reconstructions from the segmented structures. Finally, the 3D images were used in open-source software in order to perform biomechanical simulations. This study demonstrates the applicability and feasibility of open-source software available today for 3D reconstruction and biomechanical simulation. The use of open-source software may improve the efficiency of investments in imaging technologies and in CAD/CAM technologies for implant and prosthesis fabrication, which otherwise require expensive specialized software.
enoLOGOS: a versatile web tool for energy normalized sequence logos
Workman, Christopher T.; Yin, Yutong; Corcoran, David L.; Ideker, Trey; Stormo, Gary D.; Benos, Panayiotis V.
2005-01-01
enoLOGOS is a web-based tool that generates sequence logos from various input sources. Sequence logos have become a popular way to graphically represent DNA and amino acid sequence patterns from a set of aligned sequences. Each position of the alignment is represented by a column of stacked symbols with its total height reflecting the information content in this position. Currently, the available web servers are able to create logo images from a set of aligned sequences, but none of them generates weighted sequence logos directly from energy measurements or other sources. With the advent of high-throughput technologies for estimating the contact energy of different DNA sequences, tools that can create logos directly from binding affinity data are useful to researchers. enoLOGOS generates sequence logos from a variety of input data, including energy measurements, probability matrices, alignment matrices, count matrices and aligned sequences. Furthermore, enoLOGOS can represent the mutual information of different positions of the consensus sequence, a unique feature of this tool. Another web interface for our software, C2H2-enoLOGOS, generates logos for the DNA-binding preferences of the C2H2 zinc-finger transcription factor family members. enoLOGOS and C2H2-enoLOGOS are accessible over the web at . PMID:15980495
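The information-content weighting that determines column heights in a conventional sequence logo can be sketched as follows for the count-matrix case; the energy-based weighting that distinguishes enoLOGOS is not shown here.

```python
# Sketch of the standard information-content calculation behind sequence logos
# (count-matrix case, no small-sample correction).
import math

def column_heights(counts, alphabet="ACGT"):
    """Return per-letter stack heights (bits) for one alignment column."""
    total = sum(counts.values())
    probs = {a: counts.get(a, 0) / total for a in alphabet}
    entropy = -sum(p * math.log2(p) for p in probs.values() if p > 0)
    info = math.log2(len(alphabet)) - entropy      # at most 2 bits for DNA
    return {a: p * info for a, p in probs.items()}

if __name__ == "__main__":
    print(column_heights({"A": 7, "C": 1, "G": 1, "T": 1}))
```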
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lisanti, Mariangela; Mishra-Sharma, Siddharth; Rodd, Nicholas L.
Dark matter in the halos surrounding galaxy groups and clusters can annihilate to high-energy photons. Recent advancements in the construction of galaxy group catalogs provide many thousands of potential extragalactic targets for dark matter. In this paper, we outline a procedure to infer the dark matter signal associated with a given galaxy group. Applying this procedure to a catalog of sources, one can create a full-sky map of the brightest extragalactic dark matter targets in the nearby Universe (z≲0.03), supplementing sources of dark matter annihilation from within the local group. As with searches for dark matter in dwarf galaxies, these extragalactic targets can be stacked together to enhance the signals associated with dark matter. We validate this procedure on mock Fermi gamma-ray data sets using a galaxy catalog constructed from the DarkSky N-body cosmological simulation and demonstrate that the limits are robust, at O(1) levels, to systematic uncertainties on halo mass and concentration. We also quantify other sources of systematic uncertainty arising from the analysis and modeling assumptions. Lastly, our results suggest that a stacking analysis using galaxy group catalogs provides a powerful opportunity to discover extragalactic dark matter and complements existing studies of Milky Way dwarf galaxies.
NASA Astrophysics Data System (ADS)
Lisanti, Mariangela; Mishra-Sharma, Siddharth; Rodd, Nicholas L.; Safdi, Benjamin R.; Wechsler, Risa H.
2018-03-01
Dark matter in the halos surrounding galaxy groups and clusters can annihilate to high-energy photons. Recent advancements in the construction of galaxy group catalogs provide many thousands of potential extragalactic targets for dark matter. In this paper, we outline a procedure to infer the dark matter signal associated with a given galaxy group. Applying this procedure to a catalog of sources, one can create a full-sky map of the brightest extragalactic dark matter targets in the nearby Universe (z ≲ 0.03), supplementing sources of dark matter annihilation from within the local group. As with searches for dark matter in dwarf galaxies, these extragalactic targets can be stacked together to enhance the signals associated with dark matter. We validate this procedure on mock Fermi gamma-ray data sets using a galaxy catalog constructed from the DarkSky N-body cosmological simulation and demonstrate that the limits are robust, at O(1) levels, to systematic uncertainties on halo mass and concentration. We also quantify other sources of systematic uncertainty arising from the analysis and modeling assumptions. Our results suggest that a stacking analysis using galaxy group catalogs provides a powerful opportunity to discover extragalactic dark matter and complements existing studies of Milky Way dwarf galaxies.
NASA Astrophysics Data System (ADS)
MacLean, L. S.; Romanowicz, B. A.; French, S.
2015-12-01
Seismic wavefield computations using the Spectral Element Method are now regularly used to recover tomographic images of the upper mantle and crust at the local, regional, and global scales (e.g. Fichtner et al., GJI, 2009; Tape et al., Science, 2010; Lekic and Romanowicz, GJI, 2011; French and Romanowicz, GJI, 2014). However, the computational cost remains a challenge and contributes to limiting the resolution of the produced images. Source stacking, as suggested by Capdeville et al. (GJI, 2005), can considerably speed up the process by reducing the wavefield computations to only one per set of N sources. This method was demonstrated through synthetic tests on low frequency datasets, and therefore should work for global mantle tomography. However, the large-amplitude surface waves dominate the stacked seismograms and can no longer be separated by windowing in the time domain. We have developed a processing approach that helps address this issue and demonstrate its usefulness through a series of synthetic tests performed at long periods (T > 60 s) on toy upper mantle models. The summed synthetics are computed using the CSEM code (Capdeville et al., 2002). As for the inverse part of the procedure, we use a quasi-Newton method, computing Frechet derivatives and Hessian using normal mode perturbation theory.
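The computational saving of source stacking follows from the linearity of wave propagation: simulating the sum of N source time functions in a single run yields the same record as summing N individual simulations. A toy sketch with a stand-in linear operator illustrates this.

```python
# Illustration of why source stacking saves forward simulations: for a linear
# solver, the wavefield of the summed sources equals the sum of the individual
# wavefields, so one run replaces N runs (a random matrix stands in for the solver).
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_samples = 5, 1000
sources = rng.standard_normal((n_sources, n_samples))       # N source time functions
green = rng.standard_normal((n_samples, n_samples)) * 0.01  # stand-in linear operator

individual = np.array([src @ green for src in sources])     # N forward runs
stacked_once = sources.sum(axis=0) @ green                  # one run on the summed source

print(np.allclose(individual.sum(axis=0), stacked_once))    # True, by linearity
```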
Vacuum MOCVD fabrication of high-efficiency cells
NASA Technical Reports Server (NTRS)
Partain, L. D.; Fraas, L. M.; Mcleod, P. S.; Cape, J. A.
1985-01-01
Vacuum metal-organic chemical-vapor-deposition (MOCVD) is a new fabrication process with improved safety and easier scalability due to its metal rather than glass construction and its uniform multiport gas injection system. It uses source materials more efficiently than other methods because the vacuum molecular flow conditions allow the high sticking coefficient reactants to reach the substrates as undeflected molecular beams, and the hot chamber walls cause the low sticking coefficient reactants to bounce off the walls and interact with the substrates many times. This high source utilization reduces the materials cost per device and substantially decreases the amounts of toxic materials that must be handled as process effluents. The molecular beams allow precise growth control. With improved source purification, vacuum MOCVD has provided p-GaAs layers with 10-micron minority carrier diffusion lengths and GaAs and GaAsSb solar cells with 20% AM0 efficiencies at 59X and 99X sunlight concentration ratios. Mechanical stacking has been identified as the quickest, most direct and logical path to stacked multiple-junction solar cells that perform better than the best single-junction devices. The mechanical stack is configured for immediate use in solar arrays and allows interconnections that improve the system end-of-life performance in space.
Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion
NASA Astrophysics Data System (ADS)
Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.
2017-03-01
Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small image patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.
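An illustrative and deliberately simplified cross-epoch consistency score in the spirit of the generalized cross-matching described above is sketched below; it combines a flux-match and a position-match term about inverse-variance-weighted means and is not the paper's exact marginal-likelihood computation.

```python
# Illustrative "generalized cross-matching" score (NOT the paper's marginal
# likelihood): per-epoch flux and position estimates are scored by their
# consistency with a single underlying object.
import numpy as np

def consistency(values, sigmas):
    """Chi-square of per-epoch estimates about their weighted mean (lower = more consistent)."""
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.average(values, weights=w)
    return float(np.sum(w * (np.asarray(values, dtype=float) - mean) ** 2))

if __name__ == "__main__":
    flux = [1.1, 0.9, 1.3, 1.0]                  # per-epoch flux estimates (hypothetical units)
    sig_f = [0.3, 0.3, 0.4, 0.3]
    ra = [10.0001, 10.0002, 9.9999, 10.0000]     # per-epoch positions, degrees
    sig_p = [2e-4, 2e-4, 3e-4, 2e-4]
    score = consistency(flux, sig_f) + consistency(ra, sig_p)
    print(round(score, 2))                       # small values favor the object-present hypothesis
```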
Cánovas, Carlos Ruiz; Macías, Francisco; Pérez López, Rafael; Nieto, José Miguel
2018-03-15
This paper investigates the mobility and fluxes of REE, Y and Sc under weathering conditions from an anomalously metal-rich phosphogypsum stack in SW Spain. The interactions of the phosphogypsum stack with rainfall and organic matter-rich solutions, simulating the weathering processes observed due to its location on salt-marshes, were simulated by leaching tests (e.g. EN 12457-2 and TCLP). Despite the high concentration of REE, Y and Sc contained in the phosphogypsum stack, their mobility during the leaching tests was very low; <0.66% and 1.8% of the total content of these elements were released during both tests. Chemical and mineralogical evidence suggests that phosphate minerals may act as sources of REE and Y in the phosphogypsum stack while fluoride minerals may act as sinks, controlling their mobility. REE fractionation processes were identified in the phosphogypsum stack; a depletion of LREE in the saturated zone was identified, due probably to the dissolution of secondary LREE phosphates previously formed during apatite dissolution in the industrial process. Thus, the vadose zone of the stack would preserve the original REE signature of phosphate rocks. On the other hand, an enrichment of MREE relative to HREE in edge outflows is observed, due to the higher influence of estuarine waters on the leaching process of the phosphogypsum stack. Despite the low mobility of REE, Y and Sc in the phosphogypsum, around 104 kg/yr of REE and 40 kg/yr of Y and Sc are released from the stack to the estuary, which may imply an environmental concern. The information obtained in this study could be used to optimize extraction methods aimed at recovering REE, Y and Sc from phosphogypsum, mitigating pollution to the environment. Copyright © 2017 Elsevier B.V. All rights reserved.
A Study of Clinically Related Open Source Software Projects
Hogarth, Michael A.; Turner, Stuart
2005-01-01
Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056
Combined Final Report for Colony II Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kale, Laxmikant; Jones, Terry; Moreira, Jose
2013-10-23
(This report was originally submitted by the lead PI (Terry Jones, ORNL) on October 22, 2013 to the program manager, Lucy Nowell. It is being submitted from the University of Illinois in accordance with instructions.) HPC Colony II seeks to provide portable performance for leadership class machines. Our strategy is based on adaptive system software that aims to make the intelligent decisions necessary to allow domain scientists to safely focus on their task at hand and allow the system software stack to adapt their application to the underlying architecture. This report describes the research undertaken towards these objectives and the results obtained over the performance period of the project.
Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software
ERIC Educational Resources Information Center
Hemphill, Thomas A.
2005-01-01
This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Pugh, T.; Wyborn, L. A.; Porter, D.; Allen, C.; Smillie, J.; Antony, J.; Trenham, C.; Evans, B. J.; Beckett, D.; Erwin, T.; King, E.; Hodge, J.; Woodcock, R.; Fraser, R.; Lescinsky, D. T.
2014-12-01
The National Computational Infrastructure (NCI) has co-located a priority set of national data assets within a HPC research platform. This powerful in-situ computational platform has been created to help serve and analyse the massive amounts of data across the spectrum of environmental collections - in particular the climate, observational data and geoscientific domains. This paper examines the infrastructure, innovation and opportunity for this significant research platform. NCI currently manages nationally significant data collections (10+ PB) categorised as 1) earth system sciences, climate and weather model data assets and products, 2) earth and marine observations and products, 3) geosciences, 4) terrestrial ecosystem, 5) water management and hydrology, and 6) astronomy, social science and biosciences. The data is largely sourced from the NCI partners (who include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. By co-locating these large valuable data assets, new opportunities have arisen by harmonising the data collections, making a powerful transdisciplinary research platform. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. New scientific software, cloud-scale techniques, server-side visualisation and data services have been harnessed and integrated into the platform, so that analysis is performed seamlessly across the traditional boundaries of the underlying data domains. Characterisation of the techniques along with performance profiling ensures scalability of each software component, all of which can either be enhanced or replaced through future improvements. A Development-to-Operations (DevOps) framework has also been implemented to manage the scale of the software complexity alone. This ensures that software is both upgradable and maintainable, and can be readily reused with complexly integrated systems and become part of the growing global trusted community tools for cross-disciplinary research.
The HydroServer Platform for Sharing Hydrologic Data
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.
2010-12-01
The CUAHSI Hydrologic Information System (HIS) is an internet-based system that supports sharing of hydrologic data. HIS consists of databases connected using the Internet through Web services, as well as software for data discovery, access, and publication. The HIS system architecture comprises servers for publishing and sharing data, a centralized catalog to support cross-server data discovery, and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed point monitoring sites as well as spatially distributed GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards-based approach to data publication, relying on accepted and emerging standards for data storage and transfer. CUAHSI-developed HydroServer code is free, with community code development managed through the codeplex open-source code repository and development system. There is some reliance on widely used commercial software for general purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its parts in advancing hydrologic research. Details of the CUAHSI HIS can be found at http://his.cuahsi.org, and the HydroServer codeplex site at http://hydroserver.codeplex.com.
TagDigger: user-friendly extraction of read counts from GBS and RAD-seq data.
Clark, Lindsay V; Sacks, Erik J
2016-01-01
In genotyping-by-sequencing (GBS) and restriction site-associated DNA sequencing (RAD-seq), read depth is important for assessing the quality of genotype calls and estimating allele dosage in polyploids. However, existing pipelines for GBS and RAD-seq do not provide read counts in formats that are both accurate and easy to access. Additionally, although existing pipelines allow previously-mined SNPs to be genotyped on new samples, they do not allow the user to manually specify a subset of loci to examine. Pipelines that do not use a reference genome assign arbitrary names to SNPs, making meta-analysis across projects difficult. We created the software TagDigger, which includes three programs for analyzing GBS and RAD-seq data. The first script, tagdigger_interactive.py, rapidly extracts read counts and genotypes from FASTQ files using user-supplied sets of barcodes and tags. Input and output is in CSV format so that it can be opened by spreadsheet software. Tag sequences can also be imported from the Stacks, TASSEL-GBSv2, TASSEL-UNEAK, or pyRAD pipelines, and a separate file can be imported listing the names of markers to retain. A second script, tag_manager.py, consolidates marker names and sequences across multiple projects. A third script, barcode_splitter.py, assists with preparing FASTQ data for deposit in a public archive by splitting FASTQ files by barcode and generating MD5 checksums for the resulting files. TagDigger is open-source and freely available software written in Python 3. It uses a scalable, rapid search algorithm that can process over 100 million FASTQ reads per hour. TagDigger will run on a laptop with any operating system, does not consume hard drive space with intermediate files, and does not require programming skill to use.
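The core barcode/tag counting idea can be illustrated with a toy sketch; this is not TagDigger's algorithm or file handling, just a minimal example of assigning reads to (sample, marker) pairs by matching a barcode followed by a tag sequence.

```python
# Toy sketch of barcode/tag counting (not TagDigger's implementation): count reads
# whose start matches a sample barcode immediately followed by a marker tag.
from collections import Counter

def count_tags(reads, barcodes, tags):
    counts = Counter()
    for read in reads:
        for sample, bc in barcodes.items():
            if read.startswith(bc):
                rest = read[len(bc):]
                for marker, tag in tags.items():
                    if rest.startswith(tag):
                        counts[(sample, marker)] += 1
    return counts

if __name__ == "__main__":
    barcodes = {"sample1": "ACGT", "sample2": "TGCA"}         # hypothetical barcodes
    tags = {"marker1": "TTAGG", "marker2": "CCAAT"}           # hypothetical tag sequences
    reads = ["ACGTTTAGGGCATT", "ACGTCCAATGGATC", "TGCATTAGGACGTA"]
    for key, n in count_tags(reads, barcodes, tags).items():
        print(key, n)   # e.g. ('sample1', 'marker1') 1
```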
The growing need for microservices in bioinformatics.
Williams, Christopher L; Sica, Jeffrey C; Killen, Robert T; Balis, Ulysses G J
2016-01-01
Within the information technology (IT) industry, best practices and standards are constantly evolving and being refined. In contrast, computer technology utilized within the healthcare industry often evolves at a glacial pace, with reduced opportunities for justified innovation. Although the use of timely technology refreshes within an enterprise's overall technology stack can be costly, thoughtful adoption of select technologies with a demonstrated return on investment can be very effective in increasing productivity and, at the same time, reducing the burden of maintenance often associated with older and legacy systems. In this brief technical communication, we introduce the concept of microservices as applied to the ecosystem of data analysis pipelines. Microservice architecture is a framework for dividing complex systems into easily managed parts. Each individual service is limited in functional scope, thereby conferring a higher measure of functional isolation and reliability to the collective solution. Moreover, maintenance challenges are greatly simplified by virtue of the reduced architectural complexity of each constitutive module. This fact notwithstanding, rendered overall solutions utilizing a microservices-based approach provide equal or greater levels of functionality as compared to conventional programming approaches. Bioinformatics, with its ever-increasing demand for performance and new testing algorithms, is the perfect use-case for such a solution. Moreover, if promulgated within the greater development community as an open-source solution, such an approach holds potential to be transformative to current bioinformatics software development. Bioinformatics relies on a nimble IT framework that can adapt to changing requirements. This communication presents a well-established software design and deployment strategy as a solution for current challenges within bioinformatics. Use of the microservices framework is an effective methodology for the fabrication and implementation of reliable and innovative software, made possible in a highly collaborative setting.
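A microservice in this sense is simply a small, narrowly scoped network service; a minimal, framework-free sketch (hypothetical endpoint, standard library only) might look like the following.

```python
# A minimal single-purpose service of the kind the microservice pattern favors
# (illustrative only; the endpoint and logic are hypothetical).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class GCContentService(BaseHTTPRequestHandler):
    """One narrowly scoped endpoint: GC content of a sequence passed as /gc/<seq>."""
    def do_GET(self):
        if self.path.startswith("/gc/"):
            seq = self.path[len("/gc/"):].upper()
            gc = sum(base in "GC" for base in seq) / max(len(seq), 1)
            body = json.dumps({"sequence": seq, "gc_content": round(gc, 3)}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), GCContentService).serve_forever()
```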
The growing need for microservices in bioinformatics
Williams, Christopher L.; Sica, Jeffrey C.; Killen, Robert T.; Balis, Ulysses G. J.
2016-01-01
Objective: Within the information technology (IT) industry, best practices and standards are constantly evolving and being refined. In contrast, computer technology utilized within the healthcare industry often evolves at a glacial pace, with reduced opportunities for justified innovation. Although the use of timely technology refreshes within an enterprise's overall technology stack can be costly, thoughtful adoption of select technologies with a demonstrated return on investment can be very effective in increasing productivity and, at the same time, reducing the burden of maintenance often associated with older and legacy systems. In this brief technical communication, we introduce the concept of microservices as applied to the ecosystem of data analysis pipelines. Microservice architecture is a framework for dividing complex systems into easily managed parts. Each individual service is limited in functional scope, thereby conferring a higher measure of functional isolation and reliability to the collective solution. Moreover, maintenance challenges are greatly simplified by virtue of the reduced architectural complexity of each constitutive module. This fact notwithstanding, rendered overall solutions utilizing a microservices-based approach provide equal or greater levels of functionality as compared to conventional programming approaches. Bioinformatics, with its ever-increasing demand for performance and new testing algorithms, is the perfect use-case for such a solution. Moreover, if promulgated within the greater development community as an open-source solution, such an approach holds potential to be transformative to current bioinformatics software development. Context: Bioinformatics relies on a nimble IT framework which can adapt to changing requirements. Aims: To present a well-established software design and deployment strategy as a solution for current challenges within bioinformatics. Conclusions: Use of the microservices framework is an effective methodology for the fabrication and implementation of reliable and innovative software, made possible in a highly collaborative setting. PMID:27994937
Scalable Technology for a New Generation of Collaborative Applications
2007-04-01
of the International Symposium on Distributed Computing (DISC), Cracow, Poland, September 2005. Classic Paxos vs. Fast Paxos: Caveat Emptor, Flavio...grou or able and fast multicast primitive to layer under high-level latency across dimensions as varied as group size [10, 17], abstractions such as...servers, networked via fast, dedicated interconnects. The system to subscribe to a fraction of the equities on the software stack running on a single
GOATS 2005 Integrated, Adaptive Autonomous Acoustic Sensing Systems
2008-09-30
the MOOS-IvP autonomy software suite to support the rapidly growing application community. In addition a structure, nested repository has been...priority. Thus, track messages (when available) are sent most often, but eventually the priority of the status message will grow high enough to get a...data throughput over the old communications stack. (Figure 1: Real-time topside display of BTR data transmitted from Unicorn BF21.)
ERIC Educational Resources Information Center
Pfaffman, Jay
2008-01-01
Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…
A novel application of dielectric stack actuators: a pumping micromixer
NASA Astrophysics Data System (ADS)
Solano-Arana, Susana; Klug, Florian; Mößinger, Holger; Förster-Zügel, Florentine; Schlaak, Helmut F.
2018-07-01
The fabrication of pumping micromixers as a novel application of dielectric stack actuators is proposed in this work. DEA micromixers can be valuable for medical and pharmaceutical applications due to: firstly, the biocompatibility of the materials used (PDMS and graphite); secondly, the pumping is done with peristaltic movements, allowing only the walls of the channel to be in contact with the liquid, avoiding possible contamination from external parts; and thirdly, the low flow velocity required of the micromixers in many applications. The micromixer based on peristaltic movements will not only mix, but also pump the fluids into and out of the device. The developed device is a hybrid micromixer: active, because it needs a voltage source to enhance the quality and speed of the mixing; and passive, with a shape similar to the well-known Y-type micromixers. The proposed micromixer is based on twelve stack actuators distributed over two pumping chambers, each consisting of four stack actuators in series, and a mixing chamber made of four consecutive stack actuators, with 30 layers per stack. The DEA micromixer is able to mix two solutions with a flow rate of 21.5 μl min⁻¹ at the outlet, applying 1500 V at 10 Hz and actuating two actuators at a time.
Equilibrium chemical vapor deposition growth of Bernal-stacked bilayer graphene.
Zhao, Pei; Kim, Sungjin; Chen, Xiao; Einarsson, Erik; Wang, Miao; Song, Yenan; Wang, Hongtao; Chiashi, Shohei; Xiang, Rong; Maruyama, Shigeo
2014-11-25
Using ethanol as the carbon source, self-limiting growth of AB-stacked bilayer graphene (BLG) has been achieved on Cu via an equilibrium chemical vapor deposition (CVD) process. We found that during this alcohol catalytic CVD (ACCVD) a source-gas pressure range exists that breaks the self-limitation of monolayer graphene on Cu, and at a certain equilibrium state it prefers to form uniform BLG with a high surface coverage of ∼94% and an AB-stacking ratio of nearly 100%. More importantly, once the BLG is completed, the growth proceeds in a self-limiting manner, and an extended ethanol flow time does not result in additional layers. We investigate the mechanism of this equilibrium BLG growth using isotopically labeled (13)C-ethanol and selective surface aryl functionalization, and the results reveal that during the equilibrium ACCVD process a continuous substitution of graphene flakes occurs in the as-formed graphene and that the BLG growth follows a layer-by-layer epitaxy mechanism. These phenomena contrast significantly with those observed for previously reported BLG growth using methane as the precursor.
NASA Technical Reports Server (NTRS)
Hall, D. H.; Millar, T. W.; Noble, I. A.
1985-01-01
A modeling technique using spherical shell elements and equivalent dipole sources has been applied to Magsat signatures at the Churchill-Superior boundary in Manitoba, Ontario, and Ungava. A large satellite magnetic anomaly (12 nT amplitude) on POGO and Magsat maps near the Churchill-Superior boundary was found to be related to the Richmond Gulf aulacogen. The averaged crustal magnetization in the source region is 5.2 A/m. Stacking of the magnetic traces from Magsat passes reveals a magnetic signature (10 nT amplitude) at the Churchill-Superior boundary in an area studied between 80 deg W and 98 deg W. Modeling suggests a steplike thickening of the crust on the Churchill side of the boundary in a layer with a magnetization of 5 A/m. Signatures on aeromagnetic maps are also found in the source areas for both of these satellite anomalies.
Pregger, Thomas; Friedrich, Rainer
2009-02-01
Emission data needed as input for atmospheric models should not only be spatially and temporally resolved; another important feature is the effective emission height, which significantly influences modelled concentration values. Unfortunately this information, which is especially relevant for large point sources, is usually not available, and simple assumptions are often used in atmospheric models. As a contribution to improving knowledge on emission heights, this paper provides typical default values for the driving parameters stack height and flue gas temperature, velocity, and flow rate for different industrial sources. The results were derived from an analysis of probably the most comprehensive database of real-world stack information existing in Europe, based on German industrial data. A bottom-up calculation of effective emission heights applying the equations used for Gaussian dispersion models shows significant differences depending on source and air pollutant, as well as in comparison to the approaches currently used for atmospheric transport modelling.
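The dispersion-model equations are not reproduced in the abstract; as a stand-in, the sketch below estimates an effective emission height from the same driving parameters (stack height, flue gas temperature, exit velocity) using the widely used Briggs final plume-rise formulas, which is an assumption rather than the paper's exact method.

# Hedged sketch: effective emission height from stack parameters via Briggs
# final plume rise (buoyancy-dominated case). Inputs: physical stack height
# hs (m), exit velocity vs (m/s), stack diameter d (m), flue gas temperature
# Ts (K), ambient temperature Ta (K), wind speed u (m/s).
G = 9.81

def effective_height(hs, vs, d, Ts, Ta, u):
    fb = G * vs * d**2 / 4.0 * (Ts - Ta) / Ts      # buoyancy flux, m^4 s^-3
    if fb < 55.0:
        dh = 21.425 * fb**0.75 / u
    else:
        dh = 38.71 * fb**0.6 / u
    return hs + dh

# Example: 100 m stack, 10 m/s exit velocity, 2 m diameter, 400 K flue gas,
# 288 K ambient air, 5 m/s wind speed.
print(round(effective_height(100.0, 10.0, 2.0, 400.0, 288.0, 5.0), 1))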
Toward Wireless Health Monitoring via an Analog Signal Compression-Based Biosensing Platform.
Zhao, Xueyuan; Sadhu, Vidyasagar; Le, Tuan; Pompili, Dario; Javanmard, Mehdi
2018-06-01
Wireless all-analog biosensor design for concurrent microfluidic and physiological signal monitoring is presented in this paper. The key component is an all-analog circuit capable of compressing two analog sources into one analog signal by analog joint source-channel coding (AJSCC). Two circuit designs are discussed: the stacked voltage-controlled voltage source (VCVS) design with a fixed number of levels, and an improved design which supports a flexible number of AJSCC levels. Experimental results are presented for the wireless biosensor prototype, composed of printed circuit board realizations of the stacked-VCVS design. Furthermore, circuit simulation and wireless link simulation results are presented for the improved design. Results indicate that the proposed wireless biosensor is well suited to sensing two biological signals simultaneously with high accuracy, and can be applied to a wide variety of low-power and low-cost wireless continuous health monitoring applications.
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2013 CFR
2013-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2010 CFR
2010-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2014 CFR
2014-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2011 CFR
2011-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2012 CFR
2012-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
Hardware for dynamic quantum computing.
Ryan, Colm A; Johnson, Blake R; Ristè, Diego; Donovan, Brian; Ohki, Thomas A
2017-10-01
We describe the hardware, gateware, and software developed at Raytheon BBN Technologies for dynamic quantum information processing experiments on superconducting qubits. In dynamic experiments, real-time qubit state information is fed back or fed forward within a fraction of the qubits' coherence time to dynamically change the implemented sequence. The hardware presented here covers both control and readout of superconducting qubits. For readout, we created a custom signal processing gateware and software stack on commercial hardware to convert pulses in a heterodyne receiver into qubit state assignments with minimal latency, alongside data taking capability. For control, we developed custom hardware with gateware and software for pulse sequencing and steering information distribution that is capable of arbitrary control flow in a fraction of superconducting qubit coherence times. Both readout and control platforms make extensive use of field programmable gate arrays to enable tailored qubit control systems in a reconfigurable fabric suitable for iterative development.
A Software Suite for Testing SpaceWire Devices and Networks
NASA Astrophysics Data System (ADS)
Mills, Stuart; Parkes, Steve
2015-09-01
SpaceWire is a data-handling network for use on-board spacecraft, which connects together instruments, mass-memory, processors, downlink telemetry, and other on-board sub-systems. SpaceWire is simple to implement and has some specific characteristics that help it support data-handling applications in space: high speed, low power, simplicity, relatively low implementation cost, and architectural flexibility, making it ideal for many space missions. SpaceWire provides high-speed (2 Mbit/s to 200 Mbit/s), bi-directional, full-duplex data links, which connect together SpaceWire-enabled equipment. Data-handling networks can be built to suit particular applications using point-to-point data links and routing switches. STAR-Dundee’s STAR-System software stack has been designed to meet the needs of engineers designing and developing SpaceWire networks and devices. This paper describes the aims of the software and how those needs were met.
Reuse and Interoperability of Avionics for Space Systems
NASA Technical Reports Server (NTRS)
Hodson, Robert F.
2007-01-01
The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved more towards a standard product line approach to avionics, the space industry still lacks similar standards and common practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable interoperable avionics are discussed.
Revisiting the Ceara Rise, equatorial Atlantic Ocean: isotope stratigraphy of ODP Leg 154
NASA Astrophysics Data System (ADS)
Wilkens, Roy; Drury, Anna Joy; Westerhold, Thomas; Lyle, Mitchell; Gorgas, Thomas; Tian, Jun
2017-04-01
Isotope stratigraphy has become the method of choice for investigating both past ocean temperatures and global ice volume. Lisiecki and Raymo (2005) published a stacked record of 57 globally distributed benthic δ18O records versus age (the LR04 stack). In this study LR04 is compared to high resolution records collected at all of the sites drilled during Ocean Drilling Program (ODP) Leg 154 on the Ceara Rise, in the western equatorial Atlantic Ocean. Newly developed software - the Code for Ocean Drilling Data (CODD) - is used to check the data splices of the Ceara sites and better align out-of-splice data with in-splice data. CODD allows depth- and age-scaled core images to be recovered from core table photos, enormously facilitating data analysis. The entire splices of ODP Sites 925, 926, 927, 928 and 929 were reviewed. Most changes were minor, although several were large enough to affect age models based on orbital tuning. We revised the astronomically tuned age model for the Ceara Rise by tuning darker, more clay-rich layers to Northern Hemisphere insolation minima. We then assembled a regional composite benthic stable isotope record from published data. This Ceara Rise stack provides a new regional reference section for the equatorial Atlantic covering the last 5 million years, with an age model independent of the non-linear ice volume models of the LR04 stack. Comparison shows that the benthic δ18O composite is consistent with the LR04 stack from 0 - 4 Ma, except for a short interval between 1.80 and 1.90 Ma, where LR04 exhibits 2 maxima but the Ceara Rise record contains only 1. The interval between 4.0 and 4.5 Ma in the Ceara Rise compilation is decidedly different from LR04, reflecting both the low amplitude of the signal over this interval and the limited amount of data available for the LR04 stack. Our results also point out that precession cycles have been misinterpreted as obliquity in the LR04 stack, as suggested by the Ceara Rise composite at 4.2 Ma.
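As a reminder of what "stacking" means in this context, the sketch below averages several age-aligned benthic δ18O records on a common age axis; this is a generic illustration, not the CODD workflow, and the resolution and variable names are assumptions.

import numpy as np

def stack_records(records, age_axis):
    # records: list of (age, d18o) array pairs, each already on its own age
    # model with ages increasing; all are interpolated onto age_axis (kyr).
    aligned = np.vstack([np.interp(age_axis, age, d18o) for age, d18o in records])
    return aligned.mean(axis=0), aligned.std(axis=0)

ages = np.arange(0.0, 5000.0, 1.0)   # 0-5 Ma at 1 kyr steps (hypothetical)
# stack_mean, stack_spread = stack_records([(age1, d18o1), (age2, d18o2)], ages)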
Nurturing reliable and robust open-source scientific software
NASA Astrophysics Data System (ADS)
Uieda, L.; Wessel, P.
2017-12-01
Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article. A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo (zenodo.org). However, citations to these sources are not always recognized when computing citation metrics. In summary, the widespread development of reliable and robust open-source software relies on the creation of formal training programs in software development best practices and the recognition of software as a valid form of scholarship.
Determinants of quality, latency, and amount of Stack Overflow answers about recent Android APIs
Kavaler, David; Filkov, Vladimir
2018-01-01
Stack Overflow is a popular crowdsourced question and answer website for programming-related issues. It is an invaluable resource for software developers; on average, questions posted there get answered in minutes to an hour. Questions about well-established topics, e.g., the coercion operator in C++, or the difference between canonical and class names in Java, get asked often in one form or another, and answered very quickly. On the other hand, questions on previously unseen or niche topics take a while to get a good answer. This is particularly the case with questions about current updates to or the introduction of new application programming interfaces (APIs). In a hyper-competitive online market, getting good answers to current programming questions sooner could increase the chances of an app getting released and used. So, can developers do anything to hasten the arrival of good answers to questions about new APIs? Here, we empirically study Stack Overflow questions pertaining to new Android APIs and their associated answers. We contrast the interest in these questions, their answer quality, and the timeliness of their answers with those of questions about old APIs. We find that Stack Overflow answerers in general prioritize with respect to currentness: questions about new APIs do get more answers, but good-quality answers take longer. We also find that incentives in terms of question bounties, if used appropriately, can significantly shorten the time and increase answer quality. Interestingly, no operationalization of bounty amount shows significance in our models. In practice, our findings confirm the value of bounties in enhancing expert participation. In addition, they show that the Stack Overflow style of crowdsourcing, for all its glory in providing answers about established programming knowledge, is less effective with new API questions. PMID:29547620
Piestrup, Melvin A.; Boyers, David G.; Pincus, Cary I.; Maccagno, Pierre
1990-01-01
An intense, relatively inexpensive X-ray source (as compared to a synchrotron emitter) for technological, scientific, and spectroscopic purposes. A conical radiation pattern produced by a single foil or stack of foils is focused by optics to increase the intensity of the radiation at a distance from the conical radiator.
Walter, Carl E.; Van Konynenburg, Richard; VanSant, James H.
1992-01-01
An isotopic heat source is formed using stacks of thin individual layers of a refractory isotopic fuel, preferably thulium oxide, alternating with layers of a low atomic weight diluent, preferably graphite. The graphite serves several functions: to act as a moderator during neutron irradiation, to minimize bremsstrahlung radiation, and to facilitate heat transfer. The fuel stacks are inserted into a heat block, which is encased in a sealed, insulated and shielded structural container. Heat pipes are inserted in the heat block and contain a working fluid. The heat pipe working fluid transfers heat from the heat block to a heat exchanger for power conversion. Single phase gas pressure controls the flow of the working fluid for maximum heat exchange and to provide passive cooling.
Method for using global optimization to the estimation of surface-consistent residual statics
Reister, David B.; Barhen, Jacob; Oblow, Edward M.
2001-01-01
An efficient method for generating residual statics corrections to compensate for surface-consistent static time shifts in stacked seismic traces. The method includes a step of framing the residual static corrections as a global optimization problem in a parameter space. The method also includes decoupling the global optimization problem involving all seismic traces into several one-dimensional problems. The method further utilizes a Stochastic Pijavskij Tunneling search to eliminate regions in the parameter space where a global minimum is unlikely to exist so that the global minimum may be quickly discovered. The method finds the residual statics corrections by maximizing the total stack power. The stack power is a measure of seismic energy transferred from energy sources to receivers.
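For orientation, the quantity being maximized can be written down compactly: the power of the stacked trace after applying per-trace static shifts. The sketch below uses integer-sample circular shifts as a simplification; the patent's optimizer (Stochastic Pijavskij Tunneling) would search over the shift vector that maximizes this value.

import numpy as np

def stack_power(traces, shifts):
    # traces: 2-D array (n_traces, n_samples); shifts: per-trace statics in
    # integer samples (circular shift used here purely for illustration).
    stacked = np.zeros(traces.shape[1])
    for trace, s in zip(traces, shifts):
        stacked += np.roll(trace, -int(s))
    return float(np.sum(stacked ** 2))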
Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne
2011-09-28
Innovation through an open source model has proven to be successful for software development. This success has led many to speculate whether open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model, as well as the effect of patents.
An Analysis of Open Source Security Software Products Downloads
ERIC Educational Resources Information Center
Barta, Brian J.
2014-01-01
Despite the continued demand for open source security software, a gap persists in the identification of factors related to the success of open source security software. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…
AIC-based diffraction stacking for local earthquake locations at the Sumatran Fault (Indonesia)
NASA Astrophysics Data System (ADS)
Hendriyana, Andri; Bauer, Klaus; Muksin, Umar; Weber, Michael
2018-05-01
We present a new workflow for the localization of seismic events which is based on a diffraction stacking approach. In order to address the effects of complex source radiation patterns, we suggest computing the diffraction stack from a characteristic function (CF) instead of stacking the original waveform data. A new CF, referred to in the following as mAIC (modified from the Akaike Information Criterion), is proposed. We demonstrate that both P- and S-wave onsets can be detected accurately. To avoid cross-talk between P and S waves due to inaccurate velocity models, we separate the P and S waves from the mAIC function by making use of polarization attributes. The final image function is then represented by the largest eigenvalue resulting from the covariance analysis of the P- and S-image functions. Results from synthetic experiments show that the proposed diffraction stacking provides reliable results. The workflow of the diffraction stacking method was finally applied to local earthquake data from Sumatra, Indonesia. Recordings from a temporary network of 42 stations deployed for nine months around the Tarutung pull-apart basin were analysed. The seismic event locations resulting from the diffraction stacking method align along a segment of the Sumatran Fault. A more complex distribution of seismicity is imaged within and around the Tarutung basin. Two lineaments striking N-S were found in the centre of the Tarutung basin, which support independent results from structural geology.
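The mAIC characteristic function itself is not given in the abstract; the sketch below shows the standard Akaike-Information-Criterion onset picker that it modifies, so only the general family of characteristic functions is illustrated here.

import numpy as np

def aic_cf(x):
    # Classic AIC picker: AIC(k) = k*log(var(x[:k])) + (n-k-1)*log(var(x[k:]))
    # The global minimum marks the most likely P- or S-wave onset sample.
    x = np.asarray(x, dtype=float)
    n = len(x)
    aic = np.full(n, np.nan)
    for k in range(1, n - 1):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0.0 and v2 > 0.0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return aic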
Open Source Molecular Modeling
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-01-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126
X-ray Obscured AGN in the GOODS-N
NASA Astrophysics Data System (ADS)
Georgantopoulos, I.; Akylas, A.; Rovilos, E.; Xilouris, E.
2010-07-01
We explore the X-ray properties of Dust Obscured Galaxies (DOGs), i.e. sources with f24μm/fR > 1000. This population has been proposed to contain a significant fraction of Compton-thick sources at high redshift. In particular we study the X-ray spectra of the 14 DOGs detected in the CDFN 2 Ms exposure. Their stacked spectrum is flat, with Γ = 1 ± 0.1, very similar to the stacked spectrum of the undetected DOGs (Γ = 0.8 ± 0.2). However, most of our X-ray detected DOGs present only moderate absorption, with column densities 10^22 < NH < 10^24 cm^-2. Only three sources (20%) present very flat spectra and are probably associated with reflection-dominated Compton-thick sources. Our finding is rather at odds with papers which claim that the vast majority of DOGs are associated with Compton-thick sources. In any case, such sources at high redshift (z > 2) are of limited interest for the X-ray background: population synthesis models predict a contribution of the z > 2 Compton-thick AGN to the X-ray background flux at 30 keV of less than 1 percent.
NASA Astrophysics Data System (ADS)
Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad
2016-12-01
3D printing of the mandible is important for pre-operative planning and diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and the 3D models of the mandible were reconstructed using both the commercial Materialise Mimics and the open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. Both models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were found between the 3D models of the mandible produced using the Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that from the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise operational cost.
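A sketch of the surface-comparison step described above: the symmetric Hausdorff distance between two vertex sets sampled from the exported STL models. The point arrays and their size are assumptions; only the use of SciPy's directed Hausdorff routine is shown.

import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff(points_a, points_b):
    # points_a, points_b: (N, 3) and (M, 3) vertex coordinate arrays
    d_ab = directed_hausdorff(points_a, points_b)[0]
    d_ba = directed_hausdorff(points_b, points_a)[0]
    return max(d_ab, d_ba)

# Example with stand-in data (two nearly identical point clouds):
# a = np.random.rand(1000, 3); b = a + 0.01
# print(hausdorff(a, b))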
NASA Technical Reports Server (NTRS)
Boubel, Richard W.
1971-01-01
The stack sampler described in this paper has been developed to overcome the difficulties of particulate sampling with presently available equipment. Its use on emissions from hog fuel fired boilers, back-fired incinerators, wigwam burners, asphalt plants, and seed cleaning cyclones is reported. The results indicate that the sampler is rapid and reliable in its use. It is relatively simple and inexpensive to operate. For most sources it should be considered over the more complicated and expensive sampling trains being used and specified.
NASA Astrophysics Data System (ADS)
Moody, D.; Brumby, S. P.; Chartrand, R.; Franco, E.; Keisler, R.; Kelton, T.; Kontgis, C.; Mathis, M.; Raleigh, D.; Rudelis, X.; Skillman, S.; Warren, M. S.; Longbotham, N.
2016-12-01
The recent computing performance revolution has driven improvements in sensor, communication, and storage technology. Historical, multi-decadal remote sensing datasets at the petabyte scale are now available in commercial clouds, with new satellite constellations generating petabytes per year of high-resolution imagery with daily global coverage. Cloud computing and storage, combined with recent advances in machine learning and open software, are enabling understanding of the world at an unprecedented scale and detail. We have assembled all available satellite imagery from the USGS Landsat, NASA MODIS, and ESA Sentinel programs, as well as commercial PlanetScope and RapidEye imagery, and have analyzed over 2.8 quadrillion multispectral pixels. We leveraged the commercial cloud to generate a tiled, spatio-temporal mosaic of the Earth for fast iteration and development of new algorithms combining analysis techniques from remote sensing, machine learning, and scalable compute infrastructure. Our data platform enables processing at petabyte-per-day rates using multi-source data to produce calibrated, georeferenced imagery stacks at desired points in time and space that can be used for pixel-level or global-scale analysis. We demonstrate our data platform capability by using the European Space Agency's (ESA) published 2006 and 2009 GlobCover 20+ category label maps to train and test a Land Cover Land Use (LCLU) classifier, and generate current self-consistent LCLU maps in Brazil. We train a standard classifier on 2006 GlobCover categories using temporal imagery stacks, and we validate our results on co-registered 2009 GlobCover LCLU maps and 2009 imagery. We then extend the derived LCLU model to current imagery stacks to generate an updated, in-season label map. Changes in LCLU labels can now be seamlessly monitored for a given location across the years in order to track, for example, cropland expansion, forest growth, and urban developments. An example of change monitoring is illustrated in the included figure showing rainfed cropland change in the Mato Grosso region of Brazil between 2006 and 2009.
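A minimal sketch of the described train-then-extend step: fit a per-pixel classifier on features from a temporal imagery stack with existing GlobCover labels, then apply it to a current stack. The feature layout, class codes, and random forest choice are assumptions for illustration, not the authors' exact pipeline.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# X_2006: (n_pixels, n_bands * n_dates) features from the 2006 imagery stack
# y_2006: (n_pixels,) co-registered GlobCover class codes
X_2006 = np.random.rand(5000, 24)            # stand-in data
y_2006 = np.random.randint(0, 5, size=5000)  # stand-in labels

clf = RandomForestClassifier(n_estimators=200, n_jobs=-1)
clf.fit(X_2006, y_2006)

# Features from a current imagery stack on the same grid yield an updated,
# in-season label map whose changes can then be tracked through time.
X_current = np.random.rand(5000, 24)
current_labels = clf.predict(X_current)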
40 CFR 52.1225 - Review of new sources and modifications.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...
40 CFR 52.1225 - Review of new sources and modifications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...
40 CFR 52.1225 - Review of new sources and modifications.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...
40 CFR 52.1225 - Review of new sources and modifications.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...
40 CFR 52.1225 - Review of new sources and modifications.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Minnesota § 52.1225 Review of new sources and modifications. (a) Part D—Approval. The State of Minnesota has satisfied the... nonattainment areas. (b)-(d) [Reserved] (e) The State of Minnesota has committed to conform to the Stack Height...
Code of Federal Regulations, 2010 CFR
2010-07-01
... section 183(f) of the Act; (11) Any standard or other requirement of the program to control air pollution... emissions which could not reasonably pass through a stack, chimney, vent, or other functionally-equivalent... means any stationary source (or any group of stationary sources that are located on one or more...
Piestrup, M.A.; Boyers, D.G.; Pincus, C.I.; Maccagno, P.
1990-08-21
Disclosed is an intense, relatively inexpensive X-ray source (as compared to a synchrotron emitter) for technological, scientific, and spectroscopic purposes. A conical radiation pattern produced by a single foil or stack of foils is focused by optics to increase the intensity of the radiation at a distance from the conical radiator. 8 figs.
Multi-Purpose, Application-Centric, Scalable I/O Proxy Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, M. C.
2015-06-15
MACSio is a Multi-purpose, Application-Centric, Scalable I/O proxy application. It is designed to support a number of goals with respect to parallel I/O performance testing and benchmarking, including the ability to test and compare various I/O libraries and I/O paradigms, to predict the scalable performance of real applications, and to help identify where improvements in I/O performance can be made within the HPC I/O software stack.
Annual Expeditionary Warfare Conference (22nd)
2017-10-24
for many copies of data and a unique software stack to operate on it. PSI designs and manufactures shipping and carrying cases as well as providing...An ISO 9001:2008 rated company, Trijicon Inc., is committed to Customer Satisfaction through the design, development, and manufacture of superior...their endeavors to continue as a world leader in the design and manufacture of high quality, innovative sighting systems.
Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition
2017-01-01
OFFICE OF NAVAL RESEARCH, ATTN: JASON STACK, MINE WARFARE & OCEAN ENGINEERING PROGRAMS, CODE 32, SUITE 1092, 875 N RANDOLPH ST, ARLINGTON VA 22203...naval mine countermeasures (MCM) operations by automating a large portion of the data analysis. Successful long-term implementation of ATR requires a...Modular Algorithm Testbed Suite; MATS; Mine Countermeasures Operations
Fast assembling of neuron fragments in serial 3D sections.
Chen, Hanbo; Iascone, Daniel Maxim; da Costa, Nuno Maçarico; Lein, Ed S; Liu, Tianming; Peng, Hanchuan
2017-09-01
Reconstructing neurons from 3D image-stacks of serial sections of thick brain tissue is very time-consuming and often becomes a bottleneck in high-throughput brain mapping projects. We developed NeuronStitcher, a software suite for stitching non-overlapping neuron fragments reconstructed in serial 3D image sections. With its efficient algorithm and user-friendly interface, NeuronStitcher has been used successfully to reconstruct very large and complex human and mouse neurons.
2017-03-07
Integrating multiple sources of pharmacovigilance evidence has the potential to advance the science of safety signal detection and evaluation. In this regard, there is a need for more research on how to integrate multiple disparate evidence sources while making the evidence computable from a knowledge representation perspective (i.e., semantic enrichment). Existing frameworks suggest promising outcomes for such integration but employ a rather limited number of sources. In particular, none have been specifically designed to support both regulatory and clinical use cases, nor have any been designed to add new resources and use cases through an open architecture. This paper discusses the architecture and functionality of a system called Large-scale Adverse Effects Related to Treatment Evidence Standardization (LAERTES) that aims to address these shortcomings. LAERTES provides a standardized, open, and scalable architecture for linking evidence sources relevant to the association of drugs with health outcomes of interest (HOIs). Standard terminologies are used to represent the different entities; for example, drugs and HOIs are represented in RxNorm and the Systematized Nomenclature of Medicine – Clinical Terms, respectively. At the time of this writing, six evidence sources have been loaded into the LAERTES evidence base and are accessible through a prototype evidence exploration user interface and a set of Web application programming interface services. This system operates within a larger software stack provided by the Observational Health Data Sciences and Informatics clinical research framework, including the relational Common Data Model for observational patient data created by the Observational Medical Outcomes Partnership. Elements of the Linked Data paradigm facilitate the systematic and scalable integration of relevant evidence sources. The prototype LAERTES system provides useful functionality while creating opportunities for further research. Future work will involve improving the method for normalizing drug and HOI concepts across the integrated sources, aggregating evidence at different levels of a hierarchy of HOI concepts, and developing a more advanced user interface for drug-HOI investigations.
OSIRIX: open source multimodality image navigation software
NASA Astrophysics Data System (ADS)
Rosset, Antoine; Pysher, Lance; Spadola, Luca; Ratib, Osman
2005-04-01
The goal of our project is to develop a completely new software platform that will allow users to efficiently and conveniently navigate through large sets of multidimensional data without the need for high-end expensive hardware or software. We also elected to develop our system on new open source software libraries, allowing other institutions and developers to contribute to this project. OsiriX is a free and open-source imaging software package designed to manipulate and visualize large sets of medical images: http://homepage.mac.com/rossetantoine/osirix/
Faint Object Detection in Multi-Epoch Observations via Catalog Data Fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budavári, Tamás; Szalay, Alexander S.; Loredo, Thomas J.
Astronomy in the time-domain era faces several new challenges. One of them is the efficient use of observations obtained at multiple epochs. The work presented here addresses faint object detection and describes an incremental strategy for separating real objects from artifacts in ongoing surveys. The idea is to produce low-threshold single-epoch catalogs and to accumulate information across epochs. This is in contrast to more conventional strategies based on co-added or stacked images. We adopt a Bayesian approach, addressing object detection by calculating the marginal likelihoods for hypotheses asserting that there is no object or one object in a small image patch containing at most one cataloged source at each epoch. The object-present hypothesis interprets the sources in a patch at different epochs as arising from a genuine object; the no-object hypothesis interprets candidate sources as spurious, arising from noise peaks. We study the detection probability for constant-flux objects in a Gaussian noise setting, comparing results based on single and stacked exposures to results based on a series of single-epoch catalog summaries. Our procedure amounts to generalized cross-matching: it is the product of a factor accounting for the matching of the estimated fluxes of the candidate sources and a factor accounting for the matching of their estimated directions. We find that probabilistic fusion of multi-epoch catalogs can detect sources with similar sensitivity and selectivity compared to stacking. The probabilistic cross-matching framework underlying our approach plays an important role in maintaining detection sensitivity and points toward generalizations that could accommodate variability and complex object structure.
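A toy version of the flux-matching part of the marginal-likelihood comparison: the Bayes factor between "one constant-flux object" (with a zero-mean Gaussian prior on the flux) and "noise only", given per-epoch flux estimates with known Gaussian errors. The directional-matching factor and the priors actually used in the paper are omitted; the prior width tau is an assumed value.

import numpy as np
from scipy.stats import multivariate_normal

def flux_bayes_factor(fluxes, sigmas, tau=5.0):
    fluxes = np.asarray(fluxes, dtype=float)
    n = len(fluxes)
    noise_cov = np.diag(np.asarray(sigmas, dtype=float) ** 2)
    # Under the object hypothesis the shared flux adds tau^2 to every covariance entry.
    object_cov = noise_cov + tau ** 2 * np.ones((n, n))
    log_h1 = multivariate_normal.logpdf(fluxes, mean=np.zeros(n), cov=object_cov)
    log_h0 = multivariate_normal.logpdf(fluxes, mean=np.zeros(n), cov=noise_cov)
    return np.exp(log_h1 - log_h0)

# Example: three epochs with weak but mutually consistent flux measurements.
# print(flux_bayes_factor([2.1, 1.8, 2.4], [1.0, 1.0, 1.0]))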
High-frequency self-aligned graphene transistors with transferred gate stacks
Cheng, Rui; Bai, Jingwei; Liao, Lei; Zhou, Hailong; Chen, Yu; Liu, Lixin; Lin, Yung-Chen; Jiang, Shan; Huang, Yu; Duan, Xiangfeng
2012-01-01
Graphene has attracted enormous attention for radio-frequency transistor applications because of its exceptional high carrier mobility, high carrier saturation velocity, and large critical current density. Herein we report a new approach for the scalable fabrication of high-performance graphene transistors with transferred gate stacks. Specifically, arrays of gate stacks are first patterned on a sacrificial substrate, and then transferred onto arbitrary substrates with graphene on top. A self-aligned process, enabled by the unique structure of the transferred gate stacks, is then used to position precisely the source and drain electrodes with minimized access resistance or parasitic capacitance. This process has therefore enabled scalable fabrication of self-aligned graphene transistors with unprecedented performance including a record-high cutoff frequency up to 427 GHz. Our study defines a unique pathway to large-scale fabrication of high-performance graphene transistors, and holds significant potential for future application of graphene-based devices in ultra–high-frequency circuits. PMID:22753503
ERIC Educational Resources Information Center
Simpson, James Daniel
2014-01-01
Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…
Zhu, Xiuping; Kim, Taeyoung; Rahimi, Mohammad; Gorski, Christopher A; Logan, Bruce E
2017-02-22
Salinity gradient energy can be directly converted into electrical power by using reverse electrodialysis (RED) and other technologies, but reported power densities have been too low for practical applications. Herein, the RED stack performance was improved by using 2,6-dihydroxyanthraquinone and ferrocyanide as redox couples. These electrolytes were then used in a flow battery to produce an integrated RED stack and flow battery (RED-FB) system capable of capturing, storing, and discharging salinity gradient energy. Energy captured from the RED stack was discharged in the flow battery at a maximum power density of 3.0 kW m^-2 of anode, which is similar to that of flow batteries charged by electrical power and could be used for practical applications. Salinity gradient energy captured from the RED stack was recovered from the electrolytes as electricity with 30% efficiency, and the maximum energy density of the system was 2.4 kWh m^-3 of anolyte. The combined RED-FB system overcomes many limitations of previous approaches to capture, store, and use salinity gradient energy from natural or engineered sources. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Open-source software: not quite endsville.
Stahl, Matthew T
2005-02-01
Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.
Preparing a scientific manuscript in Linux: Today's possibilities and limitations.
Tchantchaleishvili, Vakhtang; Schmitto, Jan D
2011-10-22
Increasing numbers of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow the preparation of a submission-ready scientific manuscript without the need to use proprietary software. Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes the key steps for preparation of a publication-ready scientific manuscript in a Linux-based operating system, as well as discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux.
Open source IPSEC software in manned and unmanned space missions
NASA Astrophysics Data System (ADS)
Edwards, Jacob
Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth and high latency communications links to experiment how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. Tested open source IPsec software did not meet all the requirements. Software changes were suggested to meet requirements.
Robust image alignment for cryogenic transmission electron microscopy.
McLeod, Robert A; Kowal, Julia; Ringler, Philippe; Stahlberg, Henning
2017-03-01
Cryo-electron microscopy recently experienced great improvements in structure resolution due to direct electron detectors with improved contrast and fast read-out, leading to single-electron counting. High frame rates enabled dose fractionation, where a long exposure is broken into a movie, permitting specimen drift to be registered and corrected. The typical approach for image registration under high shot noise and low contrast is multi-reference (MR) cross-correlation. Here we present the software package Zorro, which provides robust drift correction for dose fractionation by use of an intensity-normalized cross-correlation and a logistic noise model to weight each cross-correlation in the MR model and filter each cross-correlation optimally. Frames are reliably registered by Zorro at low dose and defocus. Methods to evaluate performance are presented, using independently evaluated even- and odd-frame stacks, by trajectory comparison and Fourier ring correlation. Alignment of tiled sub-frames is also introduced and demonstrated on an example dataset. Zorro source code is available at github.com/CINA/zorro. Copyright © 2016 Elsevier Inc. All rights reserved.
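The core operation Zorro builds on can be sketched in a few lines: an FFT-based cross-correlation that returns the integer-pixel shift between two frames. Zorro's intensity normalization, logistic noise model, and multi-reference weighting are not reproduced here; this is only the generic registration step.

import numpy as np

def estimate_shift(frame_a, frame_b):
    # Return the (row, col) shift to apply to frame_b to register it onto frame_a.
    corr = np.fft.ifft2(np.fft.fft2(frame_a) * np.conj(np.fft.fft2(frame_b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    dims = np.array(corr.shape)
    shift = np.array(peak, dtype=float)
    wrap = shift > dims / 2            # wrap large positive lags to negative offsets
    shift[wrap] -= dims[wrap]
    return shift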
NASA Astrophysics Data System (ADS)
Hongqi, Jing; Li, Zhong; Yuxi, Ni; Junjie, Zhang; Suping, Liu; Xiaoyu, Ma
2015-10-01
A novel high-efficiency cooling mini-channel heat-sink structure has been designed to meet the packaging demands of high-power-density laser diode array stacks. Thermal and water-flow characteristics have been simulated using the Ansys Fluent software. Owing to the increased effective cooling area, this mini-channel heat-sink structure has a better cooling effect than traditional macro-channel heat-sinks. Owing to the lower flow velocity in this highly efficient cooling structure, the chillers' water-pressure requirement is reduced. Meanwhile, the machining process of this high-efficiency cooling mini-channel heat-sink structure is simple and the cost is relatively low; it also has advantages in terms of high durability and long lifetime. This heat-sink is an ideal choice for the packaging of high-power-density laser diode array stacks. Project supported by the Defense Industrial Technology Development Program (No. B1320133033).
The social disutility of software ownership.
Douglas, David M
2011-09-01
Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.
CheMentor Software System by H. A. Peoples
NASA Astrophysics Data System (ADS)
Reid, Brian P.
1997-09-01
CheMentor Software System H. A. Peoples. Computerized Learning Enhancements: http://www.ecis.com/~clehap; email: clehap@ecis.com; 1996 - 1997. CheMentor is a series of software packages for introductory-level chemistry, which includes Practice Items (I), Stoichiometry (I), Calculating Chemical Formulae, and the CheMentor Toolkit. The first three packages provide practice problems for students and various types of help to solve them; the Toolkit includes "calculators" for determining chemical quantities as well as the Practice Items (I) set of problems. The set of software packages is designed so that each individual product acts as a module of a common CheMentor program. As the name CheMentor implies, the software is designed as a "mentor" for students learning introductory chemistry concepts and problems. The typical use of the software would be by individual students (or perhaps small groups) as an adjunct to lectures. CheMentor is a HyperCard application and the modules are HyperCard stacks. The requirements to run the packages include a Macintosh computer with at least 1 MB of RAM, a hard drive with several MB of available space depending upon the packages selected (10 MB were required for all the packages reviewed here), and the Mac operating system 6.0.5 or later.
NASA Astrophysics Data System (ADS)
Dryzek, Jerzy; Siemek, Krzysztof
2013-08-01
The spatial distribution of positrons emitted from radioactive isotopes into stacks or layered samples is the subject of this report. It was found that Monte Carlo (MC) simulations using the GEANT4 code are not able to correctly describe the experimental positron fractions in stacks. A mathematical model is proposed for calculating the implantation profile or positron fractions in the separate layers or foils composing a stack. The model takes into account only two processes, i.e., positron absorption and backscattering at interfaces. The mathematical formulas were implemented in a computer program called LYS-1 (layers profile analysis). The theoretical predictions of the model were in good agreement with the results of the MC simulations for a semi-infinite sample. Experimental verifications of the model were performed on symmetrical and non-symmetrical stacks of different foils. Good agreement between the experimental and calculated positron fractions in the components of a stack was achieved. The experimental implantation profile obtained by depth scanning of the positron implantation technique is also very well described by the theoretical profile obtained within the proposed model. The LYS-1 program also allows us to calculate the fraction of positrons which annihilate in the source, which can be useful in positron spectroscopy.
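As a sketch of the absorption part of such a model (interface backscattering, which LYS-1 does treat, is neglected here), the fraction of positrons stopping in each foil of a stack can be computed from a simple exponential attenuation in mass thickness; the empirical mass-absorption coefficient mu = 16/Emax^1.4 cm^2/g is a commonly used assumption, not a value taken from the paper.

import numpy as np

def stopping_fractions(thicknesses_cm, densities_g_cm3, e_max_mev):
    # Fraction of source positrons stopping in each foil of the stack.
    mu = 16.0 / e_max_mev ** 1.4                       # cm^2 / g (empirical)
    mass_depth = np.cumsum(np.asarray(thicknesses_cm) * np.asarray(densities_g_cm3))
    surviving = np.exp(-mu * np.concatenate(([0.0], mass_depth)))
    return -np.diff(surviving)

# Example: three 50 um copper foils (rho = 8.96 g/cm^3) and a 22Na source
# (Emax about 0.545 MeV):
# print(stopping_fractions([0.005, 0.005, 0.005], [8.96] * 3, 0.545))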
Enabling Diverse Software Stacks on Supercomputers using High Performance Virtual Clusters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younge, Andrew J.; Pedretti, Kevin; Grant, Ryan
While large-scale simulations have been the hallmark of the High Performance Computing (HPC) community for decades, Large Scale Data Analytics (LSDA) workloads are gaining attention within the scientific community not only as a processing component of large HPC simulations, but also as standalone scientific tools for knowledge discovery. With the path towards Exascale, new HPC runtime systems are also emerging in a way that differs from classical distributed computing models. However, system software for such capabilities on the latest extreme-scale DOE supercomputing needs to be enhanced to more appropriately support these types of emerging software ecosystems. In this paper, we propose the use of Virtual Clusters on advanced supercomputing resources to enable systems to support not only HPC workloads, but also emerging big data stacks. Specifically, we have deployed the KVM hypervisor within Cray's Compute Node Linux on an XC-series supercomputer testbed. We also use libvirt and QEMU to manage and provision VMs directly on compute nodes, leveraging Ethernet-over-Aries network emulation. To our knowledge, this is the first known use of KVM on a true MPP supercomputer. We investigate the overhead of our solution using HPC benchmarks, both evaluating single-node performance as well as weak scaling of a 32-node virtual cluster. Overall, we find single-node performance of our solution using KVM on a Cray is very efficient, with near-native performance. However, overhead increases by up to 20% as virtual cluster size increases, due to limitations of the Ethernet-over-Aries bridged network. Furthermore, we deploy Apache Spark with large data analysis workloads in a Virtual Cluster, effectively demonstrating how diverse software ecosystems can be supported by High Performance Virtual Clusters.
The Open Source Teaching Project (OSTP): Research Note.
ERIC Educational Resources Information Center
Hirst, Tony
The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…
Free and open source software for the manipulation of digital images.
Solomon, Robert W
2009-06-01
Free and open source software is a type of software that is nearly as powerful as commercial software but is freely downloadable. This software can do almost everything that the expensive programs can. GIMP (GNU Image Manipulation Program) is the free program that is comparable to Photoshop, and versions are available for Windows, Macintosh, and Linux platforms. This article briefly describes how GIMP can be installed and used to manipulate radiology images. It is no longer necessary to budget large amounts of money for high-quality software to achieve the goals of image processing and document creation because free and open source software is available for the user to download at will.
Integrating open-source software applications to build molecular dynamics systems.
Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej
2014-04-05
Three open-source applications, NanoEngineer-1, packmol, and msi2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-A and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate the density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.
Light source comprising a common substrate, a first led device and a second led device
Choong, Vi-En
2010-02-23
A light source comprising at least one stacked organic or polymeric light-emitting diode (PLED) device is disclosed. At least one of the PLEDs includes a patterned cathode which has regions that transmit light. The patterned cathodes enable the light emitted from the PLEDs to combine. The light source may be top emitting, bottom emitting, or both.
Ground Truth Events with Source Geometry in Eurasia and the Middle East
2016-06-02
source properties, including seismic moment, corner frequency, radiated energy, and stress drop have been obtained using spectra for S waves following...PARAMETERS Other source parameters, including radiated energy, corner frequency, seismic moment, and static stress drop were calculated using a spectral...technique (Richardson & Jordan, 2002; Andrews, 1986). The process entails separating event and station spectra and median-stacking each event's
ERIC Educational Resources Information Center
Kamthan, Pankaj
2007-01-01
Open Source Software (OSS) has introduced a new dimension in the software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…
76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... be held in the General Services Administration (GSA), Central Office Auditorium, 1800 F Street NW...
Open Source Software Development and Lotka's Law: Bibliometric Patterns in Programming.
ERIC Educational Resources Information Center
Newby, Gregory B.; Greenberg, Jane; Jones, Paul
2003-01-01
Applies Lotka's Law to metadata on open source software development. Authoring patterns found in software development productivity are found to be comparable to prior studies of Lotka's Law for scientific and scholarly publishing, and offer promise in predicting aggregate behavior of open source developers. (Author/LRW)
Atmosphere-entry behavior of a modular, disk-shaped, isotope heat source.
NASA Technical Reports Server (NTRS)
Vorreiter, J. W.; Pitts, W. C.; Stine, H. A.; Burns, J. J.
1973-01-01
The authors have studied the entry and impact behavior of an isotope heat source for space nuclear power that disassembles into a number of modules which would enter the earth's atmosphere separately if a flight aborted. These modules are disk-shaped units, each with its own reentry heat shield and protective impact container. In normal operation, the disk modules are stacked inside the generator, but during a reentry abort they separate and fly as individual units of low ballistic coefficient. Flight tests at hypersonic speeds have confirmed that a stack of disks will separate and assume a flat-forward mode of flight. Free-fall tests of single disks have demonstrated a nominal impact velocity of 30 m/sec at sea level for a practical range of ballistic coefficients.
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
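Job submission to HTCondor for a stochastic model run can be pictured with an ordinary submit description file handed to condor_submit; the sketch below does this from Python. The executable name, arguments, and input files are hypothetical placeholders rather than anything from the CI-WATER software stack.

```python
# Minimal sketch: submit one high-resolution model run as an HTCondor job.
# The executable, arguments, and file names are hypothetical placeholders.
import subprocess
import tempfile

SUBMIT_TEMPLATE = """\
universe              = vanilla
executable            = run_model.sh
arguments             = --scenario {scenario}
transfer_input_files  = {inputs}
should_transfer_files = YES
output                = {scenario}.out
error                 = {scenario}.err
log                   = {scenario}.log
request_cpus          = 4
queue
"""

def submit_scenario(scenario, inputs):
    """Write a submit description file and hand it to condor_submit."""
    text = SUBMIT_TEMPLATE.format(scenario=scenario, inputs=",".join(inputs))
    with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
        f.write(text)
        path = f.name
    subprocess.run(["condor_submit", path], check=True)

if __name__ == "__main__":
    submit_scenario("landuse_2030", ["watershed.tif", "params.json"])
```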
Analysis of radiation safety for Small Modular Reactor (SMR) on PWR-100 MWe type
NASA Astrophysics Data System (ADS)
Udiyani, P. M.; Husnayani, I.; Deswandri; Sunaryo, G. R.
2018-02-01
Indonesia, an archipelago of large, medium, and small islands, is well suited to the construction of Small Modular Reactors (SMRs). A preliminary technology assessment of various SMRs has been started; by technology, SMRs are grouped into Light Water Reactors, Gas Cooled Reactors, and Solid Cooled Reactors, and by site into land-based and water-based reactors. The Fukushima accident made people doubt the safety of Nuclear Power Plants (NPPs), which affected the public perception of NPP safety. This paper describes the assessment of safety and on-site radiation consequences for normal operation and a Design Basis Accident postulation of an SMR based on a PWR-100 MWe on Bangka Island. Radiation consequences for normal operation were simulated for 3 SMR units. The source term was generated from an inventory using the ORIGEN-2 software; the consequences of routine releases were calculated with PC-CREAM and those of an accident with PC Cosyma. The adopted methodology was based on site-specific meteorological and spatial data. According to the calculation with the PC-CREAM 08 computer code, the highest individual dose for adults in the site area is 5.34E-02 mSv/y, in the ESE direction at a distance of 1 km from the stack; the calculated public doses for normal operation are thus below 1 mSv/y. According to the PC Cosyma calculation, the highest individual dose is 1.92E+00 mSv, in the ESE direction at 1 km from the stack, and the total collective dose (all pathways) is 3.39E-01 man-Sv, dominated by the cloud pathway. The results show that no evacuation countermeasure would need to be taken under the emergency regulations.
Preparing a scientific manuscript in Linux: Today's possibilities and limitations
2011-01-01
Background An increasing number of scientists are enthusiastic about using free, open source software for their research purposes. The authors' specific goal was to examine whether a Linux-based operating system with open source software packages would allow the preparation of a submission-ready scientific manuscript without the need to use proprietary software. Findings Preparation and editing of scientific manuscripts is possible using Linux and open source software. This letter to the editor describes the key steps for the preparation of a publication-ready scientific manuscript in a Linux-based operating system, as well as discusses the necessary software components. This manuscript was created using Linux and open source programs for Linux. PMID:22018246
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gottesfeld, S.
The fuel cell is the most efficient device for the conversion of hydrogen fuel to electric power. As such, the fuel cell represents a key element in efforts to demonstrate and implement hydrogen fuel utilization for electric power generation. The low temperature, polymer electrolyte membrane fuel cell (PEMFC) has recently been identified as an attractive option for stationary power generation, based on the relatively simple and benign materials employed, the zero-emission character of the device, and the expected high power density, high reliability and low cost. However, a PEMFC stack fueled by hydrogen with the combined properties of low cost, high performance and high reliability has not yet been demonstrated. Demonstration of such a stack will remove a significant barrier to implementation of this advanced technology for electric power generation from hydrogen. Work done in the past at LANL on the development of components and materials, particularly on advanced membrane/electrode assemblies (MEAs), has contributed significantly to the capability to demonstrate in the foreseeable future a PEMFC stack with the combined characteristics described above. A joint effort between LANL and an industrial stack manufacturer will result in the demonstration of such a fuel cell stack for stationary power generation. The stack could operate on hydrogen fuel derived from either natural gas or from renewable sources. The technical plan includes collaboration with a stack manufacturer (CRADA). It stresses the special requirements from a PEMFC in stationary power generation, particularly maximization of the energy conversion efficiency, extension of useful life to the 10 hours time scale and tolerance to impurities from the reforming of natural gas.
Positron source position sensing detector and electronics
Burnham, Charles A.; Bradshaw, Jr., John F.; Kaufman, David E.; Chesler, David A.; Brownell, Gordon L.
1985-01-01
A positron source, position sensing device, particularly with medical applications, in which positron induced gamma radiation is detected using a ring of stacked, individual scintillation crystals, a plurality of photodetectors, separated from the scintillation crystals by a light guide, and high resolution position interpolation electronics. Preferably the scintillation crystals are several times more numerous than the photodetectors with each crystal being responsible for a single scintillation event from a received gamma ray. The light guide will disperse the light emitted from gamma ray absorption over several photodetectors. Processing electronics for the output of the photodetectors resolves the location of the scintillation event to a fraction of the dimension of each photodetector. Because each positron absorption results in two 180° oppositely traveling gamma rays, the detection of scintillation in pairs permits location of the positron source in a manner useful for diagnostic purposes. The processing electronics simultaneously responds to the outputs of the photodetectors to locate the scintillations to the source crystal. While it is preferable that the scintillation crystal include a plurality of stacked crystal elements, the resolving power of the processing electronics is also applicable to continuous crystal scintillators.
Open source software to control Bioflo bioreactors.
Burdge, David A; Libourel, Igor G L
2014-01-01
Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
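To picture the "CSV scripting file plus Python-based execution model" described in the abstract, here is a toy protocol runner. The column names, the set_setpoint() call, and the file layout are invented for illustration and are not the actual BiofloSoftware interface.

```python
# Illustrative sketch only: a toy CSV protocol runner in the spirit of the
# CSV scripting + Python execution model described in the abstract.
# Column names and set_setpoint() are hypothetical, not the real API.
import csv
import time

def set_setpoint(controller, parameter, value):
    """Placeholder for the bioreactor call that would apply a setpoint."""
    print(f"[{controller}] {parameter} -> {value}")

def run_protocol(path):
    with open(path, newline="") as f:
        for step in csv.DictReader(f):
            # hypothetical columns: wait_min (minutes to wait before this
            # step), parameter, value
            time.sleep(float(step["wait_min"]) * 60)
            set_setpoint("Bioflo", step["parameter"], float(step["value"]))

if __name__ == "__main__":
    # example protocol.csv:
    # wait_min,parameter,value
    # 0,agitation_rpm,200
    # 30,temperature_C,37
    run_protocol("protocol.csv")
```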
Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives
NASA Astrophysics Data System (ADS)
Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.
2017-12-01
During the last decades a varied set of Heliophysics missions has allowed the scientific community to gain better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the ground for helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses), and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, the needs evolve, and scientists involved in new missions require multi-variable plotting, heat-map stacks, interactive synchronization, and axis variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches for visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.
NASA Astrophysics Data System (ADS)
Allen, Alice; Teuben, Peter J.; Ryan, P. Wesley
2018-05-01
We examined software usage in a sample set of astrophysics research articles published in 2015 and searched for the source codes for the software mentioned in these research papers. We categorized the software to indicate whether the source code is available for download and whether there are restrictions to accessing it, and, if the source code is not available, whether some other form of the software, such as a binary, is. We also extracted hyperlinks from one journal's 2015 research articles, as links in articles can serve as an acknowledgment of software use and lead to the data used in the research, and tested them to determine which of these URLs are still accessible. For our sample of 715 software instances in the 166 articles we examined, we were able to categorize 418 records according to whether source code was available and found that 285 unique codes were used, 58% of which offered the source code for download. Of the 2558 hyperlinks extracted from 1669 research articles, at best 90% were available over our testing period.
Rey-Martinez, Jorge; Pérez-Fernández, Nicolás
2016-12-01
The aim was to develop and validate posturography software and to share its source code under open source terms. In a prospective non-randomized validation study, 20 consecutive adults underwent two balance assessment tests: six-condition posturography was performed using a clinically approved software and force platform, and the same conditions were measured using the newly developed open source software with a low-cost force platform. The intra-class correlation coefficient of the sway area obtained from the center-of-pressure variations in both devices for the six conditions was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94), and a Bland and Altman graphic concordance plot was also obtained. The proposed validation goal of 0.9 in intra-class correlation coefficient was therefore reached, and we consider the developed software (RombergLab) a validated balance assessment software. The reliability of this software depends on the technical specifications of the force platform used. The source code used to develop RombergLab was published under open source terms.
ERIC Educational Resources Information Center
Ge, Xun; Huang, Kun; Dong, Yifei
2010-01-01
A semester-long ethnography study was carried out to investigate project-based learning in a graduate software engineering course through the implementation of an Open-Source Software Development (OSSD) learning environment, which featured authentic projects, learning community, cognitive apprenticeship, and technology affordances. The study…
Asad, A H; Chan, S; Cryer, D; Burrage, J W; Siddiqui, S A; Price, R I
2015-11-01
The proton beam energy of an isochronous 18 MeV cyclotron was determined using a novel version of the stacked copper-foils technique. This simple method used stacked foils of natural copper forming 'thick' targets to produce Zn radioisotopes by the well-documented (p,x) monitor reactions. The primary beam energy was calculated using the (65)Zn activity vs. depth profile in the target, and the results obtained using (62)Zn and (63)Zn (as comparators) were in close agreement. Results from separate measurements using foil thicknesses of 100, 75, 50 or 25 µm to form the stacks also concurred closely. Energy was determined by iterative least-squares comparison of the normalized measured activity profile in a target stack with the equivalent calculated normalized profile, using 'energy' as the regression variable. The technique exploits the uniqueness of the shape of the activity vs. depth profile of the monitor isotope in the target stack for a specified incident energy. The energy using (65)Zn activity profiles and 50-µm foils alone was 18.03±0.02 [SD] MeV (95%CI=17.98-18.08), and 18.06±0.12 MeV (95%CI=18.02-18.10; NS) when combining results from all isotopes and foil thicknesses. When the beam energy was re-measured using (65)Zn and 50-µm foils only, following a major upgrade of the ion sources and nonmagnetic beam controls, the results were 18.11±0.05 MeV (95%CI=18.00-18.23; NS compared with 'before'). Since measurement of only one Zn monitor isotope is required to determine the normalized activity profile, this indirect yet precise technique does not require a direct beam-current measurement or a gamma-spectroscopy efficiency calibrated with standard sources, though a characteristic photopeak must be identified. It has some advantages over published methods using the ratio of cross sections of monitor reactions, including the ability to determine energies across a broader range and without the need for customized beam degraders. Copyright © 2015 Elsevier Ltd. All rights reserved.
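The energy determination rests on an iterative least-squares comparison of the measured, normalized activity-versus-depth profile with calculated profiles, using energy as the regression variable; a generic sketch of that idea follows. The profile model here is a stand-in (the real calculation uses stopping powers and monitor-reaction cross sections), so the function shape and parameters are illustrative assumptions only.

```python
# Generic sketch of the fitting idea: find the incident energy whose
# calculated, normalized foil-by-foil activity profile best matches the
# measured one in a least-squares sense. calc_profile() is a placeholder
# for the real physics (stopping power + (p,x) excitation functions).
import numpy as np
from scipy.optimize import minimize_scalar

def calc_profile(energy_mev, n_foils):
    """Placeholder model returning a normalized activity per foil."""
    depth = np.arange(n_foils) + 0.5                          # foil index as a depth proxy
    profile = np.exp(-((depth - energy_mev / 2.0) ** 2) / 4.0)  # toy shape
    return profile / profile.sum()

def fit_energy(measured_counts):
    measured = np.asarray(measured_counts, dtype=float)
    measured = measured / measured.sum()                      # normalize measured profile
    n = measured.size

    def sse(energy):
        return np.sum((calc_profile(energy, n) - measured) ** 2)

    result = minimize_scalar(sse, bounds=(10.0, 25.0), method="bounded")
    return result.x

if __name__ == "__main__":
    # hypothetical 65Zn activities in successive copper foils
    print("fitted energy (MeV):", round(fit_energy([5, 30, 80, 60, 15, 2]), 2))
```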
Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J
2004-09-24
Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gwyn, Stephen D. J., E-mail: Stephen.Gwyn@nrc-cnrc.gc.ca
This paper describes the image stacks and catalogs of the Canada-France-Hawaii Telescope Legacy Survey produced using the MegaPipe data pipeline at the Canadian Astronomy Data Centre. The Legacy Survey is divided into two parts. The Deep Survey consists of four fields, each of 1 deg², with magnitude limits (50% completeness for point sources) of u = 27.5, g = 27.9, r = 27.7, i = 27.4, and z = 26.2. It contains 1.6 × 10⁶ sources. The Wide Survey consists of 150 deg² split over four fields, with magnitude limits of u = 26.0, g = 26.5, r = 25.9, i = 25.7, and z = 24.6. It contains 3 × 10⁷ sources. This paper describes the calibration, image stacking, and catalog generation process. The images and catalogs are available on the web through several interfaces: normal image and text file catalog downloads, a 'Google Sky' interface, an image cutout service, and a catalog database query service.
The Visible Human Data Sets (VHD) and Insight Toolkit (ITk): Experiments in Open Source Software
Ackerman, Michael J.; Yoo, Terry S.
2003-01-01
From its inception in 1989, the Visible Human Project was designed as an experiment in open source software. In 1994 and 1995 the male and female Visible Human data sets were released by the National Library of Medicine (NLM) as open source data sets. In 2002 the NLM released the first version of the Insight Toolkit (ITk) as open source software. PMID:14728278
NASA Technical Reports Server (NTRS)
Fogelson, S. A.; Chait, I. L.; Bradley, W. J.; Benson, W.
1980-01-01
Detailed capital cost estimates for the ECAS and modified reference plants in mid-1978 dollars for both 250 and 175 F (394 and 353 K) stack gas reheat temperatures based on the cost estimates developed for the ECAS study are presented. The scope of the work included technical assessment of sulfur dioxide scrubber system design, on site calcination versus purchased lime, reheat of stack gas, effect of sulfur dioxide scrubber on particulate emission, and control of nitrogen oxides.
Permanent-magnet-less synchronous reluctance system
Hsu, John S
2012-09-11
A permanent magnet-less synchronous system includes a stator that generates a magnetic revolving field when sourced by an alternating current. An uncluttered rotor is disposed within the magnetic revolving field and spaced apart from the stator to form an air gap relative to an axis of rotation. The rotor includes a plurality of rotor pole stacks, each having an inner periphery biased by a single polarity, a north-pole field or a south-pole field, respectively. The outer periphery of each of the rotor pole stacks is biased by an alternating polarity.
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
The case for open-source software in drug discovery.
DeLano, Warren L
2005-02-01
Widespread adoption of open-source software for network infrastructure, web servers, code development, and operating systems leads one to ask how far it can go. Will "open source" spread broadly, or will it be restricted to niches frequented by hopeful hobbyists and midnight hackers? Here we identify reasons for the success of open-source software and predict how consumers in drug discovery will benefit from new open-source products that address their needs with increased flexibility and in ways complementary to proprietary options.
ERIC Educational Resources Information Center
Thankachan, Briju; Moore, David Richard
2017-01-01
The use of Free and Open Source Software (FOSS), a subset of Information and Communication Technology (ICT), can reduce the cost of purchasing software. Despite the benefit in the initial purchase price of software, deploying software requires total cost that goes beyond the initial purchase price. Total cost is a silent issue of FOSS and can only…
Auscope: Australian Earth Science Information Infrastructure using Free and Open Source Software
NASA Astrophysics Data System (ADS)
Woodcock, R.; Cox, S. J.; Fraser, R.; Wyborn, L. A.
2013-12-01
Since 2005 the Australian Government has supported a series of initiatives providing researchers with access to major research facilities and information networks necessary for world-class research. Starting with the National Collaborative Research Infrastructure Strategy (NCRIS), the Australian earth science community established an integrated national geoscience infrastructure system called AuScope. AuScope is now in operation, providing a number of components to assist in understanding the structure and evolution of the Australian continent. These include the acquisition of subsurface imaging, earth composition and age analysis, a virtual drill core library, geological process simulation, and a high-resolution geospatial reference framework. To draw together information from across the earth science community in academia, industry and government, AuScope includes a nationally distributed information infrastructure. Free and Open Source Software (FOSS) has been a significant enabler in building the AuScope community and providing a range of interoperable services for accessing data and scientific software. A number of FOSS components have been created, adopted or upgraded to create a coherent, OGC-compliant Spatial Information Services Stack (SISS). SISS is now deployed at all Australian Geological Surveys, many universities and the CSIRO. Comprising a set of OGC catalogue and data services, and augmented with new vocabulary and identifier services, the SISS provides a comprehensive package for organisations to contribute their data to the AuScope network. This packaging and a variety of software testing and documentation activities enabled greater trust and notably reduced barriers to adoption. FOSS selection was important, not only for technical capability and robustness, but also for appropriate licensing and community models to ensure sustainability of the infrastructure in the long term. Government agencies were sensitive to these issues and AuScope's careful selection has been rewarded by adoption. In some cases the features provided by the SISS solution are now significantly in advance of COTS offerings, which will create expectations that can be passed back from users to their preferred vendors. Using FOSS, AuScope has addressed the challenge of data exchange across organisations nationally. The data standards (e.g. GeosciML) and platforms that underpin AuScope provide important new datasets and multi-agency links independent of underlying software and hardware differences. AuScope has created an infrastructure, a platform of technologies and the opportunity for new ways of working with and integrating disparate data at much lower cost. Research activities are now exploiting the information infrastructure to create virtual laboratories for research ranging from geophysics through water and the environment. Once again the AuScope community is making heavy use of FOSS to provide access to processing software, cloud computing, and HPC. The successful use of FOSS by AuScope, and the efforts made to ensure it is suitable for adoption, have resulted in the SISS being selected as a reference implementation for a number of Australian Government initiatives beyond AuScope in environmental information and bioregional assessments.
Open Source Software in Medium Size Organizations: Key Factors for Adoption
ERIC Educational Resources Information Center
Solomon, Jerry T.
2010-01-01
For-profit organizations are constantly evaluating new technologies to gain competitive advantage. One such technology, application software, has changed significantly over the past 25 years with the introduction of Open Source Software (OSS). In contrast to commercial software that is developed by private companies and sold to organizations, OSS…
TEJAS - TELEROBOTICS/EVA JOINT ANALYSIS SYSTEM VERSION 1.0
NASA Technical Reports Server (NTRS)
Drews, M. L.
1994-01-01
The primary objective of space telerobotics as a research discipline is the augmentation and/or support of extravehicular activity (EVA) with telerobotic activity; this allows increased emplacement of on-orbit assets while providing for their "in situ" management. Development of the requisite telerobot work system requires a well-understood correspondence between EVA and telerobotics that to date has been only partially established. The Telerobotics/EVA Joint Analysis Systems (TEJAS) hypermedia information system uses object-oriented programming to bridge the gap between crew-EVA and telerobotics activities. TEJAS Version 1.0 contains twenty HyperCard stacks that use a visual, customizable interface of icon buttons, pop-up menus, and relational commands to store, link, and standardize related information about the primitives, technologies, tasks, assumptions, and open issues involved in space telerobot or crew EVA tasks. These stacks are meant to be interactive and can be used with any database system running on a Macintosh, including spreadsheets, relational databases, word-processed documents, and hypermedia utilities. The software provides a means for managing volumes of data and for communicating complex ideas, relationships, and processes inherent to task planning. The stack system contains 3MB of data and utilities to aid referencing, discussion, communication, and analysis within the EVA and telerobotics communities. The six baseline analysis stacks (EVATasks, EVAAssume, EVAIssues, TeleTasks, TeleAssume, and TeleIssues) work interactively to manage and relate basic information which you enter about the crew-EVA and telerobot tasks you wish to analyze in depth. Analysis stacks draw on information in the Reference stacks as part of a rapid point-and-click utility for building scripts of specific task primitives or for any EVA or telerobotics task. Any or all of these stacks can be completely incorporated within other hypermedia applications, or they can be referenced as is, without requiring data to be transferred into any other database. TEJAS is simple to use and requires no formal training. Some knowledge of HyperCard is helpful, but not essential. All Help cards printed in the TEJAS User's Guide are part of the TEJAS Help Stack and are available from a pop-up menu any time you are using TEJAS. Specific stacks created in TEJAS can be exchanged between groups, divisions, companies, or centers for complete communication of fundamental information that forms the basis for further analyses. TEJAS runs on any Apple Macintosh personal computer with at least one megabyte of RAM, a hard disk, and HyperCard 1.21, or later version. TEJAS is a copyrighted work with all copyright vested in NASA. HyperCard and Macintosh are registered trademarks of Apple Computer, Inc.
Grigoryan, Artyom M; Dougherty, Edward R; Kononen, Juha; Bubendorf, Lukas; Hostetter, Galen; Kallioniemi, Olli
2002-01-01
Fluorescence in situ hybridization (FISH) is a molecular diagnostic technique in which a fluorescently labeled probe hybridizes to a target nucleotide sequence of deoxyribonucleic acid (DNA). Upon excitation, each chromosome containing the target sequence produces a fluorescent signal (spot). Because fluorescent spot counting is tedious and often subjective, automated digital algorithms to count spots are desirable. New technology provides a stack of images on multiple focal planes throughout a tissue sample. Multiple-focal-plane imaging helps overcome the biases and imprecision inherent in single-focal-plane methods. This paper proposes an algorithm for global spot counting in stacked three-dimensional slice FISH images without the necessity of nuclei segmentation. It is designed to work in complex backgrounds, when there are agglomerated nuclei, and in the presence of illumination gradients. It is based on the morphological top-hat transform, which locates intensity spikes on irregular backgrounds. After finding signals in the slice images, the algorithm groups these together to form three-dimensional spots. Filters are employed to separate legitimate spots from fluorescent noise. The algorithm is set in a comprehensive toolbox that provides visualization and analytic facilities. It includes simulation software that allows examination of algorithm performance for various image and algorithm parameter settings, including signal size, signal density, and the number of slices.
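As a rough illustration of the top-hat-based detection the abstract describes (locating intensity spikes on an irregular background, then grouping slice detections into 3-D spots and filtering out noise), here is a minimal sketch using generic scientific-Python tools; the structuring-element size, threshold, and minimum-size filter are illustrative assumptions, not the published algorithm's parameters.

```python
# Minimal sketch of top-hat-based spot detection in an image stack:
# 1) white top-hat on each slice to isolate bright spikes on uneven background,
# 2) threshold, 3) 3-D connected-component labeling to group slice detections
# into spots, 4) size filter to discard fluorescent noise. Parameters are
# illustrative, not those of the published method.
import numpy as np
from scipy import ndimage

def count_spots(stack, tophat_size=5, threshold=50, min_voxels=4):
    """stack: 3-D array (slices, rows, cols) of fluorescence intensities."""
    enhanced = np.empty_like(stack, dtype=float)
    for z in range(stack.shape[0]):
        # morphological white top-hat: original minus its opening
        enhanced[z] = ndimage.white_tophat(stack[z].astype(float),
                                           size=(tophat_size, tophat_size))
    mask = enhanced > threshold
    labels, n = ndimage.label(mask)              # 3-D grouping across slices
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.sum(sizes >= min_voxels))      # keep only plausible spots

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = rng.poisson(10, size=(8, 128, 128)).astype(float)
    demo[3:5, 40:43, 60:63] += 200               # synthetic spot spanning slices
    print("spots found:", count_spots(demo))
```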
A Fault Oblivious Extreme-Scale Execution Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKie, Jim
The FOX project, funded under the ASCR X-stack I program, developed systems software and runtime libraries for a new approach to the data and work distribution for massively parallel, fault oblivious application execution. Our work was motivated by the premise that exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today's machines. To deliver the capability of exascale hardware, the systems software must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. Our OS research has prototyped new methods to provide efficient resource sharing, synchronization, and protection in a many-core compute node. We have experimented with alternative task/dataflow programming models and shown scalability in some cases to hundreds of thousands of cores. Much of our software is in active development through open source projects. Concepts from FOX are being pursued in next generation exascale operating systems. Our OS work focused on adaptive, application-tailored OS services optimized for multi- to many-core processors. We developed a new operating system NIX that supports role-based allocation of cores to processes, which was released to open source. We contributed to the IBM FusedOS project, which promoted the concept of latency-optimized and throughput-optimized cores. We built a task queue library based on a distributed, fault tolerant key-value store and identified scaling issues. A second fault tolerant task parallel library was developed, based on the Linda tuple space model, that used low level interconnect primitives for optimized communication. We designed fault tolerance mechanisms for task parallel computations employing work stealing for load balancing that scaled to the largest existing supercomputers. Finally, we implemented the Elastic Building Blocks runtime, a library to manage object-oriented distributed software components. To support the research, we won two INCITE awards for time on Intrepid (BG/P) and Mira (BG/Q). Much of our work has had impact in the OS and runtime community through the ASCR Exascale OS/R workshop and report, leading to the research agenda of the Exascale OS/R program. Our project was, however, also affected by attrition of multiple PIs. While the PIs continued to participate and offer guidance as time permitted, losing these key individuals was unfortunate both for the project and for the DOE HPC community.
NASA Astrophysics Data System (ADS)
Zhang, Xigui; Zheng, Dan; Wang, Tao; Chen, Cong; Cao, Jianyu; Yan, Jian; Wang, Wenming; Liu, Juanying; Liu, Haohan; Tian, Juan; Li, Xinxin; Yang, Hui; Xia, Baojia
The fabrication and performance evaluation of a miniature 6-cell PEMFC stack based on Micro-Electronic-Mechanical-System (MEMS) technology is presented in this paper. The stack, with a planar configuration, consists of 6 cells in serial interconnection made by spot welding one cell's anode to another cell's cathode. Each cell was made by sandwiching a membrane-electrode-assembly (MEA) between two flow field plates fabricated by a classical MEMS wet etching method using a silicon wafer as the original material. The plates were made electrically conductive by sputtering a Ti/Pt/Au composite metal layer on their surfaces. The 6 cells lie in the same plane with a fuel buffer/distributor as their support, which was fabricated by MEMS silicon-glass bonding technology. A small hydrogen storage canister was used as the fuel source. Operating on dry H₂ at a 40 ml min⁻¹ flow rate and air-breathing conditions at room temperature and atmospheric pressure, the linear polarization experiment gave a measured peak power of 0.9 W at 250 mA cm⁻² for the stack and an average power density of 104 mW cm⁻² for each cell. The results suggested that the stack has reasonable performance benefiting from an even fuel supply. But its performance tended to deteriorate with power increase, which became obvious at 600 mW. This suggests that the stack may need some power assistance, from, say, supercapacitors, to maintain its stability when operated at higher power.
The SBOL Stack: A Platform for Storing, Publishing, and Sharing Synthetic Biology Designs.
Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Pocock, Matthew; Flanagan, Keith; Hallinan, Jennifer; Wipat, Anil
2016-06-17
Recently, synthetic biologists have developed the Synthetic Biology Open Language (SBOL), a data exchange standard for descriptions of genetic parts, devices, modules, and systems. The goals of this standard are to allow scientists to exchange designs of biological parts and systems, to facilitate the storage of genetic designs in repositories, and to facilitate the description of genetic designs in publications. In order to achieve these goals, the development of an infrastructure to store, retrieve, and exchange SBOL data is necessary. To address this problem, we have developed the SBOL Stack, a Resource Description Framework (RDF) database specifically designed for the storage, integration, and publication of SBOL data. This database allows users to define a library of synthetic parts and designs as a service, to share SBOL data with collaborators, and to store designs of biological systems locally. The database also allows external data sources to be integrated by mapping them to the SBOL data model. The SBOL Stack includes two Web interfaces: the SBOL Stack API and SynBioHub. While the former is designed for developers, the latter allows users to upload new SBOL biological designs, download SBOL documents, search by keyword, and visualize SBOL data. Since the SBOL Stack is based on semantic Web technology, the inherent distributed querying functionality of RDF databases can be used to allow different SBOL stack databases to be queried simultaneously, and therefore, data can be shared between different institutes, centers, or other users.
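Because the SBOL Stack is built on an RDF triple store, its distributed-query capability can be pictured with a standard SPARQL query that federates over a second endpoint via the SERVICE keyword. The sketch below uses the generic SPARQLWrapper client; the endpoint URLs and the graph pattern are purely illustrative assumptions and do not describe the actual SBOL Stack API or data model.

```python
# Illustrative sketch: a federated SPARQL query spanning two RDF endpoints.
# Endpoint URLs and the predicate used are hypothetical placeholders.
from SPARQLWrapper import SPARQLWrapper, JSON

QUERY = """
PREFIX dcterms: <http://purl.org/dc/terms/>
SELECT ?part ?title WHERE {
  { ?part dcterms:title ?title . }
  UNION
  { SERVICE <https://collaborator.example.org/sparql> { ?part dcterms:title ?title . } }
  FILTER(CONTAINS(LCASE(?title), "promoter"))
}
LIMIT 10
"""

def search_parts(endpoint="https://synbio.example.org/sparql"):
    client = SPARQLWrapper(endpoint)
    client.setQuery(QUERY)
    client.setReturnFormat(JSON)
    results = client.query().convert()
    for row in results["results"]["bindings"]:
        print(row["part"]["value"], "-", row["title"]["value"])

if __name__ == "__main__":
    search_parts()
```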
Open source molecular modeling.
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-09-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
Universal programming interface with concurrent access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alferov, Oleg
2004-10-07
A number of devices operate by positioning, such as mechanical linear stages, temperature controllers, or filter wheels with discrete states, and most of them have different programming interfaces. The Universal Positioner software handles all of them with a single approach: a particular hardware driver is created from a template by translating the actual commands used by the hardware to and from the universal programming interface. The software contains the universal API module itself, a demo hardware simulation, and front-end programs to help developers write their own software drivers, along with example drivers for actual hardware controllers. The software allows user application programs to call devices simultaneously without race conditions (multitasking and concurrent access). The template supplied in this package permits developers to integrate various devices easily into their applications using the same API. The drivers can be stacked; i.e., they can call each other via the same interface.
Esteban, Santiago; Rodríguez Tablado, Manuel; Peper, Francisco; Mahumud, Yamila S; Ricci, Ricardo I; Kopitowski, Karin; Terrasa, Sergio
2017-01-01
Precision medicine requires extremely large samples. Electronic health records (EHR) are thought to be a cost-effective source of data for that purpose. Phenotyping algorithms help reduce classification errors, making EHR a more reliable source of information for research. Four algorithm development strategies for classifying patients according to their diabetes status (diabetic, non-diabetic, inconclusive) were tested: one codes-only algorithm; one Boolean algorithm; four statistical learning algorithms; and six stacked generalization meta-learners. The best performing algorithms within each strategy were tested on the validation set. The stacked generalization algorithm yielded the highest Kappa coefficient value in the validation set (0.95, 95% CI 0.91, 0.98). The implementation of these algorithms allows for the accurate exploitation of data from thousands of patients, greatly reducing the costs of constructing retrospective cohorts for research.
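Stacked generalization combines the predictions of several base classifiers through a meta-learner trained on their out-of-fold outputs; a minimal sketch with scikit-learn's StackingClassifier is shown below. The features and labels are synthetic stand-ins, and the choice of base learners and meta-learner is ours for illustration, not the configuration used in the study.

```python
# Minimal sketch of stacked generalization for a phenotyping-style classifier.
# Features/labels are synthetic; the study's EHR variables and exact learners
# are not reproduced here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

# three classes standing in for diabetic / non-diabetic / inconclusive
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=LogisticRegression(max_iter=1000),  # the meta-learner
    cv=5,                                               # out-of-fold predictions
)
stack.fit(X_tr, y_tr)
print("kappa on held-out set:",
      round(cohen_kappa_score(y_te, stack.predict(X_te)), 3))
```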
Resolution Analysis of finite fault inversions: A back-projection approach.
NASA Astrophysics Data System (ADS)
Ji, C.; Shao, G.
2007-12-01
The resolution of inverted source models of large earthquakes is controlled by the frequency content of "coherent" (or "useful") seismic observations and their spatial distribution. But it is difficult to distinguish whether some features consistent across different inversions are really required by the data or are a consequence of "prior" information, such as velocity structures, fault geometry, and model parameterizations. Here, we investigate the model spatial resolution by first back-projecting and stacking the data at the source region and then analyzing the spatial-temporal variations of the focusing regions, which are arbitrarily defined as the regions with 90% of the peak focusing amplitude. Our preliminary results indicate that 1) the spatial-temporal resolution in a particular direction is controlled by the range of the directivity parameter [p·cos(θ)] within the seismic network, where p is the horizontal slowness from the hypocenter and θ is the difference between the station azimuth and this orientation; therefore, the network aperture is more important than the number of stations. 2) The simple stacking method is robust for capturing the asperities, but the sizes of the focusing regions are usually much larger than what the data could resolve; carefully weighting the data before stacking can enhance the spatial resolution in a particular direction. 3) Results based on the teleseismic P waves of a local network usually suffer from a trade-off between the source's spatial location and its rupture time. The resolution of the 2001 Kunlunshan earthquake and the 2006 Kuril Islands earthquake will be investigated.
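The back-projection idea, shifting each station's record by the travel time from a trial source point and stacking, can be sketched in a few lines; the grid, constant velocity, and waveforms below are synthetic assumptions, not the data or travel-time model used in the study.

```python
# Toy shift-and-stack back-projection: for each trial grid point, delay each
# station trace by its (assumed, constant-velocity) travel time and sum; the
# "focusing region" is where the stacked amplitude reaches 90% of the peak.
# Geometry, velocity, and traces are synthetic for illustration.
import numpy as np

def back_project(traces, station_xy, grid_xy, dt, velocity=6.0):
    """traces: (n_sta, n_samp); returns peak stacked amplitude per grid point."""
    n_sta, n_samp = traces.shape
    stack_peak = np.zeros(len(grid_xy))
    for g, (gx, gy) in enumerate(grid_xy):
        dist = np.hypot(station_xy[:, 0] - gx, station_xy[:, 1] - gy)
        shifts = np.rint(dist / velocity / dt).astype(int)   # delay in samples
        stacked = np.zeros(n_samp)
        for s in range(n_sta):
            stacked[:n_samp - shifts[s]] += traces[s, shifts[s]:]
        stack_peak[g] = stacked.max()
    return stack_peak

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    stations = rng.uniform(-100, 100, size=(20, 2))           # km
    grid = [(x, y) for x in range(-50, 51, 10) for y in range(-50, 51, 10)]
    traces = rng.normal(0, 0.1, size=(20, 2000))              # noise-only demo
    peaks = back_project(traces, stations, grid, dt=0.05)
    focus = peaks >= 0.9 * peaks.max()                        # focusing region
    print("grid points in focusing region:", int(focus.sum()))
```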
Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks
NASA Astrophysics Data System (ADS)
Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.
2010-12-01
Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked internally and externally on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly harder to deliver relevant metadata and data processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated in the storage level as well as in the data search and retrieval interfaces available to a user. Critical archival metadata, such as auditing trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use Fedora Commons Framework and its digital object abstraction as the repository, Drupal CMS as the user-interface, and the Islandora module as the connector from Drupal to Fedora Repository. With the digital object model, metadata of data description and data provenance can be associated with data content in a formal manner, so are external references and other arbitrary auxiliary information. Changes are formally audited on an object, and digital contents are versioned and have checksums automatically computed. Further, relationships among objects are formally expressed with RDF triples. Data replication, recovery, metadata export are supported with standard protocols, such as OAI-PMH. We provide a tentative comparative analysis of the chosen software stack with the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA’s ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
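Metadata export via OAI-PMH, mentioned at the end of the abstract, is an ordinary HTTP protocol; a minimal harvesting sketch is shown below. The repository base URL is a placeholder, and only the standard ListRecords verb with the oai_dc metadata prefix is assumed (no Fedora- or Drupal-specific API is implied).

```python
# Minimal OAI-PMH harvesting sketch: issue a ListRecords request and print
# record identifiers and datestamps. The base URL is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"

def list_records(base_url, metadata_prefix="oai_dc"):
    url = f"{base_url}?verb=ListRecords&metadataPrefix={metadata_prefix}"
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    for header in tree.iter(f"{OAI_NS}header"):
        ident = header.find(f"{OAI_NS}identifier")
        stamp = header.find(f"{OAI_NS}datestamp")
        print(ident.text, stamp.text)

if __name__ == "__main__":
    list_records("https://repository.example.org/oai")
```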
Data Management for Mars Exploration Rovers
NASA Technical Reports Server (NTRS)
Snyder, Joseph F.; Smyth, David E.
2004-01-01
Data Management for the Mars Exploration Rovers (MER) project is a comprehensive system addressing the needs of development, test, and operations phases of the mission. During development of flight software, including the science software, the data management system can be simulated using any POSIX file system. During testing, the on-board file system can be bit compared with files on the ground to verify proper behavior and end-to-end data flows. During mission operations, end-to-end accountability of data products is supported, from science observation concept to data products within the permanent ground repository. Automated and human-in-the-loop ground tools allow decisions regarding retransmitting, re-prioritizing, and deleting data products to be made using higher level information than is available to a protocol-stack approach such as the CCSDS File Delivery Protocol (CFDP).
dCache, towards Federated Identities & Anonymized Delegation
NASA Astrophysics Data System (ADS)
Ashish, A.; Millar, AP; Mkrtchyan, T.; Fuhrmann, P.; Behrmann, G.; Sahakyan, M.; Adeyemi, O. S.; Starek, J.; Litvintsev, D.; Rossi, A.
2017-10-01
For over a decade, dCache has relied on the authentication and authorization infrastructure (AAI) offered by VOMS, Kerberos, Xrootd, etc. Although the established infrastructure has worked well and provided sufficient security, the implementation of procedures and the underlying software is often seen as a burden, especially by smaller communities trying to adopt existing HEP software stacks [1]. Moreover, scientists are increasingly dependent on service portals for data access [2]. In this paper, we describe how federated identity management systems can facilitate the transition from traditional AAI infrastructure to novel solutions like OpenID Connect. We investigate the advantages offered by OpenID Connect with regard to ‘delegation of authentication’ and ‘credential delegation for offline access’. Additionally, we demonstrate how macaroons can provide a more fine-grained authorization mechanism that supports anonymized delegation.
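Macaroons are bearer tokens whose authority can be attenuated by appending caveats that the issuing service later verifies; a minimal sketch with the generic pymacaroons library is below. The location, secret key, and caveat strings are illustrative assumptions and do not reflect the caveat vocabulary or token format actually used by dCache.

```python
# Minimal macaroon sketch: mint a token, attenuate it with caveats, verify it.
# Location, key, and caveat strings are illustrative, not dCache's own format.
from pymacaroons import Macaroon, Verifier

SECRET = "server-side-secret-key"            # never embedded in the token itself

# Mint a macaroon and attenuate it before handing it to a portal or user.
m = Macaroon(location="storage.example.org", identifier="token-42", key=SECRET)
m.add_first_party_caveat("activity = DOWNLOAD")
m.add_first_party_caveat("path = /data/experiment1/")

# The service later verifies the signature and enforces every caveat.
v = Verifier()
v.satisfy_exact("activity = DOWNLOAD")
v.satisfy_exact("path = /data/experiment1/")
print("valid:", v.verify(m, SECRET))         # True if signature and caveats hold
```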
X-band T/R switch with body-floating multi-gate PDSOI NMOS transistors
NASA Astrophysics Data System (ADS)
Park, Mingyo; Min, Byung-Wook
2018-03-01
This paper presents an X-band transmit/receive switch using multi-gate NMOS transistors in a silicon-on-insulator CMOS process. For low loss and high power handling capability, floating-body multi-gate NMOS transistors are adopted instead of conventional stacked NMOS transistors, resulting in a 53% reduction of transistor area. Compared to the stacked NMOS transistors, the multi-gate transistor shares the source and drain regions between stacked transistors, resulting in reduced chip area and parasitics. The impedance between the bodies of the gates in multi-gate NMOS transistors is assumed to be very large during design and was confirmed by measurement. The measured input 1 dB compression point is 34 dBm. The measured insertion losses of the TX and RX modes are 1.7 dB and 2.0 dB at 11 GHz, respectively, and the measured isolations of the TX and RX modes are >27 dB and >20 dB in X-band, respectively. The chip size is 0.086 mm² without pads, which is 25% smaller than the T/R switch with stacked transistors.
Laser diode stack beam shaping for efficient and compact long-range laser illuminator design
NASA Astrophysics Data System (ADS)
Lutz, Y.; Poyet, J. M.
2014-04-01
Laser diode stacks are interesting laser sources for active imaging illuminators. They allow the accumulation of large amounts of energy in multi-pulse mode, which is best suited for long-range image recording. Even when the laser diode stacks are equipped with fast-axis collimation (FAC) and slow-axis collimation (SAC) micro-lenses, their beam parameter products BPP are not compatible with direct use in highly efficient and compact illuminators. This is particularly true when narrow divergences are required such as for long-range applications. A solution to overcome these difficulties is to enhance the poor slow-axis BPP by virtually restacking the laser diode stack. We present a beam shaping and homogenization method that is low-cost and efficient and has low alignment sensitivity. After conducting simulations, we have realized and characterized the illuminator. A compact long-range laser illuminator has been set up with a divergence of 3.5×2.6 mrad and a global efficiency of 81%. Here, a projection lens with a clear aperture of 62 mm and a focal length of 571 mm was used.
Managing Information On Technical Requirements
NASA Technical Reports Server (NTRS)
Mauldin, Lemuel E., III; Hammond, Dana P.
1993-01-01
The Technical Requirements Analysis and Control Systems/Initial Operating Capability (TRACS/IOC) computer program provides supplemental software tools for the analysis, control, and interchange of project requirements so that qualified project members have access to pertinent project information, even if they are in different locations. It enables users to analyze and control requirements, serves as a focal point for project requirements, and forms an integrated system that supports efficient and consistent operations. TRACS/IOC is a HyperCard stack for use on Macintosh computers running HyperCard 1.2 or later and Oracle 1.2 or later.
Ketoff, Serge; Khonsari, Roman Hossein; Schouman, Thomas; Bertolus, Chloé
2014-11-01
Handling 3-dimensional reconstructions of computed tomographic scans on portable devices is problematic because of the size of the Digital Imaging and Communications in Medicine (DICOM) stacks. The authors provide a user-friendly method allowing the production, transfer, and sharing of good-quality 3-dimensional reconstructions on smartphones and tablets. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
ASDC Advances in the Utilization of Microservices and Hybrid Cloud Environments
NASA Astrophysics Data System (ADS)
Baskin, W. E.; Herbert, A.; Mazaika, A.; Walter, J.
2017-12-01
The Atmospheric Science Data Center (ASDC) is transitioning many of its software tools and applications to standalone microservices deployable in a hybrid cloud, offering benefits such as scalability and efficient environment management. This presentation features several projects the ASDC staff have implemented leveraging the OpenShift Container Application Platform and OpenStack Hybrid Cloud Environment, focusing on key tools and techniques applied to: Earth science data processing; spatial-temporal metadata generation, validation, repair, and curation; and archived data discovery, visualization, and access.
Low-Cost High-Pressure Hydrogen Generator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cropley, Cecelia C.; Norman, Timothy J.
Electrolysis of water, particularly in conjunction with renewable energy sources, is potentially a cost-effective and environmentally friendly method of producing hydrogen at dispersed forecourt sites, such as automotive fueling stations. The primary feedstock for an electrolyzer is electricity, which could be produced by renewable sources such as wind or solar that do not produce carbon dioxide or other greenhouse gas emissions. However, state-of-the-art electrolyzer systems are not economically competitive for forecourt hydrogen production due to their high capital and operating costs, particularly the cost of the electricity used by the electrolyzer stack. In this project, Giner Electrochemical Systems, LLC (GES) developed a low cost, high efficiency proton-exchange membrane (PEM) electrolysis system for hydrogen production at moderate pressure (300 to 400 psig). The electrolyzer stack operates at differential pressure, with hydrogen produced at moderate pressure while oxygen is evolved at near-atmospheric pressure, reducing the cost of the water feed and oxygen handling subsystems. The project included basic research on catalysts and membranes to improve the efficiency of the electrolysis reaction as well as development of advanced materials and component fabrication methods to reduce the capital cost of the electrolyzer stack and system. The project culminated in delivery of a prototype electrolyzer module to the National Renewable Energy Laboratory for testing at the National Wind Technology Center. Electrolysis cell efficiency of 72% (based on the lower heating value of hydrogen) was demonstrated using an advanced high-strength membrane developed in this project. This membrane would enable the electrolyzer system to exceed the DOE 2012 efficiency target of 69%. GES significantly reduced the capital cost of a PEM electrolyzer stack through development of low cost components and fabrication methods, including a 60% reduction in stack parts count. Economic analysis indicates that hydrogen could be produced for $3.79 per gge at an electricity cost of $0.05/kWh by the lower-cost PEM electrolyzer developed in this project, assuming high-volume production of large-scale electrolyzer systems.
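As a rough, hedged sanity check of the figures above, assume that the quoted efficiency is referenced to hydrogen's lower heating value (about 33.3 kWh/kg) and that one gallon-of-gasoline-equivalent (gge) corresponds to roughly 1 kg of H2; neither assumption is stated explicitly in the report.

```python
# Back-of-the-envelope check of the reported electrolyzer figures.
LHV_H2 = 33.3          # kWh per kg of H2 (lower heating value, assumed)
efficiency = 0.72      # demonstrated cell efficiency (LHV basis), from the report
electricity = 0.05     # $/kWh, the price assumed in the report's economic analysis

kwh_per_kg = LHV_H2 / efficiency            # ~46.3 kWh of electricity per kg H2
power_cost = kwh_per_kg * electricity       # ~$2.31 per kg (~per gge, if 1 gge ~ 1 kg H2)
print(f"{kwh_per_kg:.1f} kWh/kg -> ${power_cost:.2f}/kg for electricity alone")
# The gap up to the reported $3.79/gge would be covered by capital and other operating costs.
```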
Developing an Open Source Option for NASA Software
NASA Technical Reports Server (NTRS)
Moran, Patrick J.; Parks, John W. (Technical Monitor)
2003-01-01
We present arguments in favor of developing an Open Source option for NASA software; in particular, we discuss how Open Source is compatible with NASA's mission. We compare and contrast several of the leading Open Source licenses, and propose one, the Mozilla license, for use by NASA. We also address some of the related issues for NASA with respect to Open Source. In particular, we discuss some of the elements in the External Release of NASA Software document (NPG 2210.1A) that will likely have to be changed in order to make Open Source a reality within the agency.
Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.
Benson, Tim
2016-07-04
Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers, and how it can work as a business model in the health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.
NASA Astrophysics Data System (ADS)
Grüebler, Martin U.; Widmer, Silv; Korner-Nievergelt, Fränzi; Naef-Daenzer, Beat
2014-07-01
The microclimate of potential roost-sites is likely to be a crucial determinant in the optimal roost-site selection of endotherms, in particular during the winter season of temperate zones. Available roost-sites for birds and mammals in European high trunk orchards are mainly tree cavities, wood stacks and artificial nest boxes. However, little is known about the microclimatic patterns inside cavities and thermal advantages of using these winter roost-sites. Here, we simultaneously investigate the thermal patterns of winter roost-sites in relation to winter ambient temperature and their insulation capacity. While tree cavities and wood stacks strongly buffered the daily cycle of temperature changes, nest boxes showed low buffering capacity. The buffering effect of tree cavities was stronger at extreme ambient temperatures compared to temperatures around zero. Heat sources inside roosts amplified Δ T (i.e., the difference between inside and outside temperatures), particularly in the closed roosts of nest boxes and tree cavities, and less in the open wood stacks with stronger circulation of air. Positive Δ T due to the installation of a heat source increased in cold ambient temperatures. These results suggest that orchard habitats in winter show a spatiotemporal mosaic of sites providing different thermal benefits varying over time and in relation to ambient temperatures. At cold temperatures tree cavities provide significantly higher thermal benefits than nest boxes or wood stacks. Thus, in winter ecology of hole-using endotherms, the availability of tree cavities may be an important characteristic of winter habitat quality.
ERIC Educational Resources Information Center
Williams van Rooij, Shahron
2010-01-01
This paper contrasts the arguments offered in the literature advocating the adoption of open source software (OSS)--software delivered with its source code--for teaching and learning applications, with the reality of limited enterprise-wide deployment of those applications in U.S. higher education. Drawing on the fields of organizational…
Looking toward the Future: A Case Study of Open Source Software in the Humanities
ERIC Educational Resources Information Center
Quamen, Harvey
2006-01-01
In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…
ERIC Educational Resources Information Center
Lin, Yu-Wei; Zini, Enrico
2008-01-01
This empirical paper shows how free/libre open source software (FLOSS) contributes to mutual and collaborative learning in an educational environment. Unlike proprietary software, FLOSS allows extensive customisation of software to support the needs of local users better. This also allows users to participate more proactively in the development…
Shaping Software Engineering Curricula Using Open Source Communities: A Case Study
ERIC Educational Resources Information Center
Bowring, James; Burke, Quinn
2016-01-01
This paper documents four years of a novel approach to teaching a two-course sequence in software engineering as part of the ABET-accredited computer science curriculum at the College of Charleston. This approach is team-based and centers on learning software engineering in the context of open source software projects. In the first course, teams…
The Value of Open Source Software Tools in Qualitative Research
ERIC Educational Resources Information Center
Greenberg, Gary
2011-01-01
In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…
Software Model Checking Without Source Code
NASA Technical Reports Server (NTRS)
Chaki, Sagar; Ivers, James
2009-01-01
We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable, such as legacy and COTS software, and programs that use features (such as pointers, structures, and object orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.
Open source software integrated into data services of Japanese planetary explorations
NASA Astrophysics Data System (ADS)
Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.
2015-12-01
Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also aiming to provide rich web applications for ease of access, built with modern web technologies on open source software. This presentation showcases the use of open source software across our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); as the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data; the main purpose of this application is public outreach, and it was developed with the NASA World Wind Java SDK. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations; it uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft; this tool is itself open source software developed by JAXA/ISAS under the BSD 3-Clause License. The SPICE Toolkit is essential to compile Flow; the SPICE Toolkit is also open source software, developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.
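To illustrate the standards-based access that a MapServer WMS backend provides to clients such as OpenLayers, here is a hedged Python sketch of a WMS 1.1.1 GetMap request. The endpoint URL and layer name are placeholders, not actual DARTS/KADIAS values.

```python
import requests

# Hypothetical endpoint and layer name; the real DARTS/KADIAS parameters will differ.
WMS_URL = "https://example.org/cgi-bin/mapserv"

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "selene_dem",          # placeholder layer name
    "SRS": "EPSG:4326",
    "BBOX": "-180,-90,180,90",       # lon/lat bounding box
    "WIDTH": 1024,
    "HEIGHT": 512,
    "FORMAT": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=30)
response.raise_for_status()
with open("moon_map.png", "wb") as fh:
    fh.write(response.content)       # the PNG tile an OpenLayers client would display
```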
Interim Open Source Software (OSS) Policy
This interim Policy establishes a framework to implement the requirements of the Office of Management and Budget's (OMB) Federal Source Code Policy to achieve efficiency, transparency and innovation through reusable and open source software.
Automated verification of flight software. User's manual
NASA Technical Reports Server (NTRS)
Saib, S. H.
1982-01-01
AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
48 CFR 208.7400 - Scope of subpart.
Code of Federal Regulations, 2013 CFR
2013-10-01
... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software... commercial software and software maintenance, including software and software maintenance that is acquired...
48 CFR 208.7400 - Scope of subpart.
Code of Federal Regulations, 2012 CFR
2012-10-01
... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software... commercial software and software maintenance, including software and software maintenance that is acquired...
48 CFR 208.7400 - Scope of subpart.
Code of Federal Regulations, 2011 CFR
2011-10-01
... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software... commercial software and software maintenance, including software and software maintenance that is acquired...
48 CFR 208.7400 - Scope of subpart.
Code of Federal Regulations, 2014 CFR
2014-10-01
... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software... commercial software and software maintenance, including software and software maintenance that is acquired...
NASA Astrophysics Data System (ADS)
Kaip, G.; Harder, S. H.; Karplus, M. S.; Vennemann, A.
2016-12-01
In May 2016, the National Seismic Source Facility (NSSF), located at the University of Texas at El Paso (UTEP) Department of Geological Sciences, collected seismic data at the Indio Ranch, located 30 km southwest of Van Horn, Texas. Both hammer-on-plate and explosive sources were used. The project objective was to image subsurface structures at the ranch, which is owned by UTEP. Selecting the appropriate seismic source is important for reaching project objectives. We compare explosive and hammer-on-plate seismic sources, focusing on amplitude and frequency. The seismic line was 1 km long, trending WSW to ENE, with 200 4.5 Hz geophones at 5 m spacing and shot locations at 10 m spacing. Clay slurry was used in the shot holes to increase coupling around the booster. Trojan Spartan cast boosters (150 g) were used as the explosive source in each shot hole (1 hole per station). The end-of-line shots had 5 shot holes instead of 1 (750 g total). The hammer source used a 5.5 kg hammer on an aluminum plate. Five hammer blows were stacked at each location to improve the signal-to-noise ratio. Explosive sources yield higher amplitude but lower frequency content. The explosions exhibit a higher signal-to-noise ratio, allowing us to recognize seismic energy deeper and farther from the source. Hammer sources yield higher frequencies, allowing better resolution at shallower depths, but have a lower signal-to-noise ratio and lower amplitudes, even with source stacking. We analyze the details of the shot spectra from the different types of sources. A combination of source types can improve data resolution and amplitude, thereby improving imaging potential. However, cost, logistics, and complexity also have a large influence on source selection.
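The benefit of stacking the five hammer blows can be illustrated with a short, hedged numpy sketch on synthetic traces: averaging N repeats preserves the coherent arrival while suppressing incoherent noise by roughly a factor of sqrt(N). The waveform and noise level below are made up, not the Indio Ranch records.

```python
import numpy as np

rng = np.random.default_rng(0)
n_blows, n_samples = 5, 2000
t = np.linspace(0, 1, n_samples)
signal = np.sin(2 * np.pi * 30 * t) * np.exp(-5 * t)     # synthetic decaying arrival

# Five repeated hammer records: identical signal, independent noise.
records = signal + rng.normal(scale=2.0, size=(n_blows, n_samples))
stacked = records.mean(axis=0)

def snr(trace):
    # Crude SNR: peak signal amplitude over noise estimated from the quiet tail.
    return np.abs(signal).max() / trace[-500:].std()

print(f"single-blow SNR ~ {snr(records[0]):.2f}, stacked SNR ~ {snr(stacked):.2f}")
# Stacking 5 blows improves SNR by roughly sqrt(5) ~ 2.2x.
```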
A software to digital image processing to be used in the voxel phantom development.
Vieira, J W; Lima, F R A
2009-11-15
Anthropomorphic models used in computational dosimetry, also called phantoms, are based on digital images recorded from scans of real people by Computed Tomography (CT) or Magnetic Resonance Imaging (MRI). Voxel phantom construction requires computational processing for transformations of image formats, compaction of two-dimensional (2-D) images into three-dimensional (3-D) matrices, image sampling and quantization, image enhancement, restoration, and segmentation, among others. A researcher in computational dosimetry will rarely find all of these capabilities in a single software package, and this shortcoming often slows the pace of research or forces the use of alternative, sometimes inadequate, tools. The need to integrate the several tasks mentioned above to obtain an image usable in an exposure computational model motivated the development of the Digital Image Processing (DIP) software, mainly to solve particular problems in dissertations and theses developed by members of the Grupo de Pesquisa em Dosimetria Numérica (GDN/CNPq). Because of this particular objective, the software uses the Portuguese language in its implementation and interfaces. This paper presents the second version of DIP, whose main changes are a more formal organization of menus and menu items and a new menu for digital image segmentation. Currently, DIP contains the menus Fundamentos, Visualizações, Domínio Espacial, Domínio de Frequências, Segmentações, and Estudos. Each menu contains items and sub-items with functionality that usually takes an image as input and produces an image or an attribute as output. DIP reads, edits, and writes binary files containing the 3-D matrix corresponding to a stack of axial images of a given geometry, which can be a human body or another volume of interest. It can also read any type of computational image and perform conversions. When the task produces only a single output image, it is saved as a JPEG file with the Windows default settings; when it produces an image stack, the output binary file is called SGI (Simulações Gráficas Interativas, Interactive Graphic Simulations), an acronym already used in other publications of the GDN/CNPq.
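As a hedged illustration of the kind of I/O the DIP automates, the sketch below reads a raw binary voxel stack with numpy and writes one axial slice as a JPEG. The file name, data type, and matrix dimensions are illustrative assumptions and do not describe the actual SGI file format used by the GDN/CNPq software.

```python
import numpy as np
from PIL import Image

# Assumed geometry and dtype for illustration only; real phantom files differ.
n_slices, n_rows, n_cols = 220, 256, 256
voxels = np.fromfile("phantom.raw", dtype=np.uint8).reshape(n_slices, n_rows, n_cols)

# Pick the middle axial slice, rescale its values to 0-255, and save it as JPEG.
axial = voxels[n_slices // 2].astype(np.float32)
span = float(axial.max() - axial.min()) or 1.0
scaled = ((axial - axial.min()) * (255.0 / span)).astype(np.uint8)
Image.fromarray(scaled).save("slice.jpg", quality=90)
```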
Evaluating emissions of HCHO, HONO, NO2, and SO2 from point sources using portable Imaging DOAS
NASA Astrophysics Data System (ADS)
Pikelnaya, O.; Tsai, C.; Herndon, S. C.; Wood, E. C.; Fu, D.; Lefer, B. L.; Flynn, J. H.; Stutz, J.
2011-12-01
Our ability to quantitatively describe urban air pollution to a large extent depends on an accurate understanding of anthropogenic emissions. In areas with a high density of individual point sources of pollution, such as petrochemical facilities with multiple flares or regions with active commercial ship traffic, this is particularly challenging as access to facilities and ships is often restricted. Direct formaldehyde emissions from flares may play an important role for ozone chemistry, acting as an initial radical precursor and enhancing the degradation of co-emitted hydrocarbons. HONO is also recognized as an important OH source throughout the day. However, very little is known about direct HCHO and HONO emissions. Imaging Differential Optical Absorption Spectroscopy (I-DOAS), a relatively new remote sensing technique, provides an opportunity to investigate emissions from these sources from a distance, making this technique attractive for fence-line monitoring. In this presentation, we will describe I-DOAS measurements during the FLAIR campaign in the spring/summer of 2009. We performed measurements outside of various industrial facilities in the larger Houston area as well as in the Houston Ship Channel to visualize and quantify the emissions of HCHO, NO2, HONO, and SO2 from flares of petrochemical facilities and ship smoke stacks. We will present the column density images of pollutant plumes as well as fluxes from individual flares calculated from I-DOAS observations. Fluxes from individual flares and smoke stacks determined from the I-DOAS measurements vary widely in time and by the emission sources. We will also present HONO/NOx ratios in ship smoke stacks derived from the combination of I-DOAS and in-situ measurements, and discuss other trace gas ratios in plumes derived from the I-DOAS observations. Finally, we will show images of HCHO, NO2 and SO2 plumes from control burn forest fires observed in November of 2009 at Vandenberg Air Force Base, Santa Maria, CA.
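Fluxes such as those reported from the I-DOAS images are commonly estimated by integrating the retrieved column densities across a plume transect and multiplying by the wind speed. The numpy sketch below uses synthetic numbers, not FLAIR data, and the SO2 conversion is only an example of the unit handling.

```python
import numpy as np

# Synthetic column densities across one plume transect [molecules/cm^2].
columns = np.array([0.2, 0.8, 2.5, 4.0, 2.7, 0.9, 0.3]) * 1e16
pixel_width_cm = 500 * 100        # assumed 500 m ground pixel, expressed in cm
wind_speed_cm_s = 4.0 * 100       # assumed 4 m/s wind perpendicular to the transect

# Flux = integral of column density across the plume, times wind speed.
flux_molec_per_s = columns.sum() * pixel_width_cm * wind_speed_cm_s

AVOGADRO = 6.022e23
MOLAR_MASS_SO2 = 64.066           # g/mol, if the retrieved species were SO2
flux_g_per_s = flux_molec_per_s / AVOGADRO * MOLAR_MASS_SO2
print(f"{flux_g_per_s:.1f} g/s of SO2 (synthetic example)")
```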
Sources of dioxins in the United Kingdom: the steel industry and other sources.
Anderson, David R; Fisher, Raymond
2002-01-01
Several countries have compiled national inventories of dioxin (polychlorinated dibenzo-p-dioxin [PCDD] and polychlorinated dibenzofuran [PCDF]) releases that detail annual mass emission estimates for regulated sources. High temperature processes, such as commercial waste incineration and iron ore sintering used in the production of iron and steel, have been identified as point sources of dioxins. Other important releases of dioxins are from various diffuse sources such as bonfire burning and domestic heating. The PCDD/F inventory for emissions to air in the UK has decreased significantly from 1995 to 1998 because of reduced emissions from waste incinerators which now generally operate at waste gas stack emissions of 1 ng I-TEQ/Nm3 or below. The iron ore sintering process is the only noteworthy source of PCDD/Fs at integrated iron and steelworks operated by Corus (formerly British Steel plc) in the UK. The mean waste gas stack PCDD/F concentration for this process is 1,2 ng I-TEQ/Nm3 based on 94 measurements and it has been estimated that this results in an annual mass release of approximately 38 g I-TEQ per annum. Diffuse sources now form a major contribution to the UK inventory as PCDD/Fs from regulated sources have decreased, for example, the annual celebration of Bonfire Night on 5th November in the UK causes an estimated release of 30 g I-TEQ, similar to that emitted by five sinter plants in the UK.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steger, J.L.; Bursey, J.T.; Merrill, R.G.
1999-03-01
This report presents the results of laboratory studies to develop and evaluate a method for the sampling and analysis of phosgene from stationary sources of air emissions using diethylamine (DEA) in toluene as the collection media. The method extracts stack gas from emission sources and stabilizes the reactive gas for subsequent analysis. DEA was evaluated both in a benchtop study and in a laboratory train spiking study. This report includes results for both the benchtop study and the train spiking study. Benchtop studies to evaluate the suitability of DEA for collecting and analyzing phosgene investigated five variables: storage time, DEAmore » concentration, moisture/pH, phosgene concentration, and sample storage temperature. Prototype sampling train studies were performed to determine if the benchtop chemical studies were transferable to a Modified Method 5 sampling train collecting phosgene in the presence of clean air mixed with typical stack gas components. Four conditions, which varied the moisture and phosgene spike were evaluated in triplicate. In addition to research results, the report includes a detailed draft method for sampling and analysis of phosgene from stationary source emissions.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, Xiangfang; Lancelle, Chelsea; Thurber, Clifford
A field test conducted at Garner Valley, California, on 11 and 12 September 2013 using distributed acoustic sensing (DAS) to sense ground vibrations provided a continuous overnight record of ambient noise. The energy of the ambient noise was concentrated between 5 and 25 Hz, which falls into the typical traffic noise frequency band. A standard procedure (Bensen et al., 2007) was adopted to calculate noise cross-correlation functions (NCFs) for 1-min intervals. The 1-min-long NCFs were stacked using the time–frequency domain phase-weighted-stacking method, which significantly improves signal quality. The obtained NCFs were asymmetrical, a result of the nonuniformly distributed noise sources. A precursor appeared on NCFs along one segment, which was traced to a strong localized noise source or a scatterer at a nearby road intersection. The NCF for the radial component of two surface accelerometers along a DAS profile gave similar results to those from DAS channels. We calculated the phase velocity dispersion from the DAS NCFs using the multichannel analysis of surface waves technique, and the result agrees with active-source results. We conclude that ambient noise sources and the high spatial sampling of DAS can provide the same subsurface information as traditional active-source methods.
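The core of the ambient-noise processing can be sketched in a few lines of numpy: cross-correlate short noise windows recorded on two channels and stack the results. A plain linear stack is used here for brevity, whereas the study used time-frequency phase-weighted stacking; the sampling rate, delay, and noise field below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, minutes = 50, 20                     # 50 Hz sampling, 20 one-minute windows (synthetic)
n = fs * 60
lag_samples = 25                         # true inter-channel travel time: 0.5 s

ncf_stack = np.zeros(2 * n - 1)
for _ in range(minutes):
    noise = rng.normal(size=n + lag_samples)
    ch_a = noise[:n]                              # channel A records the noise field
    ch_b = noise[lag_samples:lag_samples + n]     # channel B sees the same field shifted
    ncf = np.correlate(ch_a, ch_b, mode="full")   # one 1-min NCF
    ncf_stack += ncf / minutes                    # linear stack of the 1-min NCFs

lags = np.arange(-(n - 1), n) / fs
print(f"NCF peak at lag {lags[np.argmax(ncf_stack)]:.2f} s (expected 0.50 s)")
```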
Energy Savings Potential of SSL in Horticultural Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stober, Kelsey; Lee, Kyung; Yamada, Mary
Report that presents the findings for horticultural lighting applications where light-emitting diode (LED) products are now competing with traditional light sources. The three main categories of indoor horticulture were investigated: supplemented greenhouses, which use electric lighting to extend the hours of daylight, supplement low levels of sunlight on cloudy days, or disrupt periods of darkness to alter plant growth; non-stacked indoor farms, where plants are grown in a single layer on the floor under ceiling-mounted lighting; and vertical farms, where plants are stacked along vertical shelving to maximize grow space, and the lighting is typically mounted within the shelving units.
Imaging the Lower Crust and Moho Beneath Long Beach, CA Using Autocorrelations
NASA Astrophysics Data System (ADS)
Clayton, R. W.
2017-12-01
Three-dimensional images of the lower crust and Moho in a 10x10 km region beneath Long Beach, CA are constructed from autocorrelations of ambient noise. The results show the Moho at a depth of 15 km at the coast, dipping at 45 degrees inland to a depth of 25 km. The shape of the Moho interface is irregular in both the coast-perpendicular and coast-parallel directions. The lower crust appears as a zone of enhanced reflectivity with numerous small-scale structures. The autocorrelations are constructed from virtual source gathers computed from the dense Long Beach array data used in the Lin et al. (2013) study. All near-zero-offset traces within a 200 m disk are stacked to produce a single autocorrelation at that point; the stack typically comprises 50-60 traces. To convert the autocorrelation to reflectivity, as in Claerbout (1968), the noise-source autocorrelation, estimated as the average of all autocorrelations, is subtracted from each trace. The subsurface image is then constructed with a 0.1-2 Hz filter and AGC scaling. The main features of the image are confirmed with broadband receiver functions from the LASSIE survey (Ma et al., 2016). The use of stacked autocorrelations extends ambient noise imaging into the lower crust.
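A hedged numpy sketch of the autocorrelation steps described above: autocorrelate each trace, stack the traces within a disk, and subtract the array-average autocorrelation as an estimate of the noise-source term. The traces are synthetic white noise, and the 0.1-2 Hz filtering and AGC scaling used in the study are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
n_disks, traces_per_disk, n_samples = 6, 55, 1024   # ~50-60 traces per 200 m disk

def autocorr(x):
    # One-sided autocorrelation (zero and positive lags only).
    full = np.correlate(x, x, mode="full")
    return full[full.size // 2:]

# Stand-in ambient-noise records grouped by disk (synthetic white noise here).
records = rng.normal(size=(n_disks, traces_per_disk, n_samples))
acfs = np.apply_along_axis(autocorr, -1, records)    # shape: (disks, traces, lags)

disk_stacks = acfs.mean(axis=1)                      # one stacked ACF per 200 m disk
source_term = disk_stacks.mean(axis=0)               # array average ~ noise-source ACF
reflectivity = disk_stacks - source_term             # Claerbout-style reflectivity traces
print(reflectivity.shape)                            # (6, 1024): one trace per disk
```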
Tests of by-pass diodes at cryogenic temperatures for the KATRIN magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gil, W.; Bolz, H.; Jansen, A.
The Karlsruhe Tritium Neutrino experiment (KATRIN) requires a series of superconducting solenoid magnets for guiding beta-electrons from the source to the detector. By-pass diodes will operate at liquid helium temperatures to protect the superconducting magnets and bus bars in case of quenches. The operation conditions of the by-pass diodes depend on the different magnet systems of KATRIN. Therefore, different diode stacks are designed with adequate copper heat sinks assuming adiabatic conditions. The by-pass diode stacks have been submitted to cold tests at both liquid nitrogen and liquid helium temperatures for checking operation conditions. This report presents the test setup and first results of the diode characteristics at 300 K and 77 K, as well as of endurance tests of the diode stacks at constant current load at 77 K and 4.2 K.
Regenerative Fuel Cell Test Rig at Glenn Research Center
NASA Technical Reports Server (NTRS)
Chang, Bei-Jiann; Johnson, Donald W.; Garcia, Christopher P.; Jakupca, Ian J.; Scullin, Vincent J.; Bents, David J.
2003-01-01
The regenerative fuel cell development effort at Glenn Research Center (GRC) involves the integration of a dedicated fuel cell and electrolyzer into an energy storage system test rig. The test rig consists of a fuel cell stack, an electrolysis stack, cooling pumps, a water transfer pump, gas recirculation pumps, phase separators, storage tanks for oxygen (O2) and hydrogen (H2), heat exchangers, isolation valves, pressure regulators, interconnecting tubing, nitrogen purge provisions, and instrumentation for control and monitoring purposes. The regenerative fuel cell (RFC) thus formed is a completely closed system which is capable of autonomous cyclic operation. The test rig provides direct current (DC) load and DC power supply to simulate power consumption and solar power input. In addition, chillers are used as the heat sink to dissipate the waste heat from the electrochemical stack operation. Various vents and nitrogen (N2) sources are included in case inert purging is necessary to safe the RFC test rig.
ERIC Educational Resources Information Center
Kapor, Mitchell
2005-01-01
Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…
A Metadata Management Framework for Collaborative Review of Science Data Products
NASA Astrophysics Data System (ADS)
Hart, A. F.; Cinquini, L.; Mattmann, C. A.; Thompson, D. R.; Wagstaff, K.; Zimdars, P. A.; Jones, D. L.; Lazio, J.; Preston, R. A.
2012-12-01
Data volumes generated by modern scientific instruments often preclude archiving the complete observational record. To compensate, science teams have developed a variety of "triage" techniques for identifying data of potential scientific interest and marking it for prioritized processing or permanent storage. This may involve multiple stages of filtering with both automated and manual components operating at different timescales. A promising approach exploits a fast, fully automated first stage followed by a more reliable offline manual review of candidate events. This hybrid approach permits a 24-hour rapid real-time response while also preserving the high accuracy of manual review. To support this type of second-level validation effort, we have developed a metadata-driven framework for the collaborative review of candidate data products. The framework consists of a metadata processing pipeline and a browser-based user interface that together provide a configurable mechanism for reviewing data products via the web, and capturing the full stack of associated metadata in a robust, searchable archive. Our system heavily leverages software from the Apache Object Oriented Data Technology (OODT) project, an open source data integration framework that facilitates the construction of scalable data systems and places a heavy emphasis on the utilization of metadata to coordinate processing activities. OODT provides a suite of core data management components for file management and metadata cataloging that form the foundation for this effort. The system has been deployed at JPL in support of the V-FASTR experiment [1], a software-based radio transient detection experiment that operates commensally at the Very Long Baseline Array (VLBA), and has a science team that is geographically distributed across several countries. Daily review of automatically flagged data is a shared responsibility for the team, and is essential to keep the project within its resource constraints. We describe the development of the platform using open source software, and discuss our experience deploying the system operationally. [1] R. B. Wayth, W. F. Brisken, A. T. Deller, W. A. Majid, D. R. Thompson, S. J. Tingay, and K. L. Wagstaff, "V-FASTR: The VLBA Fast Radio Transients Experiment," The Astrophysical Journal, vol. 735, no. 2, p. 97, 2011. Acknowledgement: This effort was supported by the Jet Propulsion Laboratory, managed by the California Institute of Technology under a contract with the National Aeronautics and Space Administration.
Building a Snow Data System on the Apache OODT Open Technology Stack
NASA Astrophysics Data System (ADS)
Goodale, C. E.; Painter, T. H.; Mattmann, C. A.; Hart, A. F.; Ramirez, P.; Zimdars, P.; Bryant, A. C.; Snow Data System Team
2011-12-01
Snow cover and its melt dominate regional climate and hydrology in many of the world's mountainous regions. One-sixth of Earth's population depends on snow- or glacier-melt for water resources. Operationally, seasonal forecasts of snowmelt-generated streamflow are leveraged through empirical relations based on past snowmelt periods. These historical data show that climate is changing, but the changes reduce the reliability of the empirical relations. Therefore optimal future management of snowmelt derived water resources will require explicit physical models driven by remotely sensed snow property data. Toward this goal, the Snow Optics Laboratory at the Jet Propulsion Laboratory has initiated a near real-time processing pipeline to generate and publish post-processed snow data products within a few hours of satellite acquisition. To solve this challenge, a Scientific Data Management and Processing System was required and the JPL Team leveraged an open-source project called Object Oriented Data Technology (OODT). OODT was developed within NASA's Jet Propulsion Laboratory across the last 10 years. OODT has supported various scientific data management and processing projects, providing solutions in the Earth, Planetary, and Medical science fields. It became apparent that the project needed to be opened to a larger audience to foster and promote growth and adoption. OODT was open-sourced at the Apache Software Foundation in November 2010 and has a growing community of users and committers that are constantly improving the software. Leveraging OODT, the JPL Snow Data System (SnowDS) Team was able to install and configure a core Data Management System (DMS) that would download MODIS raw data files and archive the products in a local repository for post processing. The team has since built an online data portal, and an algorithm-processing pipeline using the Apache OODT software as the foundation. We will present the working SnowDS system with its core remote sensing components: the MODIS Snow Covered Area and Grain size model (MODSCAG) and the MODIS Dust Radiative Forcing in Snow (MOD-DRFS). These products will be delivered in near real time to water managers and the broader cryosphere and climate community beginning in Winter 2012. We will then present the challenges and opportunities we see in the future as the SnowDS matures and contributions are made back to the OODT project.
NASA Astrophysics Data System (ADS)
Petrie, C.; Margaria, T.; Lausen, H.; Zaremba, M.
Explores trade-offs among existing approaches. Reveals strengths and weaknesses of proposed approaches, as well as which aspects of the problem are not yet covered. Introduces software engineering approach to evaluating semantic web services. Service-Oriented Computing is one of the most promising software engineering trends because of the potential to reduce the programming effort for future distributed industrial systems. However, only a small part of this potential rests on the standardization of tools offered by the web services stack. The larger part of this potential rests upon the development of sufficient semantics to automate service orchestration. Currently there are many different approaches to semantic web service descriptions and many frameworks built around them. A common understanding, evaluation scheme, and test bed to compare and classify these frameworks in terms of their capabilities and shortcomings, is necessary to make progress in developing the full potential of Service-Oriented Computing. The Semantic Web Services Challenge is an open source initiative that provides a public evaluation and certification of multiple frameworks on common industrially-relevant problem sets. This edited volume reports on the first results in developing common understanding of the various technologies intended to facilitate the automation of mediation, choreography and discovery for Web Services using semantic annotations. Semantic Web Services Challenge: Results from the First Year is designed for a professional audience composed of practitioners and researchers in industry. Professionals can use this book to evaluate SWS technology for their potential practical use. The book is also suitable for advanced-level students in computer science.
Effect of ship-stack effluents on cloud reflectivity
NASA Technical Reports Server (NTRS)
Coakley, James A., Jr.; Bernstein, Robert L.; Durkee, Philip A.
1987-01-01
Under stable meteorological conditions the effect of ship-stack exhaust on overlying clouds was detected in daytime satellite images as an enhancement in cloud reflectivity at 3.7 micrometers. The exhaust is a source of cloud-condensation nuclei that increases the number of cloud droplets while reducing droplet size. This reduction in droplet size causes the reflectivity at 3.7 micrometers to be greater than the levels for nearby noncontaminated clouds of similar physical characteristics. The increase in droplet number causes the reflectivity at 0.63 micrometer to be significantly higher for the contaminated clouds despite the likelihood that the exhaust is a source of particles that absorb at visible wavelengths. The effect of aerosols on cloud reflectivity is expected to have a larger influence on the earth's albedo than that due to the direct scattering and absorption of sunlight by the aerosols alone.
Journal of Open Source Software (JOSS): design and first-year review
NASA Astrophysics Data System (ADS)
Smith, Arfon M.
2018-01-01
JOSS is a free and open-access journal that publishes articles describing research software across all disciplines. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate the scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. JOSS published more than 100 articles in its first year, many from the scientific Python ecosystem (including a number of articles related to astronomy and astrophysics). JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative. In this presentation, I will describe the motivation, design, and progress of the Journal of Open Source Software (JOSS) and how it compares to other avenues for publishing research software in astronomy.
Methodology and Software for Gross Defect Detection of Spent Nuclear Fuel at the Atucha-I Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sitaraman, Shivakumar; Ham, Young S.; Gharibyan, Narek
At the Atucha-I pressurized heavy water reactor in Argentina, fuel assemblies in the spent fuel pools are stored by suspending them in two vertically stacked layers. This introduces the unique problem of verifying the presence of fuel in either layer without physically moving the fuel assemblies. Since much of the fuel is very old, Cerenkov viewing devices are often not very useful even for the top layer. Given that the facility uses both natural uranium and slightly enriched uranium at 0.85 w% {sup 235}U, and has been in operation since 1974, a wide range of burnups and cooling times can exist in any given pool. A spent fuel neutron counting tool consisting of a fission chamber, SFNC, has been used at the site to verify the presence of fuel up to burnups of 8000 MWd/t. At higher discharge burnups, up to 11,000 MWd/t, the existing signal processing software of the tool was found to fail due to non-linearity of the source term with burnup. A new Graphical User Interface software package based on the LabVIEW platform was developed to predict expected neutron signals covering all ranges of burnups and cooling times and to establish maps of expected signals at various pool locations. The algorithm employed in the software uses a set of transfer functions in a 47-energy-group structure which are coupled with a 47-energy-group neutron source spectrum based on various cooling times and burnups for each of the two enrichment levels. The database of the software consists of these transfer functions for the three different inter-assembly pitches in which the fuel is stored at the site. The transfer functions were developed for a 6 by 6 matrix of fuel assemblies with the detector placed at the center, surrounded by four near neighbors, eight next-nearest neighbors, and so on for the 36 assemblies. These calculations were performed using Monte Carlo radiation transport methods. The basic methodology consisted of starting sources in each of the assemblies and tallying the contribution to the detector by a single neutron in each of the 47 energy groups used. Thus, for the single existing symmetric pitch in the pools, where the vertical and horizontal separations are equal, only 6 sets of transfer functions are required. For the two asymmetrical pitches, nine sets of transfer functions are stored. In addition, source spectra at burnups ranging from 4000 to 20000 MWd/t and cooling times up to 40 years are stored. These source terms were established based on CANDU 37-rod fuel that is very similar to the Atucha fuel. Linear interpolation is used by the software for both burnup and cooling time to establish source terms at any intermediate condition. Using the burnup, cooling time, and initial enrichment of the surrounding assemblies, a set of source strengths in the 47-group structure for each of the 36 assemblies is established and multiplied group-wise with the appropriate transfer function set. The grand total over the 47 groups for all 36 assemblies is the predicted signal at the detector. The software was initially calibrated against a set of typically 5-6 measurements chosen from among the measured data at each level of the six pools, and calibration factors were established. The set used for calibration is chosen such that it is fairly representative of the range of spent fuel assembly characteristics present in each level. Once established, these calibration factors can be repeatedly used for verification purposes. Recalibration will be required if the hardware or pool configuration has changed.
It will also be required if a long enough time has elapsed since they were established, making a cooling time correction necessary. The objective of the inspection is to detect missing fuel from one or more nearest neighbors of the detector. During the verification mode of the software, the predicted and measured signals are compared and the inspector is alerted if the difference between the two signals is beyond a set tolerance limit. Based on the uncertainties associated with both the calculations and the measurements, the lower limit of the tolerance will be 15% and the upper limit 20%. For the most part, a 20% tolerance limit will be able to detect a missing assembly, since in the vast majority of cases the drop in signal due to a single missing nearest-neighbor assembly will be in the range of 24-27%. The software was benchmarked against an extensive set of measured data taken at the site in 2004. Overall, 326 data points were examined and the predictions of the calibrated software were compared to the measurements within a set tolerance of ±20%. Of these, 283 of the predicted signals, representing 87% of the total, matched the measured data within ±10%. A further 27 (8%) were in the range of ±10-15%, and 8 (2.5%) were in the range of ±15-20%. Thus, 97.5% of the data matched the measurements within the set tolerance limit of 20%, with 95% matching the measured data within the lowest allowed tolerance limit of ±15%. The remaining 2.5% had measured signals that were very different from those at locations with very similar surrounding assemblies, and the cause of these discrepancies could not be ascertained from the measurement logs. In summary, 97.5% of the predictions matched the measurements within the set 20% tolerance limit, providing proof of the robustness of the software. This software package, linked to SFNC, will be deployed at the site and will enhance the capability of gross defect verification for the whole range of burnup, cooling time, and initial enrichment of the spent fuel being discharged into the various pools at the Atucha-I reactor site.
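The prediction arithmetic described above can be summarized in a short, hedged numpy sketch: multiply each neighbouring assembly's 47-group source spectrum by its stored transfer function group by group, sum over groups and assemblies, apply the calibration factor, and compare against the measurement within the tolerance band. All numbers below are synthetic placeholders, not Atucha data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_assemblies, n_groups = 36, 47

# Stored per-assembly transfer functions: detector response per source neutron,
# by energy group (synthetic stand-ins for the Monte Carlo-derived values).
transfer = rng.uniform(1e-8, 1e-6, size=(n_assemblies, n_groups))

# 47-group source spectra of the surrounding assemblies, interpolated to each
# assembly's burnup and cooling time (synthetic here).
sources = rng.uniform(1e3, 1e5, size=(n_assemblies, n_groups))

calibration = 0.92                                   # hypothetical calibration factor
predicted = calibration * np.sum(transfer * sources)  # group-wise product, grand total

measured = 1.1 * predicted                           # placeholder measured signal
tolerance = 0.20                                     # 15-20% tolerance band
alert = abs(measured - predicted) / predicted > tolerance
print(f"predicted={predicted:.3e}, measured={measured:.3e}, alert={alert}")
```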
Second Harmonic Generation Imaging Analysis of Collagen Arrangement in Human Cornea.
Park, Choul Yong; Lee, Jimmy K; Chuck, Roy S
2015-08-01
To describe the horizontal arrangement of human corneal collagen bundles using second harmonic generation (SHG) imaging, human corneas were imaged with an inverted two-photon excitation fluorescence microscope. The excitation laser (Ti:Sapphire) was tuned to 850 nm. Backscatter SHG signals were collected through a 425/30-nm bandpass emission filter. Multiple, consecutive, and overlapping image stacks (z-stacks) were acquired to generate three-dimensional data sets. ImageJ software was used to analyze the arrangement pattern (irregularity) of collagen bundles in each image plane. Collagen bundles in the corneal lamellae demonstrated a complex layout, merging and splitting within a single lamellar plane. The patterns were significantly different in the superficial and limbal cornea when compared with the deep and central regions. Collagen bundles were smaller in the superficial layer and larger in the deep lamellae. Using SHG imaging, the horizontal arrangement of corneal collagen bundles was elucidated at different depths and focal regions of the human cornea.
Enabling IPv6 at FZU - WLCG Tier2 in Prague
NASA Astrophysics Data System (ADS)
Kouba, Tomáš; Chudoba, Jiří; Eliáš, Marek
2014-06-01
The usage of the new IPv6 protocol in production is becoming a reality in the HEP community, and the Computing Centre of the Institute of Physics in Prague participates in many IPv6-related activities. Our contribution presents experience with monitoring in the HEPiX distributed IPv6 testbed, which includes 11 remote sites. We use Nagios to check the availability of services and Smokeping for monitoring the network latency. Since it is not always trivial to set up DNS properly in a dual-stack environment, we developed a Nagios plugin for checking whether a domain name is resolvable when using only IP protocol version 6 and only version 4. We will also present local area network monitoring and tuning related to IPv6 performance. One of the most important pieces of software for a grid site is the batch system used for job execution. We present our experience with configuring and running the Torque batch system in a dual-stack environment. We also discuss the steps needed to run VO-specific jobs in our IPv6 testbed.
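A hedged Python sketch of what such a dual-stack resolvability check amounts to: resolve the same name restricted to IPv6 and then to IPv4, and exit with Nagios-style status codes. The hostname is a placeholder, and this is not the actual plugin's code.

```python
import socket
import sys

def resolvable(hostname, family):
    """Return True if hostname resolves to at least one address of the given family."""
    try:
        return len(socket.getaddrinfo(hostname, None, family)) > 0
    except socket.gaierror:
        return False

host = sys.argv[1] if len(sys.argv) > 1 else "www.example.org"   # placeholder host
ok_v6 = resolvable(host, socket.AF_INET6)
ok_v4 = resolvable(host, socket.AF_INET)

if ok_v6 and ok_v4:
    print(f"OK - {host} resolves over both IPv6 and IPv4")
    sys.exit(0)          # Nagios: 0 = OK
print(f"CRITICAL - {host}: IPv6={'ok' if ok_v6 else 'FAIL'}, IPv4={'ok' if ok_v4 else 'FAIL'}")
sys.exit(2)              # Nagios: 2 = CRITICAL
```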
Managing Digital Archives Using Open Source Software Tools
NASA Astrophysics Data System (ADS)
Barve, S.; Dongare, S.
2007-10-01
This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. The paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative to implement an institutional repository using the DSpace open source software.
The role of open-source software in innovation and standardization in radiology.
Erickson, Bradley J; Langer, Steve; Nagy, Paul
2005-11-01
The use of open-source software (OSS), in which developers release the source code to applications they have developed, is popular in the software industry. This is done to allow others to modify and improve software (which may or may not be shared back to the community) and to allow others to learn from the software. Radiology was an early participant in this model, supporting OSS that implemented the ACR-National Electrical Manufacturers Association (now Digital Imaging and Communications in Medicine) standard for medical image communications. In radiology and in other fields, OSS has promoted innovation and the adoption of standards. Popular OSS is of high quality because access to source code allows many people to identify and resolve errors. Open-source software is analogous to the peer-review scientific process: one must be able to see and reproduce results to understand and promote what is shared. The authors emphasize that support for OSS need not threaten vendors; most vendors embrace and benefit from standards. Open-source development does not replace vendors but more clearly defines their roles, typically focusing on areas in which proprietary differentiators benefit customers and on professional services such as implementation planning and service. Continued support for OSS is essential for the success of our field.
NASA Astrophysics Data System (ADS)
Mayorga, E.
2013-12-01
Practical, problem oriented software developed by scientists and graduate students in domains lacking a strong software development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower level languages such as C or FORTRAN, and access to web service infrastructure. Python in particular has seen a large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and mechanistic processes that are well characterized, are both factors that have often led to the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (River Tools, Taudem, etc), and geochemical modules such as CO2SYS, PHREEQ and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. Collaborative development of such tools can provide the additional advantage of enhancing cohesion and communication across specific research areas, and reducing research obstacles in a range of disciplines.
Authorship Attribution of Source Code
ERIC Educational Resources Information Center
Tennyson, Matthew F.
2013-01-01
Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…
Open Source Software Development
2011-01-01
Four simple recommendations to encourage best practices in research software
Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve
2017-01-01
Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965
Open Source Software and the Intellectual Commons.
ERIC Educational Resources Information Center
Dorman, David
2002-01-01
Discusses the Open Source Software method of software development and its relationship to control over information content. Topics include digital library resources; reference services; preservation; the legal and economic status of information; technical standards; access to digital data; control of information use; and copyright and patent laws.…
Improving Software Sustainability: Lessons Learned from Profiles in Science.
Gallagher, Marie E
2013-01-01
The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment. Sometimes these changes happen on short notice, so we continually monitor our library's software for signs of endangerment. We have attempted to replace proprietary software with suitable in-house or open source software. When the replacement involves a standalone piece of software with a nearly equivalent version, such as replacing a commercial HTTP server with an open source HTTP server, the replacement is straightforward. Recently we replaced software that functioned not only as our search engine but also as the backbone of the architecture of our Web site. In this paper, we describe the lessons learned and the pros and cons of replacing this software with open source software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Yeong-Shyung; Choi, Jung-Pyung; Stevenson, Jeffry W.
In addition to developing passive means for Cr mitigation via coatings, Pacific Northwest National Laboratory has teamed up with the University of Connecticut to adopt an active approach by employing a novel Cr-getter material in the system. In this work, validation of the novel Cr-getter was conducted using cells in a generic stack test condition with humidified air and a coated metallic interconnect. Two Cr-getter locations were investigated: one upstream and one "on cell." Pre-oxidized AISI 441 metal stripes were used as the Cr source. Three single-cell tests were conducted at 800 °C in constant-current mode for 1000 h, with periodic stops for measurement of impedance and I-V characteristics: a baseline cell, a cell with Cr source and getter, and a cell with Cr source but no getter. Results showed that the cell with the Cr-getter degraded much more slowly (11.5% kh-1) than the baseline (15.3% kh-1) and the cell without the getter (56% kh-1).
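The degradation rates above are quoted in % kh-1, i.e., percent loss per 1000 hours of operation. A short worked example of the unit follows; the voltages are hypothetical illustrations, not values from the report.

```python
# Degradation rate in % per 1000 h (% kh-1), the unit quoted above.
# Voltages below are hypothetical illustrations, not values from the report.
v_start = 0.800   # cell voltage at start of the constant-current hold [V]
v_end   = 0.788   # cell voltage at the end of the hold [V]
hours   = 1000.0  # duration of the hold [h]

rate_percent_per_kh = (v_start - v_end) / v_start * 100.0 / (hours / 1000.0)
print(f"degradation: {rate_percent_per_kh:.1f} % kh-1")  # ~1.5 % kh-1 for these numbers
```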
Use of Continuous Integration Tools for Application Performance Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vergara Larrea, Veronica G; Joubert, Wayne; Fuson, Christopher B
High performance computing systems are becoming increasingly complex, both in node architecture and in the multiple layers of software stack required to compile and run applications. As a consequence, the likelihood is increasing for application performance regressions to occur as a result of routine upgrades of system software components which interact in complex ways. The purpose of this study is to evaluate the effectiveness of continuous integration tools for application performance monitoring on HPC systems. In addition, this paper also describes a prototype system for application performance monitoring based on Jenkins, a Java-based continuous integration tool. The monitoring system described leverages several features in Jenkins to track application performance results over time. Preliminary results and lessons learned from monitoring applications on Cray systems at the Oak Ridge Leadership Computing Facility are presented.
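A continuous integration job of the kind described above typically runs a benchmark and compares its performance against a stored baseline, failing the build when a regression exceeds a tolerance. A minimal sketch of such a check follows; the file paths, metric name, and 10% tolerance are hypothetical assumptions, not the OLCF implementation.

```python
# Hypothetical performance-regression check a CI job could run after a benchmark.
# Paths, metric name, and the 10% tolerance are illustrative assumptions.
import json
import sys

TOLERANCE = 0.10  # fail if runtime grows more than 10% over the stored baseline

with open("baseline.json") as f:        # e.g. {"runtime_s": 123.4}
    baseline = json.load(f)["runtime_s"]
with open("latest_run.json") as f:
    latest = json.load(f)["runtime_s"]

slowdown = (latest - baseline) / baseline
print(f"runtime {latest:.1f}s vs baseline {baseline:.1f}s ({slowdown:+.1%})")

# A non-zero exit code marks the CI build as failed, flagging the regression.
sys.exit(1 if slowdown > TOLERANCE else 0)
```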
NASA Astrophysics Data System (ADS)
McLaurin, Melvin Barker
2007-12-01
The group-III nitrides exhibit significant spontaneous and piezoelectric polarization parallel to the [0001] direction, which is manifested as sheet charges at heterointerfaces. While polarization can be used to engineer the band structure of a device, internal electric fields generated by polarization discontinuities can also have a number of negative consequences for the performance and design of structures utilizing heterojunctions. The most direct route to polarization-free group-III nitride devices is growth on either one of the "non-polar" prismatic faces of the crystal (m-plane (10-10) or a-plane (11-20)), where the [0001] direction lies in the plane of any heterointerfaces. This dissertation focuses on the growth of non-polar and semi-polar GaN by MBE and on how the dominant feature of the defect structure of non-polar and semi-polar films, basal plane stacking faults, determines the properties of the reciprocal lattice and the electrical transport of the films. The first part is a survey of the MBE growth of the two non-polar planes, (10-10) and (11-20), and three semi-polar planes, (10-11), (10-13) and {11-22}, investigated in this work. The relationship between basal plane stacking faults and broadening of the reciprocal lattice is discussed and measured with X-ray diffraction using a lateral variant of the Williamson-Hall analysis. The electrical properties of m-plane films are investigated using Hall-effect and TLM measurements. Anisotropic mobilities were observed for both electrons and holes, along with record p-type conductivities and hole concentrations. By comparison to both inversion-domain-free c-plane films and stacking-fault-free free-standing m-plane GaN wafers, it was determined that basal plane stacking faults were the source of both the enhanced p-type conductivity and the anisotropic carrier mobilities. Finally, a possible source of the anisotropic mobilities and enhanced p-type conduction in faulted films is proposed: basal plane stacking faults are treated as heterostructures of the wurtzite and zincblende polytypes of GaN. The band-parameter and polarization differences between the polytypes result in large offsets in both the conduction and valence band edges at the stacking faults. The anisotropy results from scattering at these band-edge offsets, and the enhanced mobility from screening due to charge accumulation at the offsets.
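For reference, the conventional Williamson-Hall relation separates size and strain contributions to diffraction-peak broadening; the lateral variant used in the dissertation adapts this analysis to stacking-fault broadening of the reciprocal lattice, and only the standard form, not that variant, is reproduced below.

```latex
% Conventional Williamson--Hall relation (standard form; the lateral variant
% used in the dissertation is an adaptation of this analysis).
\beta \cos\theta = \frac{K\lambda}{D} + 4\,\varepsilon \sin\theta
% \beta: integral breadth of the reflection, \theta: Bragg angle,
% \lambda: X-ray wavelength, K: shape factor (approximately 0.9),
% D: coherent domain size, \varepsilon: microstrain.
```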
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.
Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Hammond, G. E.
2017-12-01
Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing their negative side effects? This presentation will discuss steps taken in the development of PFLOTRAN (www.pflotran.org), an open-source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, and the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
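Automated testing of the kind mentioned above often takes the form of regression tests that compare simulation output against stored "gold" results. A minimal sketch of such a test follows; the file names, variable, and tolerances are hypothetical, and this is not PFLOTRAN's actual test harness.

```python
# Hypothetical regression test: compare a simulation output against a stored
# "gold" reference within a tolerance. File names, variable, and tolerances are
# illustrative; this is not PFLOTRAN's actual test harness.
import numpy as np

def test_tracer_breakthrough_matches_gold():
    gold = np.loadtxt("gold/tracer_breakthrough.txt")    # stored reference run
    new = np.loadtxt("output/tracer_breakthrough.txt")   # current run
    assert new.shape == gold.shape
    assert np.allclose(new, gold, rtol=1e-6, atol=1e-12), \
        "tracer breakthrough curve drifted from the gold standard"
```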
BioContainers: an open-source and community-driven framework for software standardization.
da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset
2017-08-15
BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source container frameworks Docker and rkt, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.
Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.
Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D
2016-12-01
Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.
The NIH 3D Print Exchange: A Public Resource for Bioscientific and Biomedical 3D Prints.
Coakley, Meghan F; Hurt, Darrell E; Weber, Nick; Mtingwa, Makazi; Fincher, Erin C; Alekseyev, Vsevelod; Chen, David T; Yun, Alvin; Gizaw, Metasebia; Swan, Jeremy; Yoo, Terry S; Huyen, Yentram
2014-09-01
The National Institutes of Health (NIH) has launched the NIH 3D Print Exchange, an online portal for discovering and creating bioscientifically relevant 3D models suitable for 3D printing, to provide both researchers and educators with a trusted source to discover accurate and informative models. There are a number of online resources for 3D prints, but there is a paucity of scientific models, and the expertise required to generate and validate such models remains a barrier. The NIH 3D Print Exchange fills this gap by providing novel, web-based tools that empower users with the ability to create ready-to-print 3D files from molecular structure data, microscopy image stacks, and computed tomography scan data. The NIH 3D Print Exchange facilitates open data sharing in a community-driven environment, and also includes various interactive features, as well as information and tutorials on 3D modeling software. As the first government-sponsored website dedicated to 3D printing, the NIH 3D Print Exchange is an important step forward to bringing 3D printing to the mainstream for scientific research and education.
Plasma boundary shape control and real-time equilibrium reconstruction on NSTX-U
Boyer, M. D.; Battaglia, D. J.; Mueller, D.; ...
2018-01-25
Here, the upgrade to the National Spherical Torus eXperiment (NSTX-U) included two main improvements: a larger center-stack, enabling higher toroidal field and longer pulse duration, and the addition of three new tangentially aimed neutral beam sources, which increase available heating and current drive, and allow for flexibility in shaping power, torque, current, and particle deposition profiles. To best use these new capabilities and meet the high-performance operational goals of NSTX-U, major upgrades to the NSTX-U control system (NCS) hardware and software have been made. Several control algorithms, including those used for real-time equilibrium reconstruction and shape control, have been upgraded to improve and extend plasma control capabilities. As part of the commissioning phase of first plasma operations, the shape control system was tuned to control the boundary in both inner-wall limited and diverted discharges. It has been used to accurately track the requested evolution of the boundary (including the size of the inner gap between the plasma and central solenoid, which is a challenge for the ST configuration), X-point locations, and strike point locations, enabling repeatable discharge evolutions for scenario development and diagnostic commissioning.
Plasma boundary shape control and real-time equilibrium reconstruction on NSTX-U
NASA Astrophysics Data System (ADS)
Boyer, M. D.; Battaglia, D. J.; Mueller, D.; Eidietis, N.; Erickson, K.; Ferron, J.; Gates, D. A.; Gerhardt, S.; Johnson, R.; Kolemen, E.; Menard, J.; Myers, C. E.; Sabbagh, S. A.; Scotti, F.; Vail, P.
2018-03-01
The upgrade to the National Spherical Torus eXperiment (NSTX-U) included two main improvements: a larger center-stack, enabling higher toroidal field and longer pulse duration, and the addition of three new tangentially aimed neutral beam sources, which increase available heating and current drive, and allow for flexibility in shaping power, torque, current, and particle deposition profiles. To best use these new capabilities and meet the high-performance operational goals of NSTX-U, major upgrades to the NSTX-U control system (NCS) hardware and software have been made. Several control algorithms, including those used for real-time equilibrium reconstruction and shape control, have been upgraded to improve and extend plasma control capabilities. As part of the commissioning phase of first plasma operations, the shape control system was tuned to control the boundary in both inner-wall limited and diverted discharges. It has been used to accurately track the requested evolution of the boundary (including the size of the inner gap between the plasma and central solenoid, which is a challenge for the ST configuration), X-point locations, and strike point locations, enabling repeatable discharge evolutions for scenario development and diagnostic commissioning.
Supporting Source Code Comprehension during Software Evolution and Maintenance
ERIC Educational Resources Information Center
Alhindawi, Nouh
2013-01-01
This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…
NASA Astrophysics Data System (ADS)
Fortin, W.; Holbrook, W. S.; Mallick, S.; Everson, E. D.; Tobin, H. J.; Keranen, K. M.
2014-12-01
Understanding the geologic composition of the Cascadia Subduction Zone (CSZ) is critically important in assessing seismic hazards in the Pacific Northwest. Although the CSZ poses a potential earthquake and tsunami threat to millions of people, key details of its structure and fault mechanisms remain poorly understood. In particular, the position and character of the subduction interface remain elusive due to its relative aseismicity and low seismic reflectivity, making imaging difficult for both passive and active source methods. Modern active-source reflection seismic data acquired as part of the COAST project in 2012 provide an opportunity to study the transition from the Cascadia basin, across the deformation front, and into the accretionary prism. Coupled with advances in seismic inversion methods, these new data allow us to produce detailed velocity models of the CSZ and accurate pre-stack depth migrations for studying geologic structure. While such inversions remain computationally expensive, current computing clusters can perform them at resolutions that match that of the seismic image itself. Here we present pre-stack full waveform inversions of the central seismic line of the COAST survey offshore Washington state. The resultant velocity model is produced by inversion at every CMP location, 6.25 m laterally, with vertical resolution of 0.2 times the wavelength at the dominant seismic frequency. We report a good average correlation value above 0.8 across the entire seismic line, determined by comparing synthetic gathers to the real pre-stack gathers. These detailed velocity models, both Vp and Vs, along with the density model, are a necessary step toward a detailed porosity cross section to be used to determine the role of fluids in the CSZ. Additionally, the P-velocity model is used to produce a pre-stack depth migration image of the CSZ.
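The "correlation value above 0.8" quoted above measures agreement between synthetic and observed pre-stack gathers. A minimal sketch of a zero-lag normalized cross-correlation between two gathers follows; the arrays are random stand-ins for real data, and the actual study's correlation measure may differ in detail.

```python
# Normalized zero-lag cross-correlation between a synthetic and an observed
# gather (traces x samples). Arrays here are random stand-ins for real data.
import numpy as np

rng = np.random.default_rng(0)
observed  = rng.standard_normal((48, 2000))            # hypothetical pre-stack gather
synthetic = observed + 0.3 * rng.standard_normal((48, 2000))

def gather_correlation(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b)))

print(f"correlation: {gather_correlation(observed, synthetic):.2f}")
```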
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-26
... chemical reactions in the atmosphere. The level of impact a new source can have on ozone levels is... condensing outside the stack or through chemical reactions with pollutants already in the atmosphere...
Infrasound Generation from the HH Seismic Hammer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Kyle Richard
2014-10-01
The HH Seismic Hammer is a large "weight-drop" source for active-source seismic experiments. This system provides a repetitive source that can be stacked for subsurface imaging and exploration studies. Although the seismic hammer was designed for seismological studies, it was surmised that it might produce energy in the infrasonic frequency range due to the ground motion generated by the 13 metric ton drop mass. This study demonstrates that the seismic hammer generates a consistent acoustic source that could be used for in-situ sensor characterization, array evaluation, and surface-air coupling studies for source characterization.
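"Stacking" the repeatable hammer shots means aligning and averaging the recorded traces so that coherent signal grows relative to incoherent noise. A minimal sketch with synthetic traces follows; the signal shape, noise level, and array sizes are illustrative assumptions, not hammer data.

```python
# Stack repeated, time-aligned shots to improve signal-to-noise ratio.
# The synthetic signal and noise below are illustrative, not real hammer data.
import numpy as np

rng = np.random.default_rng(1)
n_shots, n_samples = 64, 4000
t = np.arange(n_samples)
signal = np.exp(-((t - 1000) / 50.0) ** 2)                 # common wavelet in every shot
records = signal + 1.0 * rng.standard_normal((n_shots, n_samples))

stacked = records.mean(axis=0)   # coherent signal preserved, noise reduced ~1/sqrt(N)
print(f"single-shot SNR ~ {signal.max() / records[0].std():.2f}, "
      f"stacked SNR ~ {signal.max() / (stacked - signal).std():.2f}")
```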
DOE Office of Scientific and Technical Information (OSTI.GOV)
Annette Rohr
2006-03-01
TERESA (Toxicological Evaluation of Realistic Emissions of Source Aerosols) involves exposing laboratory rats to realistic coal-fired power plant and mobile source emissions to help determine the relative toxicity of these PM sources. There are three coal-fired power plants in the TERESA program; this report describes the results of fieldwork conducted at the first plant, located in the Upper Midwest. The project was technically challenging by virtue of its novel design and requirement for the development of new techniques. By examining aged, atmospherically transformed aerosol derived from power plant stack emissions, we were able to evaluate the toxicity of PM derived from coal combustion in a manner that more accurately reflects the exposure of concern than existing methodologies. TERESA also involves assessment of actual plant emissions in a field setting--an important strength since it reduces the question of representativeness of emissions. A sampling system was developed and assembled to draw emissions from the stack; stack sampling conducted according to standard EPA protocol suggested that the sampled emissions are representative of those exiting the stack into the atmosphere. Two mobile laboratories were then outfitted for the study: (1) a chemical laboratory in which the atmospheric aging was conducted and which housed the bulk of the analytical equipment; and (2) a toxicological laboratory, which contained animal caging and the exposure apparatus. Animal exposures were carried out from May-November 2004 to a number of simulated atmospheric scenarios. Toxicological endpoints included (1) pulmonary function and breathing pattern; (2) bronchoalveolar lavage fluid cytological and biochemical analyses; (3) blood cytological analyses; (4) in vivo oxidative stress in heart and lung tissue; and (5) heart and lung histopathology. Results indicated no differences between exposed and control animals in any of the endpoints examined. Exposure concentrations for the scenarios utilizing secondary particles (oxidized emissions) ranged from 70-256 µg/m³, and some of the atmospheres contained high acidity levels (up to 49 µg/m³ equivalent of sulfuric acid). However, caution must be used in generalizing these results to other power plants utilizing different coal types and with different plant configurations, as the emissions may vary based on these factors.
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming...video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to...DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains
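The general idea described here, comparing a captured frame against a reference image and locating the new impact, can be sketched with OpenCV. The study used OpenCV from C++; the sketch below is in Python for brevity, and the synthetic frames, thresholds, and blob-size cutoff are hypothetical assumptions rather than the study's actual processing.

```python
# Illustrative frame-differencing shot detector in the spirit of the approach
# described above (the study used OpenCV from C++; shown in Python for brevity).
# The frames are synthetic stand-ins; thresholds are hypothetical.
import cv2
import numpy as np

reference = np.full((480, 640), 200, dtype=np.uint8)       # blank target frame
current = reference.copy()
cv2.circle(current, (320, 250), 6, 0, -1)                   # simulated bullet hole

diff = cv2.absdiff(current, reference)                      # new shot shows up as a dark blob
_, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    if cv2.contourArea(c) > 10:                             # ignore tiny noise specks
        m = cv2.moments(c)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print(f"shot detected at pixel ({cx:.0f}, {cy:.0f})")  # ~ (320, 250)
```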
Reconstituted Three-Dimensional Interactive Imaging
NASA Technical Reports Server (NTRS)
Hamilton, Joseph; Foley, Theodore; Duncavage, Thomas; Mayes, Terrence
2010-01-01
A method combines two-dimensional images, enhancing the images as well as rendering a 3D, enhanced, interactive computer image or visual model. Any advanced compiler can be used in conjunction with any graphics library package for this method, which is intended to take digitized images and virtually stack them so that they can be interactively viewed as a set of slices. This innovation can take multiple image sources (film or digital) and create a "transparent" image, with higher densities in the image being less transparent. The images are then stacked such that an apparent 3D object is created in virtual space for interactive review of the set of images. This innovation can be used with any application where 3D images are taken as slices of a larger object. These could include machines, materials for inspection, geological objects, or human scanning. Luminance values were stacked into planes with different transparency levels for different tissues. These transparency levels can use multiple energy levels, such as the density of CT scans or radioactive density. A desktop computer with enough video memory to produce the image is capable of this work. The memory required changes with the size and resolution of the desired images to be stacked and viewed.
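The core operation described above, stacking 2D slices into a virtual volume and mapping intensity to opacity, can be sketched in a few lines of NumPy. The synthetic slices and the gamma-style opacity mapping below are illustrative assumptions, not the NASA implementation.

```python
# Stack 2D slices into a 3D volume and derive per-voxel opacity from intensity,
# in the spirit of the transparency-stacking idea described above. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
slices = [rng.integers(0, 256, size=(128, 128), dtype=np.uint8) for _ in range(64)]

volume = np.stack(slices, axis=0).astype(np.float32) / 255.0   # (z, y, x), values in [0, 1]

# Denser (brighter) voxels are rendered less transparent; the exponent is an assumption.
opacity = volume ** 1.5
print(volume.shape, float(opacity.min()), float(opacity.max()))
```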
Bae, Daeryeong; Kim, Shino; Lee, Wonoh; Yi, Jin Woo; Um, Moon Kwang; Seong, Dong Gi
2018-05-21
A fast-cure carbon fiber/epoxy prepreg was thermoformed against a replicated automotive roof panel mold (square-cup) to investigate the effect of the stacking sequence of prepreg layers with unidirectional and plain-woven fabrics, and of mold geometry with different drawing angles and depths, on the fiber deformation and formability of the prepreg. The optimum forming condition was determined via analysis of the material properties of the epoxy resin. The non-linear mechanical properties of the prepreg in the deformation modes of inter- and intra-ply shear, tension and bending were measured and used as input data for the commercial virtual forming simulation software. The prepreg with a stacking sequence containing the plain-woven carbon prepreg on the outer layer of the laminate was successfully thermoformed against a mold with a depth of 20 mm and a tilting angle of 110°. Experimental results for the shear deformations at each corner of the thermoformed square-cup product were compared with the simulation, and a similarity in the overall tendency of the shear angle along the path at each corner was observed. The results are expected to contribute to the optimization of parameters for materials, mold design and processing in the thermoforming mass-production process for manufacturing high-quality automotive parts with a square-cup geometry.
2014-01-01
Background Logos are commonly used in molecular biology to provide a compact graphical representation of the conservation pattern of a set of sequences. They render the information contained in sequence alignments or profile hidden Markov models by drawing a stack of letters for each position, where the height of the stack corresponds to the conservation at that position, and the height of each letter within a stack depends on the frequency of that letter at that position. Results We present a new tool and web server, called Skylign, which provides a unified framework for creating logos for both sequence alignments and profile hidden Markov models. In addition to static image files, Skylign creates a novel interactive logo plot for inclusion in web pages. These interactive logos enable scrolling, zooming, and inspection of underlying values. Skylign can avoid sampling bias in sequence alignments by down-weighting redundant sequences and by combining observed counts with informed priors. It also simplifies the representation of gap parameters, and can optionally scale letter heights based on alternate calculations of the conservation of a position. Conclusion Skylign is available as a website, a scriptable web service with a RESTful interface, and as a software package for download. Skylign’s interactive logos are easily incorporated into a web page with just a few lines of HTML markup. Skylign may be found at http://skylign.org. PMID:24410852
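The letter-height rule described above (stack height reflects conservation, letter height reflects frequency) corresponds, in its classic form, to scaling each residue's frequency by the column's information content. A minimal sketch for a DNA column follows; Skylign's own calculation adds redundancy weighting, priors, and alternative conservation measures beyond this, and the column counts here are hypothetical.

```python
# Classic sequence-logo heights for one alignment column (DNA): the stack height
# is the column's information content and each letter's height is its frequency
# times that value. Skylign adds redundancy weighting and priors beyond this.
import math

counts = {"A": 7, "C": 1, "G": 1, "T": 1}           # hypothetical column counts
total = sum(counts.values())
freqs = {b: c / total for b, c in counts.items()}

entropy = -sum(p * math.log2(p) for p in freqs.values() if p > 0)
information = math.log2(4) - entropy                # bits conserved at this column

heights = {b: p * information for b, p in freqs.items()}
for base, h in sorted(heights.items(), key=lambda kv: -kv[1]):
    print(f"{base}: {h:.2f} bits")
```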
NASA Astrophysics Data System (ADS)
Chen, X.; Abercrombie, R. E.; Pennington, C.
2017-12-01
Recorded seismic waveforms include contributions from earthquake source properties and propagation effects, leading to long-standing trade-off problems between site/path effects and source effects. With near-field recordings, the path effect is relatively small, so the trade-off problem can be simplified to one between source and site effects (commonly referred to as the "kappa value"). This problem is especially significant for small earthquakes, where the corner frequencies fall within similar ranges as the kappa values, so direct spectrum fitting often leads to systematic biases tied to corner frequency and magnitude. In response to the significantly increased seismicity rate in Oklahoma, several local networks have been deployed following major earthquakes: the Prague, Pawnee and Fairview earthquakes. Each network provides dense observations within 20 km surrounding the fault zone, recording tens of thousands of aftershocks between M1 and M3. Using near-field recordings in the Prague area, we apply a stacking approach to separate path/site and source effects. The resulting source parameters are consistent with parameters derived from ground motion and spectral ratio methods from other studies; they exhibit spatial coherence within the fault zone for different fault patches. We apply these source parameter constraints in an analysis of kappa values for stations within 20 km of the fault zone. The resulting kappa values show significantly reduced variability compared to those from direct spectral fitting without constraints on the source spectrum; they are not biased by earthquake magnitudes. With these improvements, we plan to apply the stacking analysis to other local arrays to analyze source properties and site characteristics. For selected individual earthquakes, we will also use individual-pair empirical Green's function (EGF) analysis to validate the source parameter estimations.
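The intuition behind the stacking approach is that, in log space, each record is roughly the sum of an event term and a station (site/kappa) term, so averaging many events recorded at one station isolates a relative site term whose high-frequency slope yields kappa. A toy illustration follows; the spectra are synthetic stand-ins, the kappa value is hypothetical, and a real analysis requires corrections (geometrical spreading, instrument response, source spectra) that are omitted here.

```python
# Toy illustration of spectral stacking: in log space a record is roughly
# event term + site/kappa term, so averaging log spectra of many events at one
# station isolates a relative site term. Spectra below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(3)
freqs = np.linspace(1.0, 40.0, 200)
kappa_true = 0.04                                     # hypothetical site kappa [s]

# log10 spectra for many events at one station: event level + kappa decay + noise
event_terms = rng.normal(0.0, 0.5, size=(300, 1))
site_term = -np.pi * kappa_true * freqs / np.log(10.0)
log_spectra = event_terms + site_term + rng.normal(0.0, 0.1, size=(300, 200))

# Stack (average) over events, then remove the mean event level.
stacked = log_spectra.mean(axis=0) - event_terms.mean()

# Recover kappa from the high-frequency slope of the stacked log spectrum.
slope = np.polyfit(freqs[freqs > 10], stacked[freqs > 10], 1)[0]
kappa_est = -slope * np.log(10.0) / np.pi
print(f"estimated kappa ~ {kappa_est:.3f} s (true {kappa_true} s)")
```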
NASA Astrophysics Data System (ADS)
Zelt, C. A.
2017-12-01
Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name recognition when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals; the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releases, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far-reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, a well-documented open-source release should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site, ideally maintained by someone in a funded position. Perhaps the biggest challenge is the reality that researchers who use software, as opposed to those who develop it, are more attractive university hires because they are more likely to be "big picture" scientists who publish in the highest-profile journals, although sometimes the two go together.
Identifying impact of software dependencies on replicability of biomedical workflows.
Miksa, Tomasz; Rauber, Andreas; Mina, Eleni
2016-12-01
Complex data-driven experiments form the basis of biomedical research. Recent findings warn that the context in which the software is run, that is, the infrastructure and the third-party dependencies, can have a crucial impact on the final results delivered by a computational experiment. This implies that in order to replicate the same result, not only must the same data be used, but the experiment must also be run on an equivalent software stack. In this paper we present the VFramework, which enables assessing the replicability of workflows. It identifies whether any differences in software dependencies exist between two executions of the same workflow and whether they have an impact on the produced results. We also conduct a case study in which we investigate the impact of software dependencies on the replicability of Taverna workflows used in biomedical research on Huntington's disease. We re-execute the analysed workflows in environments differing in operating system distribution and configuration. The results show that the VFramework can be used to identify the impact of software dependencies on the replicability of biomedical workflows. Furthermore, we observe that despite the fact that the workflows are executed in a controlled environment, they still depend on specific tools installed in the environment. The context model used by the VFramework addresses the deficiencies of provenance traces and also documents such tools. Based on our findings we define guidelines for workflow owners that enable them to improve the replicability of their workflows. Copyright © 2016 Elsevier Inc. All rights reserved.
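The core comparison described above, identifying which software dependencies differ between two executions, can be illustrated with a simple diff of captured package lists. The package names and versions below are hypothetical, and the real framework additionally assesses whether a difference actually affects the produced results.

```python
# Compare the dependency sets captured for two executions of a workflow and
# report what changed. Package lists here are hypothetical illustrations.
run_a = {"numpy": "1.24.0", "scipy": "1.10.1", "taverna-engine": "2.5"}
run_b = {"numpy": "1.26.4", "scipy": "1.10.1", "pandas": "2.2.0"}

changed = {p: (run_a[p], run_b[p]) for p in run_a.keys() & run_b.keys()
           if run_a[p] != run_b[p]}
only_a = run_a.keys() - run_b.keys()
only_b = run_b.keys() - run_a.keys()

print("version changes:", changed)        # {'numpy': ('1.24.0', '1.26.4')}
print("removed:", sorted(only_a), "added:", sorted(only_b))
```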
Analysis of e-beam impact on the resist stack in e-beam lithography process
NASA Astrophysics Data System (ADS)
Indykeiwicz, K.; Paszkiewicz, B.
2013-07-01
This paper presents research on the fabrication of sub-micron gates for AlGaN/GaN HEMT-type transistors by e-beam lithography and the lift-off technique. The impact of the electron beam on the resist layers and the substrate was analyzed by the Monte Carlo (MC) method in the Casino v3.2 software. The influence of technological process parameters on the resolution and quality of metal structures was studied for paths 100 nm, 300 nm and 500 nm wide and 20 μm long. Qualitative correspondence between the simulations and the conducted experiments was obtained.