OPM: The Open Porous Media Initiative
NASA Astrophysics Data System (ADS)
Flemisch, B.; Flornes, K. M.; Lie, K.; Rasmussen, A.
2011-12-01
The principal objective of the Open Porous Media (OPM) initiative is to develop a simulation suite capable of modeling industrially and scientifically relevant flow and transport processes in porous media and of bridging the gap between the different application areas of porous media modeling, including reservoir mechanics, CO2 sequestration, biological systems, and product development of engineered media. The OPM initiative will provide long-lasting, efficient, and well-maintained open-source software for flow and transport in porous media built on modern software principles. The suite is released under the GNU General Public License (GPL). Our motivation is to provide a means to unite industry and public research on simulation of flow and transport in porous media. For academic users, we seek to provide a software infrastructure that facilitates testing of new ideas on models of industry-standard complexity, while at the same time giving the researcher control over discretization and solvers. Similarly, we aim to accelerate the technology transfer from academic institutions to professional companies by making new research results available as free software of professional standard. The OPM initiative is currently supported by six research groups in Norway and Germany and funded by existing grants from public research agencies as well as from Statoil Petroleum and Total E&P Norge. However, a full-scale development of the OPM initiative requires substantially more funding and the involvement of more research groups and potential end users. In this talk, we will provide an overview of the current activities in the OPM initiative. Special emphasis will be given to demonstrating the synergies achieved by combining the strengths of individual open-source software components. In particular, a new fully implicit solver developed within the DUNE-based simulator DuMux could be enhanced by the ability to read industry-standard Eclipse input files and to run on grids given in corner-point format. Examples taken from the SPE comparative solution projects and CO2 sequestration benchmarks illustrate the current capabilities of the simulation suite.
NASA Technical Reports Server (NTRS)
Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim
2012-01-01
Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Integrated Planning System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.
Subsurface Transport Over Multiple Phases Demonstration Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-01-05
The STOMP simulator is a suite of numerical simulators developed by Pacific Northwest National Laboratory for addressing problems involving coupled multifluid hydrologic, thermal, geochemical, and geomechanical processes in the subsurface. The simulator has been applied to problems concerning environmental remediation, environmental stewardship, carbon sequestration, conventional petroleum production, and the production of unconventional hydrocarbon fuels. The simulator is copyrighted by Battelle Memorial Institute, and is available outside of PNNL via use agreements. To promote the open exchange of scientific ideas, the simulator is provided as source code. A demonstration version of the simulator has been developed, which will provide potential new users with an executable (not source code) implementation of the software royalty free. Demonstration versions will be offered via the STOMP website for all currently available operational modes of the simulator. The demonstration versions of the simulator will be configured with the direct banded linear system solver and have a limit of 1,000 active grid cells. This will provide potential new users with an opportunity to apply the code to simple problems, including many of the STOMP short course problems, without having to pay a license fee. Users will be required to register on the STOMP website prior to receiving an executable.
A Full-Featured User Friendly CO 2-EOR and Sequestration Planning Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savage, Bill
This project addressed the development of an integrated software solution that includes a graphical user interface, numerical simulation, visualization tools, and optimization processes for reservoir simulation modeling of CO2-EOR. The objective was to assist the industry in the development of domestic energy resources by expanding the application of CO2-EOR technologies, and ultimately to maximize the CO2 sequestration capacity of the U.S. The software resulted in a field-ready application for the industry to address current CO2-EOR technologies. The software has been made available to the public without restrictions and with user friendly operating documentation and tutorials. The software (executable only) can be downloaded from NITEC's website at www.nitecllc.com. This integrated solution enables the design, optimization, and operation of CO2-EOR processes for small and mid-sized operators, who currently cannot afford the expensive, time-intensive solutions that the major oil companies enjoy. Based on one estimate, small oil fields comprise 30% of the total economic resource potential for the application of CO2-EOR processes in the U.S. This corresponds to 21.7 billion barrels of incremental, technically recoverable oil using current "best practices", and 31.9 billion barrels using "next-generation" CO2-EOR techniques. The project included a case study of a prospective CO2-EOR candidate field in Wyoming by a small independent, Linc Energy Petroleum Wyoming, Inc. NITEC LLC has an established track record of developing innovative and user friendly software. The Principal Investigator is an experienced manager and engineer with expertise in software development, numerical techniques, and GUI applications. Unique, presently-proprietary NITEC technologies have been integrated into this application to further its ease of use and technical functionality.
Switchgrass cultivars alter microbial contribution to deep soil C
USDA-ARS?s Scientific Manuscript database
Switchgrass (Panicum virgatum L.) is a perennial, cellulosic biofuel feedstock capable of growing under a wide variety of climatic conditions on land marginally suited to cultivated crops. Due to its perennial nature and deep rooting characteristics, switchgrass contributes to soil C sequestration ...
NASA Astrophysics Data System (ADS)
Pop, P. P.; Pop-Vadean, A.; Barz, C.; Latinovic, T.
2017-01-01
In this article we present a transdisciplinary approach to carbon sequestration in agricultural soils. The software implements a proposed method to estimate the amount of carbon that can be captured from different soil types and different crops. The application integrates an intuitive interface, is portable, and calculates the number of green certificates awarded to farmers as financial support for environmental protection. We plan to initiate a scientific approach to environmental protection through financial incentives for agriculture that fits EU rules by taxing big polluters and rewarding those who maintain an environment suitable for the development of ecological and competitive agriculture.
i-Tree: Tools to assess and manage structure, function, and value of community forests
NASA Astrophysics Data System (ADS)
Hirabayashi, S.; Nowak, D.; Endreny, T. A.; Kroll, C.; Maco, S.
2011-12-01
Trees in urban communities can mitigate many adverse effects associated with anthropogenic activities and climate change (e.g. urban heat island, greenhouse gas, air pollution, and floods). To protect environmental and human health, managers need to make informed decisions regarding urban forest management practices. Here we present the i-Tree suite of software tools (www.itreetools.org) developed by the USDA Forest Service and their cooperators. This software suite can help urban forest managers assess and manage the structure, function, and value of urban tree populations regardless of community size or technical capacity. i-Tree is a state-of-the-art, peer-reviewed Windows GUI- or Web-based software that is freely available, supported, and continuously refined by the USDA Forest Service and their cooperators. Two major features of i-Tree are 1) to analyze current canopy structures and identify potential planting spots, and 2) to estimate the environmental benefits provided by the trees, such as carbon storage and sequestration, energy conservation, air pollution removal, and storm water reduction. To cover diverse forest topologies, various tools were developed within the i-Tree suite: i-Tree Design for points (individual trees), i-Tree Streets for lines (street trees), and i-Tree Eco, Vue, and Canopy (in order of complexity) for areas (community trees). Once the forest structure is identified with these tools, ecosystem services provided by trees can be estimated with common models and protocols, and reports in the form of text, charts, and figures are then created for users. Since i-Tree was developed with a client/server architecture, nationwide US data such as location-related parameters, weather, streamflow, and air pollution data are stored on the server and retrieved to a user's computer at run-time. Freely available remotely sensed images (e.g. NLCD and Google Maps) are also employed to estimate tree canopy characteristics. As the demand for i-Tree grows internationally, environmental databases from more countries will be coupled with the software suite. Two more i-Tree applications, i-Tree Forecast and i-Tree Landscape, are now under development. i-Tree Forecast simulates canopy structures for up to 100 years based on planting and mortality rates and adds capabilities for other i-Tree applications to estimate the benefits of future canopy scenarios. While most i-Tree applications employ a spatially lumped approach, i-Tree Landscape employs a spatially distributed approach that allows users to map changes in canopy cover and ecosystem services through time and space. These new i-Tree tools provide an advanced platform for urban managers to assess the impact of current and future urban forests. i-Tree allows managers to promote effective urban forest management and sound arboricultural practices by providing information for advocacy and planning, baseline data for making informed decisions, and standardization for comparisons with other communities.
NASA Technical Reports Server (NTRS)
Brown, Charles; Andrew, Robert; Roe, Scott; Frye, Ronald; Harvey, Michael; Vu, Tuan; Balachandran, Krishnaiyer; Bly, Ben
2012-01-01
The Ascent/Descent Software Suite has been used to support a variety of NASA Shuttle Program mission planning and analysis activities, such as range safety, on the Integrated Planning System (IPS) platform. The Ascent/Descent Software Suite, containing Ascent Flight Design (ASC)/Descent Flight Design (DESC) configuration items (CIs), lifecycle documents, and data files used for shuttle ascent and entry modeling analysis and mission design, resides on IPS/Linux workstations. A list of tools in the Ascent/Descent Software Suite represents tool versions established during or after the IPS Equipment Rehost-3 project.
Navigation/Prop Software Suite
NASA Technical Reports Server (NTRS)
Bruchmiller, Tomas; Tran, Sanh; Lee, Mathew; Bucker, Scott; Bupane, Catherine; Bennett, Charles; Cantu, Sergio; Kwong, Ping; Propst, Carolyn
2012-01-01
Navigation (Nav)/Prop software is used to support shuttle mission analysis, production, and some operations tasks. The Nav/Prop suite, containing configuration items (CIs), resides on IPS/Linux workstations. It includes lifecycle documents and data files used for shuttle navigation and propellant analysis for all flight segments. This suite also includes trajectory server, archive server, and RAT software residing on MCC/Linux workstations. Navigation/Prop represents tool versions established during or after IPS Equipment Rehost-3 or after the MCC Rehost.
Center for Efficient Exascale Discretizations Software Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolev, Tzanio; Dobrev, Veselin; Tomov, Vladimir
The CEED software suite is a collection of generally applicable software tools focusing on the following computational motifs: PDE discretizations on unstructured meshes, high-order finite element and spectral element methods, and unstructured adaptive mesh refinement. All of this software is being developed as part of CEED, a co-design Center for Efficient Exascale Discretizations, within DOE's Exascale Computing Project (ECP).
NASA Astrophysics Data System (ADS)
Burba, George; Madsen, Rod; Feese, Kristin
2013-04-01
The Eddy Covariance method is a micrometeorological technique for direct high-speed measurements of the transport of gases, heat, and momentum between the earth's surface and the atmosphere. Gas fluxes, emission and exchange rates are carefully characterized from single-point in-situ measurements using permanent or mobile towers, or moving platforms such as automobiles, helicopters, airplanes, etc. Since the early 1990s, this technique has been widely used by micrometeorologists across the globe for quantifying CO2 emission rates from various natural, urban and agricultural ecosystems [1,2], including areas of agricultural carbon sequestration. Presently, over 600 eddy covariance stations are in operation in over 120 countries. In the last 3-5 years, advancements in instrumentation and software have reached the point where they can be effectively used outside the area of micrometeorology, and can prove valuable for geological carbon capture and sequestration, landfill emission measurements, high-precision agriculture and other non-micrometeorological industrial and regulatory applications. In the field of geological carbon capture and sequestration, the magnitude of CO2 seepage fluxes depends on a variety of factors. Emerging projects utilize eddy covariance measurements to monitor large areas where CO2 may escape from the subsurface, to detect and quantify CO2 leakage, and to assure the efficiency of CO2 geological storage [3,4,5,6,7,8]. Although Eddy Covariance is one of the most direct and defensible ways to measure and calculate turbulent fluxes, the method is mathematically complex, and requires careful setup, execution and data processing tailored to a specific site and project. With this in mind, step-by-step instructions were created to introduce a novice to the conventional Eddy Covariance technique [9], and to assist in further understanding the method through more advanced references such as graduate-level textbooks, flux network guidelines, journals and technical papers. A free open-source software package with a user-friendly interface was developed accordingly for computing final fully corrected CO2 emission numbers [10]. The presentation covers highlights of the eddy covariance method, its application to geological carbon sequestration, key requirements, instrumentation and software, and reviews educational resources particularly useful for carbon sequestration research. References: [1] Aubinet, M., T. Vesala, and D. Papale (Eds.), 2012. Eddy Covariance: A Practical Guide to Measurement and Data Analysis. Springer-Verlag, 442 pp. [2] Foken, T., 2008. Micrometeorology. Springer-Verlag, 308 pp. [4] Finley, R., 2009. An Assessment of Geological Carbon Sequestration in the Illinois Basin: Overview of the Decatur-Illinois Basin Site. MGSC, http://www.istc.illinois.edu/info/govs_awards_docs/2009-GSA-1100-Finley.pdf [5] Liu, G. (Ed.), 2012. Greenhouse Gases: Capturing, Utilization and Reduction. Intech, 338 pp. [6] LI-COR Biosciences, 2011. Surface Monitoring for Geologic Carbon Sequestration Monitoring: Methods, Instrumentation, and Case Studies. LI-COR Biosciences, Pub. 980-11916, 15 pp. [7] Benson, S., 2006. Monitoring carbon dioxide sequestration in deep geological formations for inventory verification and carbon credits. SPE-102833, presentation. [8] Lewicki, J., G. Hilley, M. Fischer, L. Pan, C. Oldenburg, C. Dobeck, and L. Spangler, 2009. Eddy covariance observations of leakage during shallow subsurface CO2 releases. Journal of Geophysical Research, 114: D12302. [9] Burba, G., 2013. Eddy Covariance Method for Scientific, Industrial, Agricultural and Regulatory Applications. LI-COR Biosciences, 328 pp. [10] LI-COR Biosciences, 2012. EddyPro 4.0: Help and User's Guide. Lincoln, NE, 208 pp.
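At its core, the method computes the flux as the time-averaged covariance of fluctuations in vertical wind speed and gas concentration. A minimal illustrative sketch in Python on synthetic data; real processing chains such as EddyPro add despiking, coordinate rotation, detrending, spectral and WPL density corrections, all omitted here:

```python
import numpy as np

def eddy_covariance_flux(w, c):
    """Raw kinematic flux: time-averaged covariance of fluctuations in
    vertical wind speed w (m/s) and scalar concentration c (mmol/m^3).
    Corrections applied by full processing chains are omitted here."""
    w_prime = w - w.mean()             # fluctuations about the period mean
    c_prime = c - c.mean()
    return np.mean(w_prime * c_prime)  # flux in mmol m^-2 s^-1

# Synthetic 30-minute averaging period sampled at 10 Hz (hypothetical data)
rng = np.random.default_rng(0)
n = 30 * 60 * 10
w = 0.3 * rng.standard_normal(n)                    # vertical wind (m/s)
c = 16.0 + 0.5 * rng.standard_normal(n) + 0.1 * w   # correlated scalar
print(f"F = {eddy_covariance_flux(w, c):.4f} mmol m^-2 s^-1")
```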
Arjan de Bruijn; Eric J. Gustafson; Daniel M. Kashian; Harmony J. Dalgleish; Brian R. Sturtevant; Douglass F. Jacobs
2014-01-01
Observations of the rapid growth and slow decomposition of American chestnut (Castanea dentata (Marsh.) Borkh.) suggest that its reintroduction could enhance terrestrial carbon (C) sequestration. A suite of decomposition models was fit with decomposition data from coarse woody debris (CWD) sampled in Wisconsin and Virginia, U.S. The optimal (two-...
Designing Test Suites for Software Interactions Testing
2004-01-01
the annual cost of insufficient software testing methods and tools in the United States is between 22.2 and 59.5 billion US dollars [13, 14]. This study...10 (2004), 1–29. [21] Cheng, C., Dumitrescu, A., and Schroeder, P. Generating small combinatorial test suites to cover input-output relationships... Proceedings of the Conference on the Future of Software Engineering (May 2000), pp. 61–72. [51] Hartman, A. Software and hardware testing using
Telescience Resource Kit (TReK)
NASA Technical Reports Server (NTRS)
Lippincott, Jeff
2015-01-01
Telescience Resource Kit (TReK) is one of the Huntsville Operations Support Center (HOSC) remote operations solutions. It can be used to monitor and control International Space Station (ISS) payloads from anywhere in the world. It comprises a suite of software applications and libraries that provide generic data system capabilities and access to HOSC services. The TReK software has been operational since 2000. A new cross-platform version of TReK is under development. The new software is being released in phases during the 2014-2016 timeframe. The TReK Release 3.x series of software is the original TReK software that has been operational since 2000. This software runs on Windows. It contains capabilities to support traditional telemetry and commanding using CCSDS (Consultative Committee for Space Data Systems) packets. The TReK Release 4.x series of software is the new cross-platform software. It runs on Windows and Linux. The new TReK software will support communication using standard IP protocols as well as traditional telemetry and commanding. All the software listed above is compatible and can be installed and run together on Windows. The new TReK software contains a suite of software that can be used by payload developers on the ground and onboard (TReK Toolkit). TReK Toolkit is a suite of lightweight libraries and utility applications for use onboard and on the ground. TReK Desktop is the full suite of TReK software, most useful on the ground. When TReK Desktop is released, the TReK installation program will provide the option to choose just the TReK Toolkit portion of the software or the full TReK Desktop suite. The ISS program is providing the TReK Toolkit software as a generic flight software capability offered as a standard service to payloads. TReK software verification was conducted during the April/May 2015 timeframe. Payload teams using the TReK software onboard can reference the TReK software verification. TReK will be demonstrated on-orbit running on an ISS-provided T61p laptop. Target timeframe: September 2015 to 2016. The on-orbit demonstration will collect benchmark metrics, and will be used in the future to provide live demonstrations during ISS Payload Conferences. Benchmark metrics and demonstrations will address the protocols described in SSP 52050-0047 Ku Forward section 3.3.7. (Associated term: CCSDS File Delivery Protocol (CFDP).)
NASA Technical Reports Server (NTRS)
Fitz, Rhonda; Whitman, Gerek
2016-01-01
Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IV&V) Program, with Software Assurance Research Program support, extracted FM architectures across the IV&V portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IV&V projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. Additionally, the role FM has with regard to strengthened security requirements, with potential to advance overall asset protection of flight software systems, is being addressed with the development of an adverse conditions database encompassing flight software vulnerabilities. Capitalizing on the established framework, this TR suite provides assurance capability for a variety of FM architectures and varied development approaches. Research results are being disseminated across NASA, other agencies, and the software community. This paper discusses the findings and TR suite informing the FM domain in best practices for FM architectural design, visibility observations, and methods employed for IV&V and mission assurance.
PrimerSuite: A High-Throughput Web-Based Primer Design Program for Multiplex Bisulfite PCR.
Lu, Jennifer; Johnston, Andrew; Berichon, Philippe; Ru, Ke-Lin; Korbie, Darren; Trau, Matt
2017-01-24
The analysis of DNA methylation at CpG dinucleotides has become a major research focus due to its regulatory role in numerous biological processes, but the requisite need for assays which amplify bisulfite-converted DNA represents a major bottleneck due to the unique design constraints imposed on bisulfite-PCR primers. Moreover, a review of the literature indicated no available software solution which accommodated high-throughput primer design, support for multiplex amplification assays, and primer-dimer prediction. In response, the tri-modular software package PrimerSuite was developed to support bisulfite multiplex PCR applications. This software was constructed to (i) design bisulfite primers against multiple regions simultaneously (PrimerSuite), (ii) screen for primer-primer dimerizing artefacts (PrimerDimer), and (iii) support multiplex PCR assays (PrimerPlex). Moreover, a major focus in the development of this software package was the emphasis on extensive empirical validation, and over 1300 unique primer pairs have been successfully designed and screened, with over 94% of them producing amplicons of the expected size, and an average mapping efficiency of 93% when screened using bisulfite multiplex resequencing. The potential use of the software in other bisulfite-based applications such as methylation-specific PCR is under consideration for future updates. This resource is freely available for use at the PrimerSuite website (www.primer-suite.com).
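The abstract does not describe PrimerDimer's scoring scheme; as a rough illustration of the kind of check involved, a common heuristic flags primer pairs whose 3' ends are mutually complementary over several consecutive bases. A minimal sketch with hypothetical window and threshold values, not PrimerSuite's published algorithm:

```python
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def three_prime_dimer(p1, p2, window=5, min_match=4):
    """Flag a potential dimer when the 3' tails of p1 and p2, aligned
    antiparallel, base-pair over at least min_match consecutive positions.
    Window and threshold are hypothetical, not PrimerSuite's criteria."""
    tail1 = p1[-window:]
    tail2 = p2[-window:][::-1]   # reverse p2's tail for antiparallel pairing
    run = best = 0
    for a, b in zip(tail1, tail2):
        run = run + 1 if COMPLEMENT.get(a) == b else 0
        best = max(best, run)
    return best >= min_match

# p2 ends in CCCTA, the reverse complement of p1's TAGGG tail -> flagged
print(three_prime_dimer("AGGTTAGGGTTTAGGG", "GGGATCCCTA"))  # True
```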
NASA Technical Reports Server (NTRS)
Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven
2013-01-01
This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
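A minimal sketch of the scan-and-execute pattern described above; the marker file name and one-command-per-line format are illustrative stand-ins for dtest's actual configuration conventions:

```python
import os
import subprocess

def run_tests(root, config_name="TESTS.cfg"):
    """Walk root, find directories holding a config file, and run each
    command listed in it (one per line). Returns the number of failures.
    The config name and format are stand-ins for dtest's conventions."""
    failures = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        if config_name not in filenames:
            continue
        with open(os.path.join(dirpath, config_name)) as fh:
            commands = [ln.strip() for ln in fh
                        if ln.strip() and not ln.startswith("#")]
        for cmd in commands:
            result = subprocess.run(cmd, shell=True, cwd=dirpath)
            if result.returncode != 0:
                failures += 1
                print(f"FAIL {dirpath}: {cmd}")
    return failures

if __name__ == "__main__":
    raise SystemExit(run_tests("."))
```

Distributing the per-directory test runs over CPU cores, as dtest supports, would be a natural extension via multiprocessing.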
Software Suite to Support In-Flight Characterization of Remote Sensing Systems
NASA Technical Reports Server (NTRS)
Stanley, Thomas; Holekamp, Kara; Gasser, Gerald; Tabor, Wes; Vaughan, Ronald; Ryan, Robert; Pagnutti, Mary; Blonski, Slawomir; Kenton, Ross
2014-01-01
A characterization software suite was developed to facilitate NASA's in-flight characterization of commercial remote sensing systems. Characterization of aerial and satellite systems requires knowledge of ground characteristics, or ground truth. This information is typically obtained with instruments taking measurements prior to or during a remote sensing system overpass. Acquired ground-truth data, which can consist of hundreds of measurements with different data formats, must be processed before it can be used in the characterization. Accurate in-flight characterization of remote sensing systems relies on multiple field data acquisitions that are efficiently processed, with minimal error. To address the need for timely, reproducible ground-truth data, a characterization software suite was developed to automate the data processing methods. The characterization software suite is engineering code, requiring some prior knowledge and expertise to run. The suite consists of component scripts for each of the three main in-flight characterization types: radiometric, geometric, and spatial. The component scripts for the radiometric characterization operate primarily by reading the raw data acquired by the field instruments, combining it with other applicable information, and then reducing it to a format that is appropriate for input into MODTRAN (MODerate resolution atmospheric TRANsmission), an Air Force Research Laboratory-developed radiative transport code used to predict at-sensor measurements. The geometric scripts operate by comparing identified target locations from the remote sensing image to known target locations, producing circular error statistics defined by the Federal Geographic Data Committee Standards. The spatial scripts analyze a target edge within the image, and produce estimates of Relative Edge Response and the value of the Modulation Transfer Function at the Nyquist frequency. The software suite enables rapid, efficient, automated processing of ground truth data, which has been used to provide reproducible characterizations on a number of commercial remote sensing systems. Overall, this characterization software suite improves the reliability of ground-truth data processing techniques that are required for remote sensing system in-flight characterizations.
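As one concrete example of the geometric component, circular error statistics summarize the radial offsets between image-derived and surveyed target locations. A minimal sketch computing radial RMSE plus a 90th-percentile circular error; the FGDC/NSSDA standard prescribes specific confidence-level multipliers that are not reproduced here:

```python
import numpy as np

def circular_error_stats(dx, dy):
    """dx, dy: offsets (image-derived minus surveyed positions, metres).
    Returns radial RMSE and CE90 (90th-percentile radial error).
    Illustrative only; the FGDC standard's confidence-level multipliers
    are not reproduced."""
    r = np.hypot(dx, dy)                  # radial error per check point
    rmse_r = np.sqrt(np.mean(r ** 2))
    ce90 = np.percentile(r, 90.0)
    return rmse_r, ce90

# Hypothetical check-point offsets in metres
dx = np.array([0.4, -0.2, 0.1, 0.3, -0.5])
dy = np.array([-0.1, 0.2, 0.4, -0.3, 0.2])
rmse, ce90 = circular_error_stats(dx, dy)
print(f"RMSE_r = {rmse:.2f} m, CE90 = {ce90:.2f} m")
```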
Santín, Cristina; Doerr, Stefan H; Merino, Agustin; Bucheli, Thomas D; Bryant, Rob; Ascough, Philippa; Gao, Xiaodong; Masiello, Caroline A
2017-09-11
Pyrogenic carbon (PyC), produced naturally (wildfire charcoal) and anthropogenically (biochar), is extensively studied due to its importance in several disciplines, including global climate dynamics, agronomy and paleosciences. Charcoal and biochar are commonly used as analogues for each other to infer respective carbon sequestration potentials, production conditions, and environmental roles and fates. The direct comparability of corresponding natural and anthropogenic PyC, however, has never been tested. Here we compared key physicochemical properties (elemental composition, δ13C and PAH signatures, chemical recalcitrance, density and porosity) and carbon sequestration potentials of PyC materials formed from two identical feedstocks (pine forest floor and wood) under wildfire charring and slow-pyrolysis conditions. Wildfire charcoals were formed under higher maximum temperatures and oxygen availabilities, but much shorter heating durations than slow-pyrolysis biochars, resulting in differing physicochemical properties. These differences are particularly relevant regarding their respective roles as carbon sinks, as even the wildfire charcoals formed at the highest temperatures had lower carbon sequestration potentials than most slow-pyrolysis biochars. Our results challenge the common notion that natural charcoal and biochar are well suited as proxies for each other, and suggest that biochar's environmental residence time may be underestimated when based on natural charcoal as a proxy, and vice versa.
Performance testing of 3D point cloud software
NASA Astrophysics Data System (ADS)
Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.
2013-10-01
LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available on the market, including open-source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD Civil 3D and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in point-cloud loading time and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
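A sketch of how such metrics might be sampled around a point-cloud load using the cross-platform psutil package (on Windows, psutil reports the working set as rss and the commit charge as vms); the paper's actual instrumentation is not specified, and the file name below is hypothetical:

```python
import time
import numpy as np
import psutil  # cross-platform process metrics (pip install psutil)

def measure_load(load_fn, *args):
    """Time a point-cloud load and report process metrics afterwards.
    On Windows, psutil maps rss to the working set and vms to the
    commit (pagefile) charge, the two memory figures compared above."""
    proc = psutil.Process()
    proc.cpu_percent(interval=None)              # prime the CPU counter
    t0 = time.perf_counter()
    data = load_fn(*args)
    elapsed = time.perf_counter() - t0
    mem = proc.memory_info()
    print(f"load time   : {elapsed:.2f} s")
    print(f"CPU usage   : {proc.cpu_percent(interval=None):.1f} % since load began")
    print(f"working set : {mem.rss / 2**20:.1f} MiB")
    print(f"commit size : {mem.vms / 2**20:.1f} MiB")
    return data

# Stand-in loader: parse a whitespace-delimited XYZ file (hypothetical name)
points = measure_load(np.loadtxt, "cloud.xyz")
```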
Web-based Tool Suite for Plasmasphere Information Discovery
NASA Astrophysics Data System (ADS)
Newman, T. S.; Wang, C.; Gallagher, D. L.
2005-12-01
A suite of tools that enable discovery of terrestrial plasmasphere characteristics from NASA IMAGE Extreme Ultraviolet (EUV) images is described. The tool suite is web-accessible, allowing easy remote access without the need for any software installation on the user's computer. The features supported by the tool include reconstruction of the plasmasphere plasma density distribution from a short sequence of EUV images, semi-automated selection of the plasmapause boundary in an EUV image, and mapping of the selected boundary to the geomagnetic equatorial plane. EUV image upload and result download are also supported. The tool suite's plasmapause mapping feature is achieved via the Roelof and Skinner (2000) Edge Algorithm. The plasma density reconstruction is achieved through a tomographic technique that exploits physical constraints to allow for a moderate-resolution result. The tool suite's software architecture uses Java Server Pages (JSP) and Java Applets on the front side for user-software interaction and Java Servlets on the server side for task execution. The compute-intensive components of the tool suite are implemented in C++ and invoked by the server via the Java Native Interface (JNI).
Development of a Carbon Sequestration Visualization Tool using Google Earth Pro
NASA Astrophysics Data System (ADS)
Keating, G. N.; Greene, M. K.
2008-12-01
The Big Sky Carbon Sequestration Partnership seeks to prepare organizations throughout the western United States for a possible carbon-constrained economy. Through the development of CO2 capture and subsurface sequestration technology, the Partnership is working to enable the region to cleanly utilize its abundant fossil energy resources. The intent of the Los Alamos National Laboratory Big Sky Visualization tool is to allow geochemists, geologists, geophysicists, project managers, and other project members to view, identify, and query the data collected from CO2 injection tests using a single data source platform, a mission to which Google Earth Pro is uniquely and ideally suited. The visualization framework enables fusion of data from disparate sources and allows investigators to fully explore spatial and temporal trends in CO2 fate and transport within a reservoir. 3-D subsurface wells are projected above ground in Google Earth as the KML anchor points for the presentation of various surface and subsurface data. This solution is the most integrative and cost-effective possible for the variety of users in the Big Sky community.
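The well-anchor pattern described above reduces to emitting KML placemarks that Google Earth can display. A minimal sketch with hypothetical well names and coordinates:

```python
# Minimal KML writer for well anchor points (names/coordinates hypothetical).
# KML uses lon,lat,alt order; altitude here is metres above ground level.
wells = [
    ("Injection-1", -110.521, 45.013, 50.0),
    ("Monitor-2",   -110.518, 45.015, 50.0),
]

placemarks = "\n".join(
    f"  <Placemark><name>{name}</name>"
    f"<Point><altitudeMode>relativeToGround</altitudeMode>"
    f"<coordinates>{lon},{lat},{alt}</coordinates></Point></Placemark>"
    for name, lon, lat, alt in wells
)

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
    f"{placemarks}\n</Document>\n</kml>"
)

with open("wells.kml", "w") as fh:   # open the result in Google Earth
    fh.write(kml)
```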
WinHPC System Software | High-Performance Computing | NREL
Learn about the software applications, tools, and toolchains available on the WinHPC system, including the Intel compilers development toolchain suite.
Global Combat Support System-Marine Corps Proof-of-Concept for Dashboard Analytics
2014-12-01
The core is modern, commercial-off-the-shelf enterprise resource planning (ERP) software (Oracle 11i e-Business Suite). GCSS-MC's design is focused...factor in the decision to implement this new software. GCSS-MC is the technology centerpiece of the Logistics Modernization (LogMod) Program...GCSS-MC is based on the implementation of Oracle e-Business Suite 11i as the core software package. This is the same infrastructure that Oracle
Advanced Spacesuit Informatics Software Design for Power, Avionics and Software Version 2.0
NASA Technical Reports Server (NTRS)
Wright, Theodore W.
2016-01-01
A description of the software design for the 2016 edition of the Informatics computer assembly of NASA's Advanced Extravehicular Mobility Unit (AEMU), also called the Advanced Spacesuit. The Informatics system is an optional part of the spacesuit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and warning information. It also provides an interface to the suit-mounted camera for recording still images, video, and audio field notes.
Simulating Humans as Integral Parts of Spacecraft Missions
NASA Technical Reports Server (NTRS)
Bruins, Anthony C.; Rice, Robert; Nguyen, Lac; Nguyen, Heidi; Saito, Tim; Russell, Elaine
2006-01-01
The Collaborative-Virtual Environment Simulation Tool (C-VEST) software was developed for use in a NASA project entitled "3-D Interactive Digital Virtual Human." The project is oriented toward the use of a comprehensive suite of advanced software tools in computational simulations for the purposes of human-centered design of spacecraft missions and of the spacecraft, space suits, and other equipment to be used on the missions. The C-VEST software affords an unprecedented suite of capabilities for three-dimensional virtual-environment simulations with plug-in interfaces for physiological data, haptic interfaces, plug-and-play software, real-time control, and/or playback control. Mathematical models of the mechanics of the human body and of the aforementioned equipment are implemented in software and integrated to simulate forces exerted on and by astronauts as they work. The computational results can then support the iterative processes of design, building, and testing in applied systems engineering and integration. The results of the simulations provide guidance for devising measures to counteract effects of microgravity on the human body and for the rapid development of virtual (that is, simulated) prototypes of advanced space suits, cockpits, and robots to enhance the productivity, comfort, and safety of astronauts. The unique ability to implement human-in-the-loop immersion also makes the C-VEST software potentially valuable for use in commercial and academic settings beyond the original space-mission setting.
NASA Astrophysics Data System (ADS)
Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.
2011-12-01
In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, with a focus on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas/approaches: 1) we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the required forward calculations while exploring the parameter space and quantifying the input uncertainty; 2) we use eSTOMP as the forward modeling simulator. eSTOMP is implemented using the Global Arrays toolkit (GA), which is based on one-sided inter-processor communication and supports a shared-memory programming style on distributed-memory platforms, providing highly scalable performance. It uses a data model to partition most of the large-scale data structures into a relatively small number of distinct classes. The lower-level simulator infrastructure (e.g., meshing support, associated data structures, and data mapping to processors) is separated from the higher-level physics and chemistry algorithmic routines using a grid component interface; and 3) in addition to the faster model and more efficient algorithms that speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate software packages and data for composing carbon sequestration simulation, computation, analysis, estimation and visualization. We will demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.
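As a simplified stand-in for SGSIM, the sketch below draws unconditional realizations of a stationary Gaussian random field by Cholesky factorization of an exponential covariance and maps them to log-normal permeability; SGSIM itself scales to far larger grids and honors conditioning data, and all parameter values here are illustrative:

```python
import numpy as np

def gaussian_field_realizations(n=32, corr_len=8.0, n_real=3, seed=0):
    """Sample realizations of a stationary Gaussian random field on an
    n x n grid via Cholesky factorization of an exponential covariance.
    A simple stand-in for sequential Gaussian simulation (SGSIM)."""
    xs, ys = np.meshgrid(np.arange(n), np.arange(n))
    pts = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    cov = np.exp(-d / corr_len) + 1e-8 * np.eye(n * n)  # jitter for stability
    L = np.linalg.cholesky(cov)
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n * n, n_real))
    return (L @ z).T.reshape(n_real, n, n)

# Log-normal permeability: illustrative mean log10(k) = -13 (m^2), sd = 0.5
fields = gaussian_field_realizations()
perm = 10.0 ** (-13.0 + 0.5 * fields)
print(perm.shape, perm.mean())
```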
Sekiguchi, Yuki; Yamamoto, Masaki; Oroguchi, Tomotaka; Takayama, Yuki; Suzuki, Shigeyuki; Nakasako, Masayoshi
2014-11-01
Using our custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors, cryogenic coherent X-ray diffraction imaging experiments have been undertaken at the SPring-8 Angstrom Compact free electron LAser (SACLA) facility. To efficiently perform experiments and data processing, two software suites with user-friendly graphical user interfaces have been developed. The first is a program suite named IDATEN, which was developed to easily conduct four procedures during experiments: aligning KOTOBUKI-1, loading a flash-cooled sample into the cryogenic goniometer stage inside the vacuum chamber of KOTOBUKI-1, adjusting the sample position with respect to the X-ray beam using a pair of telescopes, and collecting diffraction data by raster scanning the sample with X-ray pulses. Named G-SITENNO, the other suite is an automated version of the original SITENNO suite, which was designed for processing diffraction data. These user-friendly software suites are now indispensable for collecting a large number of diffraction patterns and for processing the diffraction patterns immediately after collecting data within a limited beam time.
Rocca-Serra, Philippe; Brandizi, Marco; Maguire, Eamonn; Sklyar, Nataliya; Taylor, Chris; Begley, Kimberly; Field, Dawn; Harris, Stephen; Hide, Winston; Hofmann, Oliver; Neumann, Steffen; Sterk, Peter; Tong, Weida; Sansone, Susanna-Assunta
2010-01-01
Summary: The first open source software suite for experimentalists and curators that (i) assists in the annotation and local management of experimental metadata from high-throughput studies employing one or a combination of omics and other technologies; (ii) empowers users to uptake community-defined checklists and ontologies; and (iii) facilitates submission to international public repositories. Availability and Implementation: Software, documentation, case studies and implementations at http://www.isa-tools.org Contact: isatools@googlegroups.com PMID:20679334
ERIC Educational Resources Information Center
Peterlin, Primoz
2010-01-01
Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…
TypingSuite: Integrated Software for Presenting Stimuli, and Collecting and Analyzing Typing Data
ERIC Educational Resources Information Center
Mazerolle, Erin L.; Marchand, Yannick
2015-01-01
Research into typing patterns has broad applications in both psycholinguistics and biometrics (i.e., improving security of computer access via each user's unique typing patterns). We present a new software package, TypingSuite, which can be used for presenting visual and auditory stimuli, collecting typing data, and summarizing and analyzing the…
MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics*
Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying
2016-01-01
Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644
Geochemical Monitoring Considerations for the FutureGen 2.0 Project
Amonette, James E.; Johnson, Timothy A.; Spencer, Clayton F.; ...
2014-12-31
Geochemical monitoring is an essential component of a suite of monitoring technologies designed to evaluate CO2 mass balance and detect possible loss of containment at the FutureGen 2.0 geologic sequestration site near Jacksonville, IL. This presentation gives an overview of the potential geochemical approaches and tracer technologies that were considered, and describes the evaluation process by which the most cost-effective and robust of these were selected for implementation.
adwTools Developed: New Bulk Alloy and Surface Analysis Software for the Alloy Design Workbench
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Morse, Jeffrey A.; Noebe, Ronald D.; Abel, Phillip B.
2004-01-01
A suite of atomistic modeling software, called the Alloy Design Workbench, has been developed by the Computational Materials Group at the NASA Glenn Research Center and the Ohio Aerospace Institute (OAI). The main goal of this software is to guide and augment experimental materials research and development efforts by creating powerful, yet intuitive, software that combines a graphical user interface with an operating code suitable for real-time atomistic simulations of multicomponent alloy systems. Targeted for experimentalists, the interface is straightforward and requires minimum knowledge of the underlying theory, allowing researchers to focus on the scientific aspects of the work. The centerpiece of the Alloy Design Workbench suite is the adwTools module, which concentrates on the atomistic analysis of surfaces and bulk alloys containing an arbitrary number of elements. An additional module, adwParams, handles ab initio input for the parameterization used in adwTools. Future modules planned for the suite include adwSeg, which will provide numerical predictions for segregation profiles to alloy surfaces and interfaces, and adwReport, which will serve as a window into the database, providing public access to the parameterization data and a repository where users can submit their own findings from the rest of the suite. The entire suite is designed to run on desktop-scale computers. The adwTools module incorporates a custom OAI/Glenn-developed Fortran code based on the BFS (Bozzolo-Ferrante-Smith) method for alloys (ref. 1). The heart of the suite, this code is used to calculate the energetics of different compositions and configurations of atoms.
Recent developments in the CCP-EM software suite.
Burnley, Tom; Palmer, Colin M; Winn, Martyn
2017-06-01
As part of its remit to provide computational support to the cryo-EM community, the Collaborative Computational Project for Electron cryo-Microscopy (CCP-EM) has produced a software framework which enables easy access to a range of programs and utilities. The resulting software suite incorporates contributions from different collaborators by encapsulating them in Python task wrappers, which are then made accessible via a user-friendly graphical user interface as well as a command-line interface suitable for scripting. The framework includes tools for project and data management. An overview of the design of the framework is given, together with a survey of the functionality at different levels. The current CCP-EM suite has particular strength in the building and refinement of atomic models into cryo-EM reconstructions, which is described in detail.
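The wrapper pattern the abstract describes can be illustrated generically: one task definition backs both a GUI form and a command-line invocation. The sketch below is illustrative only, not CCP-EM's actual wrapper API; the program name and parameters are invented:

```python
import subprocess

class TaskWrapper:
    """Generic sketch: one task definition drives both a GUI form and a
    command-line run. Names and fields are illustrative, not CCP-EM's API."""

    def __init__(self, program, **params):
        self.program = program     # external binary wrapped by this task
        self.params = params       # parameter name -> value

    def command(self):
        args = [self.program]
        for key, value in self.params.items():
            args += [f"--{key}", str(value)]
        return args

    def run(self):
        # A GUI can call this after filling self.params from its form;
        # a CLI script can construct the wrapper directly and call it.
        return subprocess.run(self.command(), check=True)

task = TaskWrapper("refine_model", map="emd_1234.mrc", model="start.pdb",
                   resolution=3.0)   # hypothetical program and inputs
print(" ".join(task.command()))      # inspect the command before running
```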
An Open-Source Standard T-Wave Alternans Detector for Benchmarking.
Khaustov, A; Nemati, S; Clifford, Gd
2008-09-14
We describe an open source algorithm suite for T-Wave Alternans (TWA) detection and quantification. The software consists of Matlab implementations of the widely used Spectral Method and Modified Moving Average method, with libraries to read both WFDB and ASCII data under Windows and Linux. The software suite can run in both batch mode and with a provided graphical user interface to aid waveform exploration. Our software suite was calibrated using an open source TWA model, described in a partner paper [1] by Clifford and Sameni. For the PhysioNet/CinC Challenge 2008 we obtained a score of 0.881 for the Spectral Method and 0.400 for the MMA method. However, our objective was not to provide the best TWA detector, but rather a basis for detailed discussion of algorithms.
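For orientation, the Spectral Method detects alternans as beat-to-beat periodicity at 0.5 cycles/beat. The sketch below implements the core of that idea on synthetic aligned ST-T segments; a real detector also subtracts a spectral noise band and computes a significance (k) score, which is simplified away here:

```python
import numpy as np

def spectral_twa(beats):
    """Spectral Method core: beats is an (n_beats, n_samples) array of
    time-aligned ST-T segments. For each sample offset, the beat-to-beat
    series is Fourier transformed; alternans shows up at 0.5 cycles/beat
    (the last rfft bin). Noise-band subtraction is omitted here."""
    n_beats = beats.shape[0]
    series = beats - beats.mean(axis=0)                 # remove mean beat
    power = np.abs(np.fft.rfft(series, axis=0)) ** 2 / n_beats**2
    return np.sqrt(power[-1].mean())                    # alternans voltage

# Synthetic test: 128 beats, 25 uV added to every other beat
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 100)
template = 100.0 * np.sin(np.pi * t)                    # stylized T wave (uV)
beats = template + 10.0 * rng.standard_normal((128, 100))
beats[::2] += 25.0                                      # ABAB alternation
print(f"TWA ~ {spectral_twa(beats):.1f} uV")            # ~12.5 uV amplitude
```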
Walker, Anthony P.; Zaehle, Sönke; Medlyn, Belinda E.; ...
2015-04-27
Large uncertainty exists in model projections of the land carbon (C) sink response to increasing atmospheric CO2. Free-Air CO2 Enrichment (FACE) experiments lasting a decade or more have investigated ecosystem responses to a step change in atmospheric CO2 concentration. To interpret FACE results in the context of gradual increases in atmospheric CO2 over decades to centuries, we used a suite of seven models to simulate the Duke and Oak Ridge FACE experiments extended for 300 years of CO2 enrichment. We also determine key modeling assumptions that drive divergent projections of terrestrial C uptake and evaluate whether these assumptions can be constrained by experimental evidence. All models simulated increased terrestrial C pools resulting from CO2 enrichment, though there was substantial variability in quasi-equilibrium C sequestration and rates of change. In two of two models that assume that plant nitrogen (N) uptake is solely a function of soil N supply, the net primary production response to elevated CO2 became progressively N limited. In four of five models that assume that N uptake is a function of both soil N supply and plant N demand, elevated CO2 led to reduced ecosystem N losses and thus progressively relaxed nitrogen limitation. Many allocation assumptions resulted in increased wood allocation relative to leaves and roots, which reduced the vegetation turnover rate and increased C sequestration. Additionally, self-thinning assumptions had a substantial impact on C sequestration in two models. As a result, accurate representation of N process dynamics (in particular N uptake), allocation, and forest self-thinning is key to minimizing uncertainty in projections of future C sequestration in response to elevated atmospheric CO2.
Summary of Documentation for DYNA3D-ParaDyn's Software Quality Assurance Regression Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zywicz, Edward
The Software Quality Assurance (SQA) regression test suite for DYNA3D (Zywicz and Lin, 2015) and ParaDyn (DeGroot, et al., 2015) currently contains approximately 600 problems divided into 21 suites, and is a required component of ParaDyn's SQA plan (Ferencz and Oliver, 2013). The regression suite allows developers to ensure that software modifications do not unintentionally alter the code response. The entire regression suite is run prior to permanently incorporating any software modification or addition. When code modifications alter test problem results, the specific cause must be determined and fully understood before the software changes and revised test answers can be incorporated. The regression suite is executed on LLNL platforms using a Python script and an associated data file. The user specifies the DYNA3D or ParaDyn executable, number of processors to use, test problems to run, and other options to the script. The data file details how each problem and its answer extraction scripts are executed. For each problem in the regression suite there exists an input deck, an eight-processor partition file, an answer file, and various extraction scripts. These scripts assemble a temporary answer file in a specific format from the simulation results. The temporary and stored answer files are compared to a specific level of numerical precision, and when differences are detected the test problem is flagged as failed. Presently, numerical results are stored and compared to 16 digits. At this accuracy level different processor types, compilers, number of partitions, etc. impact the results to various degrees. Thus, for consistency purposes the regression suite is run with ParaDyn using 8 processors on machines with a specific processor type (currently the Intel Xeon E5530 processor). For non-parallel regression problems, i.e., the two XFEM problems, DYNA3D is used instead. When environments or platforms change, executables using the current source code and the new resource are created and the regression suite is run. If differences in answers arise, the new answers are retained provided that the differences are inconsequential. This bootstrap approach allows the test suite answers to evolve in a controlled manner with a high level of confidence. Developers also run the entire regression suite with (serial) DYNA3D. While these results normally differ from the stored (parallel) answers, abnormal termination or wildly different values are strong indicators of potential issues.
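The answer-file check is essentially a tolerant numeric diff. A minimal sketch (whitespace-separated tokens are an assumed format; the real suite assembles its answer files with extraction scripts):

```python
def answers_match(file_a, file_b, digits=16):
    """Compare two answer files token by token, requiring numeric tokens
    to agree to `digits` significant figures. The whitespace-separated
    format is an assumption about the answer files, not their actual
    layout."""
    with open(file_a) as fa, open(file_b) as fb:
        tokens_a, tokens_b = fa.read().split(), fb.read().split()
    if len(tokens_a) != len(tokens_b):
        return False
    for a, b in zip(tokens_a, tokens_b):
        try:
            va, vb = float(a), float(b)
        except ValueError:
            if a != b:                 # non-numeric tokens must match exactly
                return False
            continue
        scale = max(abs(va), abs(vb), 1e-300)
        if abs(va - vb) / scale > 10.0 ** (1 - digits):
            return False
    return True
```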
NASA Astrophysics Data System (ADS)
Thomson, A. M.; Izaurralde, R. C.; Clarke, L. E.
2006-12-01
Assessing the contribution of terrestrial carbon sequestration to national and international climate change mitigation requires integration across scientific and disciplinary boundaries. In a study for the US Climate Change Technology Program, site-based measurements and geographic data were used to develop a three-pool, first-order kinetic model of global agricultural soil carbon (C) stock changes over 14 continental scale regions. This model was then used together with land use scenarios from the MiniCAM integrated assessment model in a global analysis of climate change mitigation options. MiniCAM evaluated mitigation strategies within a set of policy environments aimed at achieving atmospheric CO2 stabilization by 2100 under a suite of technology and development scenarios. Adoption of terrestrial sequestration practices is based on competition for land and economic markets for carbon. In the reference case with no climate policy, conversion of agricultural land from conventional cultivation to no tillage over the next century in the United States results in C sequestration of 7.6 to 59.8 Tg C yr-1, which doubles to 19.0 to 143.4 Tg C yr-1 under the most aggressive climate policy. Globally, with no carbon policy, agricultural C sequestration rates range from 75.2 to 18.2 Tg C yr-1 over the century, with the highest rates occurring in the first fifty years. Under the most aggressive global climate change policy, sequestration in agricultural soils reaches up to 190 Tg C yr-1 in the first 15 years. The contribution of agricultural soil C sequestration is a small fraction of the total global carbon offsets necessary to reach the stabilization targets (9 to 20 Gt C yr-1) by the end of the century. This integrated assessment provides decision makers with science-based estimates of the potential magnitude of terrestrial C sequestration relative to other greenhouse gas mitigation strategies in all sectors of the global economy. It also provides insight into the behavior of terrestrial C mitigation options in the presence and absence of climate change mitigation policies.
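A three-pool, first-order kinetic model is a small linear ODE system: each pool decays at its own rate constant, with a fraction of each pool's decay transferred to slower pools and the remainder respired. A minimal sketch with illustrative placeholder parameters, not the study's calibrated values:

```python
import numpy as np

def step_soil_carbon(stocks, inputs, k, a, dt):
    """One explicit-Euler step of a three-pool first-order model.
    k[i] is pool i's decay rate (1/yr); a[i, j] is the fraction of
    pool j's decay transferred to pool i (the rest is respired)."""
    decay = k * stocks
    return stocks + dt * (inputs - decay + a @ decay)

# Illustrative placeholder parameters (not the study's calibrated values)
k = np.array([2.0, 0.1, 0.002])            # fast, slow, passive (1/yr)
a = np.array([[0.0, 0.0, 0.0],
              [0.3, 0.0, 0.0],             # 30% of fast decay -> slow
              [0.0, 0.05, 0.0]])           # 5% of slow decay -> passive
stocks = np.array([1.0, 20.0, 40.0])       # Mg C/ha
inputs = np.array([2.5, 0.0, 0.0])         # litter enters the fast pool

dt = 0.1                                   # years; keeps Euler stable
for _ in range(int(100 / dt)):             # integrate 100 years
    stocks = step_soil_carbon(stocks, inputs, k, a, dt)
print(stocks)                              # pools approaching steady state
```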
The AST3 controlling and operating software suite for automatic sky survey
NASA Astrophysics Data System (ADS)
Hu, Yi; Shang, Zhaohui; Ma, Bin; Hu, Keliang
2016-07-01
We have developed a specialized software package, called ast3suite, to achieve remote control and automatic sky survey for AST3 (Antarctic Survey Telescope) from scratch. It includes several daemon servers and many basic commands. Each program does only one single task, and they work together to make AST3 a robotic telescope. A survey script calls the basic commands to carry out an automatic sky survey. Ast3suite was carefully tested in Mohe, China in 2013 and has been used at Dome A, Antarctica in 2015 and 2016 with the real hardware for practical sky surveys. Both the test results and practical use showed that ast3suite worked very well without any manual assistance, as expected.
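In this architecture a survey script is thin glue that invokes the suite's basic commands in sequence. A hypothetical sketch (the command names and arguments are invented for illustration and do not match ast3suite's actual commands):

```python
import shlex
import subprocess

DRY_RUN = True   # print commands instead of running the (hypothetical) binaries

def call(*cmd):
    """Invoke one basic command; stop the survey if a daemon reports an error."""
    print("->", " ".join(shlex.quote(c) for c in cmd))
    if not DRY_RUN:
        subprocess.run(cmd, check=True)

# Hypothetical nightly mini-survey; command names are invented for illustration.
fields = [("f001", "05:32:00", "-69:45:00"),
          ("f002", "06:10:00", "-70:10:00")]
for name, ra, dec in fields:
    call("pointing", ra, dec)                        # slew via telescope daemon
    call("focus", "--auto")                          # refine focus
    call("expose", "--time", "60", "--out", f"{name}.fits")
```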
NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities
NASA Technical Reports Server (NTRS)
Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.
2015-01-01
Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exist with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in the language is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW.
NASA Technical Reports Server (NTRS)
Fitz, Rhonda; Whitman, Gerek
2016-01-01
Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IVV) Program, with Software Assurance Research Program support, extracted FM architectures across the IVV portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IVV projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management.
NASA Astrophysics Data System (ADS)
Othman, Rozmie R.; Ahmad, Mohd Zamri Zahir; Ali, Mohd Shaiful Aziz Rashid; Zakaria, Hasneeza Liza; Rahman, Md. Mostafijur
2015-05-01
Consuming 40 to 50 percent of software development cost, software testing is one of the most resource-consuming activities in the software development lifecycle. To ensure an acceptable level of quality and reliability of a typical software product, it is desirable to test every possible combination of input data under various configurations. Due to the combinatorial explosion problem, exhaustive testing is practically impossible. Resource constraints, costing factors, and strict time-to-market deadlines are among the main factors that inhibit such consideration. Earlier work suggests that a sampling strategy based on t-way parameter interaction (t-way testing) can effectively reduce the number of test cases without affecting fault detection capability. However, for a very large system, even a t-way strategy will produce a large test suite that needs to be executed. In the end, only part of the planned test suite can be executed in order to meet the aforementioned constraints. Here, test engineers need to measure the effectiveness of a partially executed test suite in order to assess the risk they have to take. Motivated by this problem, this paper presents an effectiveness comparison of partially executed t-way test suites generated by existing strategies, using the tuples coverage method. With it, test engineers can predict the effectiveness of the testing process if only part of the original test cases is executed.
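The tuples coverage method can be made concrete with a small sketch: count how many of the possible t-way value combinations are exercised by the executed subset of tests. The parameter domains and the example suite below are invented for illustration.

```python
# Sketch of the tuples-coverage measure described above: the fraction of
# all t-way parameter-value combinations exercised by the executed subset
# of a test suite. Parameter domains and the example suite are made up.
from itertools import combinations, product

def tway_coverage(executed_tests, domains, t=2):
    """Fraction of t-way value tuples covered by the executed test cases."""
    params = range(len(domains))
    covered, total = 0, 0
    for cols in combinations(params, t):
        all_tuples = set(product(*(domains[c] for c in cols)))
        seen = {tuple(test[c] for c in cols) for test in executed_tests}
        covered += len(all_tuples & seen)
        total += len(all_tuples)
    return covered / total

domains = [(0, 1), (0, 1), ("a", "b", "c")]         # three parameters
executed = [(0, 0, "a"), (1, 1, "b"), (0, 1, "c")]  # partially executed suite
print(f"2-way tuple coverage: {tway_coverage(executed, domains):.2%}")
```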
Advanced Extravehicular Mobility Unit Informatics Software Design
NASA Technical Reports Server (NTRS)
Wright, Theodore
2014-01-01
This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.
Performance testing of LiDAR exploitation software
NASA Astrophysics Data System (ADS)
Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.
2013-04-01
Mobile LiDAR systems have been used widely in recent years for many applications in the field of geoscience. One of the most important limitations of this technology is the large computational requirement involved in data processing. Several software solutions for data processing are available in the market, but users often lack methodologies to verify their performance accurately. In this work a methodology for LiDAR software performance testing is presented and six different suites are studied: QT Modeler, AutoCAD Civil 3D, Mars 7, Fledermaus, Carlson and TopoDOT (all of them in x64). Results show that QT Modeler, TopoDOT and AutoCAD Civil 3D allow the loading of large datasets, while Fledermaus, Mars 7 and Carlson do not achieve this level of performance. AutoCAD Civil 3D requires long loading times in comparison with the most capable packages, QT Modeler and TopoDOT. The Carlson suite shows the poorest results of all the packages under study: point clouds larger than 5 million points cannot be loaded, and loading times are very long in comparison with the other suites even for the smaller datasets. AutoCAD Civil 3D, Carlson and TopoDOT use more threads than the other packages, QT Modeler, Mars 7 and Fledermaus.
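The loading-time comparison implies a simple benchmarking harness of the following shape. The sketch below times repeated dataset loads and reports the median; it is a generic illustration with a stand-in loader function, since the commercial packages in the study are driven interactively rather than through a Python API.

```python
# Generic timing harness of the kind the loading-time tests above imply.
# The point-cloud reader is a stand-in; the commercial packages in the
# study are not scriptable through this hypothetical interface.
import time

def time_loading(load_fn, dataset_paths, repeats=3):
    """Median wall-clock loading time per dataset, in seconds."""
    results = {}
    for path in dataset_paths:
        times = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            load_fn(path)                       # load and discard the cloud
            times.append(time.perf_counter() - t0)
        results[path] = sorted(times)[len(times) // 2]
    return results

def dummy_loader(path):
    """Stand-in for a point-cloud reader; replace with a real one."""
    with open(path, "rb") as f:
        while f.read(1 << 20):
            pass

if __name__ == "__main__":
    print(time_loading(dummy_loader, ["cloud_5M.las", "cloud_50M.las"]))
```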
A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit
NASA Technical Reports Server (NTRS)
Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.
2016-01-01
Shoulder injury is one of the most severe risks that have the potential to impair crewmembers' performance and health in long duration space flight. Overall, 64% of crewmembers experience shoulder pain after extra-vehicular training in a space suit, and 14% of symptomatic crewmembers require surgical repair (Williams & Johnson, 2003). Suboptimal suit fit, in particular at the shoulder region, has been identified as one of the predominant risk factors. However, traditional suit fit assessments and laser scans represent only a single person's data, and thus may not be generalized across wide variations of body shapes and poses. The aim of this work is to develop a software tool based on a statistical analysis of a large dataset of crewmember body shapes. This tool can accurately predict the skin deformation and shape variations for any body size and shoulder pose for a target population, from which the geometry can be exported and evaluated against suit models in commercial CAD software. A preliminary software tool was developed by statistically analyzing 150 body shapes matched with body dimension ranges specified in the Human-Systems Integration Requirements of NASA ("baseline model"). Further, the baseline model was incorporated with shoulder joint articulation ("articulation model"), using additional subjects scanned in a variety of shoulder poses across a pre-specified range of motion. Scan data was cleaned and aligned using body landmarks. The skin deformation patterns were dimensionally reduced and the co-variation with shoulder angles was analyzed. A software tool is currently in development and will be presented in the final proceeding. This tool would allow suit engineers to parametrically generate body shapes in strategically targeted anthropometry dimensions and shoulder poses. This would also enable virtual fit assessments, with which the contact volume and clearance between the suit and body surface can be predictively quantified at reduced time and cost.
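A minimal sketch of the statistical machinery described above: dimensionally reduce aligned scan vertices with principal component analysis, then regress the retained shape scores against shoulder angles so a surface can be synthesized for a new pose. Array shapes, the linear regression form, and the number of retained components are assumptions for illustration, not the NASA tool's actual formulation.

```python
# Illustrative shape-model pipeline: PCA of aligned scan vertices plus a
# linear regression of shape scores on shoulder angles. All shapes and
# the regression form are assumptions; random data stands in for scans.
import numpy as np

rng = np.random.default_rng(0)
scans = rng.random((150, 3000))          # 150 subjects x flattened vertices
angles = rng.random((150, 2))            # shoulder elevation/rotation per scan

mean = scans.mean(axis=0)
U, S, Vt = np.linalg.svd(scans - mean, full_matrices=False)
k = 10                                   # retained principal components
scores = U[:, :k] * S[:k]                # per-subject shape scores

# Linear map from shoulder angles (plus intercept) to shape scores.
A = np.hstack([angles, np.ones((150, 1))])
coef, *_ = np.linalg.lstsq(A, scores, rcond=None)

def predict_shape(elev, rot):
    """Synthesize a body surface for a new shoulder pose."""
    s = np.array([elev, rot, 1.0]) @ coef
    return mean + s @ Vt[:k]

print(predict_shape(0.4, 0.7).shape)     # one flattened vertex array
```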
The Software Architecture of the Upgraded ESA DRAMA Software Suite
NASA Astrophysics Data System (ADS)
Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Mockel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vorsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger
2013-08-01
In the early days of space flight activities there was a belief that space is so big that everybody could use it without any repercussions. However, during the last six decades the increasing use of Earth's orbits has led to a rapid growth in the space debris environment, which has a major influence on current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite had been developed to support the planning of space missions to comply with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted, as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets the strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services like validators for input fields, it has also increased in complexity. The paper concludes with an outlook on the future development of the GUI framework, where the potential for advancements is shown.
Applicability of aquifer impact models to support decisions at CO2 sequestration sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keating, Elizabeth; Bacon, Diana; Carroll, Susan
2016-09-01
The National Risk Assessment Partnership has developed a suite of tools to assess and manage risk at CO2 sequestration sites (www.netl.doe.gov/nrap). This capability includes polynomial or look-up table based reduced-order models (ROMs) that predict the impact of CO2 and brine leaks on overlying aquifers. The development of these computationally-efficient models and the underlying reactive transport simulations they emulate has been documented elsewhere (Carroll et al., 2014, Dai et al., 2014, Keating et al., 2015). The ROMs reproduce the ensemble behavior of large numbers of simulations and are well-suited to applications that consider a large number of scenarios to understand parameter sensitivity and uncertainty in the risk of CO2 leakage to groundwater quality. In this paper, we seek to demonstrate the applicability of ROM-based ensemble analysis by considering what types of decisions and aquifer types would benefit from the ROM analysis. We present four hypothetical examples where applying ROMs, in ensemble mode, could support decisions in the early stages of a geologic CO2 sequestration project. These decisions pertain to site selection, site characterization, monitoring network evaluation, and health impacts. In all cases, we consider potential brine/CO2 leak rates at the base of the aquifer to be uncertain. We show that derived probabilities provide information relevant to the decision at hand. Although the ROMs were developed using site-specific data from two aquifers (High Plains and Edwards), the models accept aquifer characteristics as variable inputs and so they may have more broad applicability. We conclude that pH and TDS predictions are the most transferable to other aquifers based on the analysis of the nine water quality metrics (pH, TDS, 4 trace metals, 3 organic compounds). Guidelines are presented for determining the aquifer types for which the ROMs should be applicable.
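The ensemble mode of ROM analysis can be illustrated with a short Monte Carlo sketch: sample the uncertain leak rate many times, evaluate a cheap surrogate for each sample, and report the probability of exceeding a no-impact threshold. The lognormal leak-rate distribution, the pH surrogate, and the threshold below are all hypothetical stand-ins, not the NRAP ROMs.

```python
# Sketch of ROM-based ensemble analysis as described above: sample the
# uncertain brine/CO2 leak rate, evaluate a cheap reduced-order model per
# sample, and derive the probability of exceeding a water-quality
# threshold. The "ROM" here is a stand-in, not the actual NRAP model.
import numpy as np

rng = np.random.default_rng(42)
leak_rates = rng.lognormal(mean=-2.0, sigma=1.0, size=10_000)  # kg/s, uncertain

def rom_ph_drop(leak_rate):
    """Hypothetical ROM: pH depression at a monitoring point vs. leak rate."""
    return 0.8 * np.log1p(10.0 * leak_rate)

ph_baseline, threshold = 7.6, 6.5          # assumed no-impact threshold
ph = ph_baseline - rom_ph_drop(leak_rates)
prob = (ph < threshold).mean()
print(f"P(pH below {threshold}): {prob:.1%}")
```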
NASA Astrophysics Data System (ADS)
Johnson, J. W.; Nitao, J. J.; Newmark, R. L.; Kirkendall, B. A.; Nimz, G. J.; Knauss, K. G.; Ziagos, J. P.
2002-05-01
Reducing anthropogenic CO2 emissions ranks high among the grand scientific challenges of this century. In the near-term, significant reductions can only be achieved through innovative sequestration strategies that prevent atmospheric release of large-scale CO2 waste streams. Among such strategies, injection into confined geologic formations represents arguably the most promising alternative; and among potential geologic storage sites, oil reservoirs and saline aquifers represent the most attractive targets. Oil reservoirs offer a unique "win-win" approach because CO2 flooding is an effective technique of enhanced oil recovery (EOR), while saline aquifers offer immense storage capacity and widespread distribution. Although CO2-flood EOR has been widely used in the Permian Basin and elsewhere since the 1980s, the oil industry has just recently become concerned with the significant fraction of injected CO2 that eludes recycling and is therefore sequestered. This "lost" CO2 now has potential economic value in the growing emissions credit market; hence, the industry's emerging interest in recasting CO2 floods as co-optimized EOR/sequestration projects. The world's first saline aquifer storage project was also catalyzed in part by economics: Norway's newly imposed atmospheric emissions tax, which spurred development of Statoil's unique North Sea Sleipner facility in 1996. Successful implementation of geologic sequestration projects hinges on development of advanced predictive models and a diverse set of remote sensing, in situ sampling, and experimental techniques. The models are needed to design and forecast long-term sequestration performance; the monitoring techniques are required to confirm and refine model predictions and to ensure compliance with environmental regulations. We have developed a unique reactive transport modeling capability for predicting sequestration performance in saline aquifers, and used it to simulate CO2 injection at Sleipner; we are now extending this capability to address CO2-flood EOR/sequestration in oil reservoirs. We have also developed a suite of innovative geophysical and geochemical techniques for monitoring sequestration performance in both settings. These include electromagnetic induction imaging and electrical resistance tomography for tracking migration of immiscible CO2, noble gas isotopes for assessing trace CO2 leakage through the cap rock, and integrated geochemical sampling, analytical, and experimental methods for determining sequestration partitioning among solubility and mineral trapping mechanisms. We have proposed to demonstrate feasibility of the co-optimized EOR/sequestration concept and utility of our modeling and monitoring technologies to design and evaluate its implementation by conducting a demonstration project in the Livermore Oil Field. This small, mature, shallow field, located less than a mile east of Lawrence Livermore National Laboratory, is representative of many potential EOR/sequestration sites in California. In approach, this proposed demonstration is analogous to the Weyburn EOR/CO2 monitoring project, to which it will provide an important complement by virtue of its contrasting depth (immiscible versus Weyburn's miscible CO2 flood) and geologic setting (clay-capped sand versus Weyburn's anhydrite-capped carbonate reservoir).
Guner, Huseyin; Close, Patrick L; Cai, Wenxuan; Zhang, Han; Peng, Ying; Gregorich, Zachery R; Ge, Ying
2014-03-01
The rapid advancements in mass spectrometry (MS) instrumentation, particularly in Fourier transform (FT) MS, have made the acquisition of high-resolution and high-accuracy mass measurements routine. However, the software tools for the interpretation of high-resolution MS data are underdeveloped. Although several algorithms for the automatic processing of high-resolution MS data are available, there is still an urgent need for a user-friendly interface with functions that allow users to visualize and validate the computational output. Therefore, we have developed MASH Suite, a user-friendly and versatile software interface for processing high-resolution MS data. MASH Suite contains a wide range of features that allow users to easily navigate through data analysis, visualize complex high-resolution MS data, and manually validate automatically processed results. Furthermore, it provides easy, fast, and reliable interpretation of top-down, middle-down, and bottom-up MS data. MASH Suite is convenient, easily operated, and freely available. It can greatly facilitate the comprehensive interpretation and validation of high-resolution MS data with high accuracy and reliability.
THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE
The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...
The Visible Signature Modelling and Evaluation ToolBox
2008-12-01
DSTO–TR–2212 ABSTRACT: A new software suite, the Visible Signature ToolBox (VST), has been developed to model and evaluate the visible signatures of maritime platforms. The VST is a collection of commercial, off-the-shelf software and DSTO-developed programs and procedures. It can be utilised to model and assess the visible signatures of maritime platforms. A number of examples are presented to demonstrate the suite.
A quantitative reconstruction software suite for SPECT imaging
NASA Astrophysics Data System (ADS)
Namías, Mauro; Jeraj, Robert
2017-11-01
Quantitative Single Photon Emission Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT based attenuation correction and scatter correction from hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
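For readers unfamiliar with OSEM, a toy version of the update is sketched below: each iteration sweeps the ordered subsets, forward-projects the current estimate, and rescales each voxel by the back-projected ratio of measured to estimated projections. The dense random system matrix stands in for the real projector with attenuation, scatter, and collimator-response corrections folded in; it is an illustration, not the published implementation.

```python
# Bare-bones OSEM update of the kind used in the suite described above.
# A dense toy system matrix stands in for the real projector; subset
# handling is shown explicitly. Illustration only.
import numpy as np

def osem(y, A, subsets, n_iter=5):
    """y: measured projections, A: system matrix (rows = projection bins)."""
    x = np.ones(A.shape[1])                      # uniform initial activity
    for _ in range(n_iter):
        for rows in subsets:                     # one update per subset
            As = A[rows]
            forward = As @ x + 1e-12             # forward-project estimate
            ratio = y[rows] / forward
            x *= (As.T @ ratio) / (As.T @ np.ones(len(rows)) + 1e-12)
    return x

rng = np.random.default_rng(0)
A = rng.random((64, 16))                         # toy system matrix
x_true = rng.random(16)
y = rng.poisson(A @ x_true * 100) / 100          # noisy projections
subsets = np.array_split(np.arange(64), 8)       # 8 ordered subsets
print(osem(y, A, subsets).round(2))
```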
Software Acquisition Improvement in the Aeronautical Systems Center
2008-09-01
…software fielded, a variety of different methods were suggested by the interviewees. These included blocks, suites and other tailored processes developed… …DoD look to the commercial market to buy tools, methods, environments, and application software, instead of custom-built software (DSB: 1987).
A Heuristic for Improving Legacy Software Quality during Maintenance: An Empirical Case Study
ERIC Educational Resources Information Center
Sale, Michael John
2017-01-01
Many organizations depend on the functionality of mission-critical legacy software and the continued maintenance of this software is vital. Legacy software is defined here as software that contains no testing suite, is often foreign to the developer performing the maintenance, lacks meaningful documentation, and over time, has become difficult to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R.; Hong, Seokyong; Lee, Sangkeun
2016-06-01
GraphBench is a benchmark suite for graph pattern mining and graph analysis systems. The benchmark suite is a significant addition for conducting apples-to-apples comparisons of graph analysis software (databases, in-memory tools, triple stores, etc.).
Rice (Oryza sativa L) plantation affects the stability of biochar in paddy soil.
Wu, Mengxiong; Feng, Qibo; Sun, Xue; Wang, Hailong; Gielen, Gerty; Wu, Weixiang
2015-05-05
Conversion of rice straw into biochar for soil amendment appears to be a promising method to increase long-term carbon sequestration and reduce greenhouse gas (GHG) emissions. The stability of biochar in paddy soil, which is the major determining factor of carbon sequestration effect, depends mainly on soil properties and plant functions. However, the influence of plants on biochar stability in paddy soil remains unclear. In this study, bulk and surface characteristics of the biochars incubated without rice plants were compared with those incubated with rice plants using a suite of analytical techniques. Results showed that although rice plants had no significant influence on the bulk characteristics and decomposition rates of the biochar, the surface oxidation of biochar particles was enhanced by rice plants. Using (13)C labeling we observed that rice plants could significantly increase carbon incorporation from biochar into soil microbial biomass. About 0.047% of the carbon in biochar was incorporated into the rice plants during the whole rice growing cycle. These results inferred that root exudates and transportation of biochar particles into rice plants might decrease the stability of biochar in paddy soil. Impact of plants should be considered when predicting carbon sequestration potential of biochar in soil systems.
This resource directory offers easier access to the CAMEO suite of software. CAMEO, Computer-Aided Management of Emergency Operations, is a system of software applications used to plan for and respond to chemical emergencies.
Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS
NASA Technical Reports Server (NTRS)
2015-01-01
IonoSTAGE and SuperTruth software are part of a suite created at the Jet Propulsion Laboratory to enable the Federal Aviation Administration's Wide Area Augmentation System, which provides pinpoint accuracy in aircraft GPS units. The system, used by more than 73,000 planes, facilitates landings under adverse conditions at small airports. In 2013, IonoSTAGE and SuperTruth found their first commercial license when NEC, based in Japan, with US headquarters in Irving, Texas, licensed the entire suite.
2011-05-10
…The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular tools that may be used in concert with existing surveillance applications, or en masse for an end-to-end biosurveillance capability. This flexibility… …limited public health resources, and the costs of proprietary software.
Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.
2015-09-18
The PEST++ Version 3 software suite can be compiled for Microsoft Windows® and Linux® operating systems; the source code is available in a Microsoft Visual Studio® 2013 solution; Linux Makefiles are also provided. PEST++ Version 3 continues to build a foundation for an open-source framework capable of producing robust and efficient parameter estimation tools for large environmental models.
Impacts of crop rotations on soil organic carbon sequestration
NASA Astrophysics Data System (ADS)
Gobin, Anne; Vos, Johan; Joris, Ingeborg; Van De Vreken, Philippe
2013-04-01
Agricultural land use and crop rotations can greatly affect the amount of carbon sequestered in the soil. We developed a framework for modelling the impacts of crop rotations on soil carbon sequestration at the field scale, using Flanders as a test case. A crop rotation geo-database was constructed covering 10 years of crop rotations in Flanders, using the IACS parcel registration (Integrated Administration and Control System) to elicit the most common crop rotations on the major soil types in Flanders. In order to simulate the impact of crop cover on carbon sequestration, the Roth-C model was adapted to the Flanders environment and coupled to common crop rotations extracted from the IACS geodatabases and statistical databases on crop yield. Crop allometric models were used to calculate crop residues from common crops in Flanders and subsequently derive stable organic matter fluxes to the soil (REGSOM). The REGSOM model was coupled to Roth-C, and the coupled model was run for 30 years for all combinations of seven main arable crops, two common catch crops and two common dosages of organic manure. The common crops are winter wheat, winter barley, sugar beet, potato, grain maize, silage maize and winter rapeseed; the catch crops are yellow mustard and Italian ryegrass; the manure dosages are 35 ton/ha cattle slurry and 22 ton/ha pig slurry. Four common soils were simulated: sand, loam, sandy loam and clay. In total more than 2.4 million simulations were made, with monthly output of carbon content for 30 years. Results demonstrate that crop cover dynamics strongly influence carbon sequestration. For the same rotations, carbon sequestration is highest on clay soils and lowest on sandy soils. Crop residues of grain maize and winter wheat followed by catch crops contribute most to the total carbon sequestered. This implies that agricultural policies that affect agricultural land management strongly influence soil carbon sequestration. The framework is therefore suited for further scenario analysis and impact assessment in order to support agri-environmental policy decisions.
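The scale of the simulation campaign follows directly from crossing the management options. The sketch below expands crop, catch crop, manure dosage, and soil type combinations and hands each to a stubbed simulation run; the stub and per-position grid are illustrative, and the full study also crossed multi-year rotation sequences, which is what drives the count past 2.4 million runs.

```python
# Sketch of the simulation factor grid described above: every combination
# of crop, catch crop, manure dosage, and soil type is expanded and handed
# to a (here stubbed) coupled REGSOM/Roth-C run. Illustration only.
from itertools import product

crops = ["winter wheat", "winter barley", "sugar beet", "potato",
         "grain maize", "silage maize", "winter rapeseed"]
catch_crops = ["yellow mustard", "Italian ryegrass"]
manure = ["35 t/ha cattle slurry", "22 t/ha pig slurry"]
soils = ["sand", "loam", "sandy loam", "clay"]

def run_rothc(crop, catch, dose, soil):
    """Stub for one 30-year monthly Roth-C simulation."""
    return 0.0  # Mg C/ha sequestered, computed by the real model

runs = list(product(crops, catch_crops, manure, soils))
print(len(runs), "factor combinations per rotation position")
for combo in runs[:3]:
    run_rothc(*combo)
```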
A tool to include gamma analysis software into a quality assurance program.
Agnew, Christina E; McGarry, Conor K
2016-03-01
To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions.
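For reference, the quantity these packages compute is the gamma index: for each reference point, the minimum over the evaluated distribution of the combined, normalized distance-to-agreement and dose-difference metric, with gamma at or below 1 counting as a pass. A 1-D sketch with synthetic profiles and 3%/3 mm global criteria follows; it illustrates the definition rather than any vendor's implementation.

```python
# Worked 1-D sketch of the gamma index the tested software computes:
# minimum combined DTA and dose-difference metric over the evaluated
# distribution, per reference point. Profiles are synthetic.
import numpy as np

def gamma_1d(x, d_ref, d_eval, dta=3.0, dd=0.03):
    """Global gamma: dd is a fraction of the reference maximum dose."""
    dd_abs = dd * d_ref.max()
    g = np.empty_like(d_ref)
    for i, (xi, di) in enumerate(zip(x, d_ref)):
        dist2 = ((x - xi) / dta) ** 2            # spatial term (mm / DTA)
        dose2 = ((d_eval - di) / dd_abs) ** 2    # dose term (Gy / tolerance)
        g[i] = np.sqrt((dist2 + dose2).min())
    return g

x = np.linspace(0, 100, 201)                     # mm
d_ref = np.exp(-((x - 50) / 15) ** 2)            # synthetic profile
d_eval = np.exp(-((x - 51) / 15) ** 2) * 1.01    # shifted, rescaled copy
g = gamma_1d(x, d_ref, d_eval)
print(f"gamma pass rate: {(g <= 1).mean():.1%}")
```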
Shellikeri, Sphoorti; Setser, Randolph M; Hwang, Tiffany J; Srinivasan, Abhay; Krishnamurthy, Ganesh; Vatsky, Seth; Girard, Erin; Zhu, Xiaowei; Keller, Marc S; Cahill, Anne Marie
2017-07-01
Navigational software provides real-time fluoroscopic needle guidance for percutaneous procedures in the Interventional Radiology (IR) suite. We describe our experience with navigational software for pediatric percutaneous bone biopsies in the IR suite and compare technical success, diagnostic accuracy, radiation dose and procedure time with that of CT-guided biopsies. Pediatric bone biopsies performed using navigational software (Syngo iGuide, Siemens Healthcare) from 2011 to 2016 were prospectively included and anatomically matched CT-guided bone biopsies from 2008 to 2016 were retrospectively reviewed with institutional review board approval. C-arm CT protocols used for navigational software-assisted cases included an institution-developed low-dose protocol (0.1/0.17 μGy/projection), a regular-dose protocol (0.36 μGy/projection), or a combination of the two. Estimated effective radiation dose and procedure times were compared between software-assisted and CT-guided biopsies. Twenty-six patients (15 male; mean age: 10 years) underwent software-assisted biopsies (15 pelvic, 7 lumbar and 4 lower extremity) and 33 patients (13 male; mean age: 9 years) underwent CT-guided biopsies (22 pelvic, 7 lumbar and 4 lower extremity). Biopsies under both modalities achieved a 100% technical success rate. Twenty-five of 26 (96%) software-assisted and 29/33 (88%) CT-guided biopsies were diagnostic. Overall, the effective radiation dose was significantly lower in software-assisted than CT-guided cases (3.0±3.4 vs. 6.6±7.7 mSv, P=0.02). The effective dose difference was most dramatic in software-assisted cases using low-dose C-arm CT (1.2±1.8 vs. 6.6±7.7 mSv, P=0.001) or combined low-dose/regular-dose C-arm CT (1.9±2.4 vs. 6.6±7.7 mSv, P=0.04), whereas effective dose was comparable in software-assisted cases using regular-dose C-arm CT (6.0±3.5 vs. 6.6±7.7 mSv, P=0.7). Mean procedure time was significantly lower for software-assisted cases (91±54 vs. 141±68 min, P=0.005). In our experience, navigational software technology in the IR suite is a promising alternative to CT guidance for pediatric bone biopsies, providing comparable technical success and diagnostic accuracy with lower radiation dose and procedure time, in addition to providing real-time fluoroscopic needle guidance.
Flowing Valued Information and Cyber-Physical Situational Awareness
2012-01-01
…"file type" constraints. The basic software supporting encryption and signing uses the OPENSSL software suite (the November 2009 version is…). …authorities for each organization can use OPENSSL software to generate their public and private keys. The MBTC does need to know the public or private…
Robot-operated quality control station based on the UTT method
NASA Astrophysics Data System (ADS)
Burghardt, Andrzej; Kurc, Krzysztof; Szybicki, Dariusz; Muszyńska, Magdalena; Nawrocki, Jacek
2017-03-01
This paper presents a robotic test stand for the ultrasonic transmission tomography (UTT) inspection of stator vane thickness. It describes the design of the test stand in the Autodesk Robot Structural Analysis Professional 2013 software suite. The performance of the designed test stand was simulated in the RobotStudio software suite. The operating principle of the test stand's measurement system is presented, with a specific focus on the measurement strategy. Results of actual wall thickness measurements performed on stator vanes are presented.
The Use of Computer Software to Teach High Technology Skills to Vocational Students.
ERIC Educational Resources Information Center
Farmer, Edgar I.
A study examined the type of computer software that is best suited to teach high technology skills to vocational students. During the study, 50 manufacturers of computer software and hardware were sent questionnaires designed to gather data concerning their recommendations in regard to: software to teach high technology skills to vocational…
Boyero, Luz; Pearson, Richard G; Gessner, Mark O; Barmuta, Leon A; Ferreira, Verónica; Graça, Manuel A S; Dudgeon, David; Boulton, Andrew J; Callisto, Marcos; Chauvet, Eric; Helson, Julie E; Bruder, Andreas; Albariño, Ricardo J; Yule, Catherine M; Arunachalam, Muthukumarasamy; Davies, Judy N; Figueroa, Ricardo; Flecker, Alexander S; Ramírez, Alonso; Death, Russell G; Iwata, Tomoya; Mathooko, Jude M; Mathuriau, Catherine; Gonçalves, José F; Moretti, Marcelo S; Jinggut, Tajang; Lamothe, Sylvain; M'Erimba, Charles; Ratnarajah, Lavenia; Schindler, Markus H; Castela, José; Buria, Leonardo M; Cornejo, Aydeé; Villanueva, Verónica D; West, Derek C
2011-03-01
The decomposition of plant litter is one of the most important ecosystem processes in the biosphere and is particularly sensitive to climate warming. Aquatic ecosystems are well suited to studying warming effects on decomposition because the otherwise confounding influence of moisture is constant. By using a latitudinal temperature gradient in an unprecedented global experiment in streams, we found that climate warming will likely hasten microbial litter decomposition and produce an equivalent decline in detritivore-mediated decomposition rates. As a result, overall decomposition rates should remain unchanged. Nevertheless, the process would be profoundly altered, because the shift in importance from detritivores to microbes in warm climates would likely increase CO2 production and decrease the generation and sequestration of recalcitrant organic particles. In view of recent estimates showing that inland waters are a significant component of the global carbon cycle, this implies consequences for global biogeochemistry and a possible positive climate feedback.
Software component quality evaluation
NASA Technical Reports Server (NTRS)
Clough, A. J.
1991-01-01
The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.
RSAT 2015: Regulatory Sequence Analysis Tools
Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques
2015-01-01
RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, P.
2014-09-23
GRAPE is a tool for managing software project workflows for the Git version control system. It provides a suite of tools to simplify and configure branch-based development, integration with a project's testing suite, and integration with the Atlassian Stash repository hosting tool.
libdrdc: software standards library
NASA Astrophysics Data System (ADS)
Erickson, David; Peng, Tie
2008-04-01
This paper presents the libdrdc software standards library, including internal nomenclature, definitions, units of measure, coordinate reference frames, and representations for use in autonomous systems research. This library is a configurable, portable, C-function-wrapped C++/object-oriented C library developed to be independent of software middleware, system architecture, processor, or operating system. It is designed to use the Automatically Tuned Linear Algebra Software (ATLAS) and Basic Linear Algebra Subprograms (BLAS) and to port to firmware and software. The library's goal is to unify data collection and representation for various microcontrollers and Central Processing Unit (CPU) cores and to provide a common Application Binary Interface (ABI) for research projects at all scales. The library supports multi-platform development and currently works on Windows, Unix, GNU/Linux, and Real-Time Executive for Multiprocessor Systems (RTEMS). This library is made available under the LGPL version 2.1 license.
Semantic Metrics for Analysis of Software
NASA Technical Reports Server (NTRS)
Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara
2005-01-01
A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated by use of the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are the syntactic metrics.
Orientation Effects in Fault Reactivation in Geological CO2 Sequestration
NASA Astrophysics Data System (ADS)
Castelletto, N.; Ferronato, M.; Gambolati, G.; Janna, C.; Teatini, P.
2012-12-01
Geological CO2 sequestration remains one of the most promising options for reducing greenhouse gas emissions. The accurate simulation of the complex coupled physical processes occurring during the injection and the post-injection stage represents a key issue for investigating the feasibility and the safety of the sequestration. The fluid-dynamical and geochemical aspects of sequestering CO2 underground have been widely debated in the scientific literature for more than a decade. Recently, the importance of geomechanical processes has been widely recognized. In the present modeling study, we focus on fault reactivation induced by injection, an essential aspect of evaluating CO2 sequestration projects that must be adequately investigated to avoid the generation of preferential leakage paths for CO2 and the related risk of induced seismicity. We use a geomechanical model based on the structural equations of poroelasticity solved by the Finite Element (FE) - Interface Element (IE) approach. Standard FEs are used to represent a continuum, while IEs prove especially suited to assess the relative displacements of adjacent elements such as the opening and slippage of existing faults or the generation of new fractures [1]. The IEs allow for the modeling of fault mechanics using an elasto-plastic constitutive law based on the Mohr-Coulomb failure criterion. We analyze the reactivation of a single fault in a synthetic reservoir by varying the fault orientation and size, hydraulic conductivity of the faulted zone, initial vertical and horizontal stress state and Mohr-Coulomb parameters (i.e., friction angle and cohesion). References: [1] Ferronato, M., G. Gambolati, C. Janna, and P. Teatini (2008), Numerical modeling of regional faults in land subsidence prediction above gas/oil reservoirs, Int. J. Numer. Anal. Methods Geomech., 32, 633-657.
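The Mohr-Coulomb check the interface elements perform reduces to comparing resolved shear stress against frictional strength on the fault plane. The 2-D sketch below resolves a vertical/horizontal principal stress state onto a dipping plane and tests for slip as injection raises pore pressure; the geometry convention and every numeric value are illustrative, not the paper's model setup.

```python
# Minimal Mohr-Coulomb reactivation check of the kind described above:
# resolve the stress state onto a fault of given dip and compare shear
# stress against frictional strength. 2-D geometry; values illustrative.
import numpy as np

def slips(sigma_v, sigma_h, p_fluid, dip_deg, cohesion, friction_deg):
    """True if resolved shear stress exceeds Mohr-Coulomb strength (MPa)."""
    theta = np.radians(dip_deg)
    # Normal and shear stress on a plane dipping `dip_deg` (compression +).
    sn = 0.5 * (sigma_v + sigma_h) + 0.5 * (sigma_v - sigma_h) * np.cos(2 * theta)
    tau = 0.5 * (sigma_v - sigma_h) * np.sin(2 * theta)
    sn_eff = sn - p_fluid                   # injection raises p_fluid
    strength = cohesion + sn_eff * np.tan(np.radians(friction_deg))
    return abs(tau) > strength

# Pressure build-up from injection pushes the fault toward failure:
for p in (20.0, 30.0, 40.0):                # MPa pore pressure
    print(p, slips(sigma_v=60.0, sigma_h=40.0, p_fluid=p,
                   dip_deg=60.0, cohesion=0.5, friction_deg=30.0))
```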
Physical properties of sidewall cores from Decatur, Illinois
Morrow, Carolyn A.; Kaven, Joern; Moore, Diane E.; Lockner, David A.
2017-10-18
To better assess the reservoir conditions influencing the induced seismicity hazard near a carbon dioxide sequestration demonstration site in Decatur, Ill., core samples from three deep drill holes were tested to determine a suite of physical properties including bulk density, porosity, permeability, Young’s modulus, Poisson’s ratio, and failure strength. Representative samples of the shale cap rock, the sandstone reservoir, and the Precambrian basement were selected for comparison. Physical properties were strongly dependent on lithology. Bulk density was inversely related to porosity, with the cap rock and basement samples being both least porous (
Rao, Anand B; Rubin, Edward S
2002-10-15
Capture and sequestration of CO2 from fossil fuel power plants is gaining widespread interest as a potential method of controlling greenhouse gas emissions. Performance and cost models of an amine (MEA)-based CO2 absorption system for postcombustion flue gas applications have been developed and integrated with an existing power plant modeling framework that includes multipollutant control technologies for other regulated emissions. The integrated model has been applied to study the feasibility and cost of carbon capture and sequestration at both new and existing coal-burning power plants. The cost of carbon avoidance was shown to depend strongly on assumptions about the reference plant design, details of the CO2 capture system design, interactions with other pollution control systems, and method of CO2 storage. The CO2 avoidance cost for retrofit systems was found to be generally higher than for new plants, mainly because of the higher energy penalty resulting from less efficient heat integration as well as site-specific difficulties typically encountered in retrofit applications. For all cases, a small reduction in CO2 capture cost was afforded by the SO2 emission trading credits generated by amine-based capture systems. Efforts are underway to model a broader suite of carbon capture and sequestration technologies for more comprehensive assessments in the context of multipollutant environmental management.
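The cost-of-avoidance metric used in such comparisons is worth stating explicitly: the increase in cost of electricity divided by the reduction in emitted CO2 per unit of electricity generated. The sketch below encodes that standard definition with placeholder plant numbers, not values from the cited model.

```python
# The standard cost-of-CO2-avoided measure underlying comparisons like
# those above. The plant numbers below are placeholders, not values
# from the cited modeling framework.
def cost_of_co2_avoided(coe_ref, coe_cap, em_ref, em_cap):
    """$/tonne CO2 avoided.

    coe_*: cost of electricity ($/MWh); em_*: emissions (t CO2/MWh);
    *_ref is the reference plant, *_cap the plant with capture.
    """
    return (coe_cap - coe_ref) / (em_ref - em_cap)

print(cost_of_co2_avoided(coe_ref=50.0, coe_cap=85.0,
                          em_ref=0.80, em_cap=0.11))  # ~ $50.7/t avoided
```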
pcircle - A Suite of Scalable Parallel File System Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
WANG, FEIYI
2015-10-01
Most software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on top of ubiquitous MPI in a cluster computing environment and a "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copy and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, and integrity checking.
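The decomposition that makes this parallelism possible can be sketched briefly: split each file into fixed-size chunks so many workers can hash independently, then combine the per-chunk digests into one signature. pcircle distributes its chunk queue over MPI with work stealing; the process-pool version below only illustrates the chunking idea, and the path, chunk size, and helper names are invented.

```python
# Sketch of the chunked, parallel checksumming idea behind pcircle:
# files are split into fixed-size chunks hashed independently, and the
# per-chunk digests are combined. pcircle itself uses MPI with work
# stealing; the multiprocessing pool here only illustrates the idea.
import hashlib
from multiprocessing import Pool
from pathlib import Path

CHUNK = 4 * 1024 * 1024  # 4 MiB chunks

def hash_chunk(args):
    path, offset = args
    with open(path, "rb") as f:
        f.seek(offset)
        return offset, hashlib.sha1(f.read(CHUNK)).hexdigest()

def file_signature(path):
    size = Path(path).stat().st_size
    tasks = [(path, off) for off in range(0, max(size, 1), CHUNK)]
    with Pool() as pool:
        digests = sorted(pool.map(hash_chunk, tasks))   # order by offset
    combined = hashlib.sha1("".join(d for _, d in digests).encode())
    return combined.hexdigest()

if __name__ == "__main__":
    print(file_signature("/tmp/bigfile.dat"))  # path is a placeholder
```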
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jurrus, Elizabeth; Engel, Dave; Star, Keith
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
Nema, Vijay; Pal, Sudhir Kumar
2013-01-01
This study was conducted to find the best-suited freely available software for modelling of proteins, using a few sample proteins. The proteins used ranged from small to big in size, with available crystal structures, for the purpose of benchmarking. Key players like Phyre2, Swiss-Model, CPHmodels-3.0, Homer, (PS)2, (PS)(2)-V(2), Modweb were used for the comparison and model generation. The benchmarking process was done for four proteins, Icl, InhA, and KatG of Mycobacterium tuberculosis and RpoB of Thermus thermophilus, to find the most suited software. Parameters compared during the analysis gave relatively better values for Phyre2 and Swiss-Model. This comparative study showed that Phyre2 and Swiss-Model make good models of small and large proteins as compared to the other screened software. The other software was also good but often less efficient in providing full-length and properly folded structures.
Handling Input and Output for COAMPS
NASA Technical Reports Server (NTRS)
Fitzpatrick, Patrick; Tran, Nam; Li, Yongzuo; Anantharaj, Valentine
2007-01-01
Two suites of software have been developed to handle the input and output of the Coupled Ocean Atmosphere Prediction System (COAMPS), which is a regional atmospheric model developed by the Navy for simulating and predicting weather. Typically, the initial and boundary conditions for COAMPS are provided by a flat-file representation of the Navy's global model. Additional algorithms are needed for running the COAMPS software using other global models. One of the present suites satisfies this need for running COAMPS using the Global Forecast System (GFS) model of the National Oceanic and Atmospheric Administration. The first step in running COAMPS, downloading GFS data from an Internet file-transfer-protocol (FTP) server computer of the National Centers for Environmental Prediction (NCEP), is performed by one of the programs (SSC-00273) in this suite. The GFS data, which are in gridded binary (GRIB) format, are then changed to a COAMPS-compatible format by another program in the suite (SSC-00278). Once a forecast is complete, still another program in the suite (SSC-00274) sends the output data to a different server computer. The second suite of software (SSC-00275) addresses the need to ingest up-to-date land-use-and-land-cover (LULC) data into COAMPS for use in specifying typical climatological values of such surface parameters as albedo, aerodynamic roughness, and ground wetness. This suite includes (1) a program to process LULC data derived from observations by the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Terra and Aqua satellites, (2) programs to derive new climatological parameters for the 17-land-use-category MODIS data, and (3) a modified version of a FORTRAN subroutine to be used by COAMPS. The MODIS data files are processed to reformat them into a compressed American Standard Code for Information Interchange (ASCII) format used by COAMPS for efficient processing.
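The download step is ordinary FTP scripting, as the generic sketch below shows; the host, directory layout, and file-name pattern are placeholders, not the real NCEP server layout automated by SSC-00273.

```python
# Generic sketch of the first step the suite automates: fetching GRIB
# files from an FTP server and staging them for conversion. Host,
# directory, and file-name pattern are placeholders only.
import ftplib
from pathlib import Path

def fetch_gfs(host, remote_dir, cycle, hours, dest):
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    with ftplib.FTP(host) as ftp:
        ftp.login()                                    # anonymous login
        ftp.cwd(remote_dir)
        for h in hours:
            name = f"gfs.t{cycle:02d}z.pgrb2.f{h:03d}"  # placeholder pattern
            with open(dest / name, "wb") as out:
                ftp.retrbinary(f"RETR {name}", out.write)

fetch_gfs("ftp.example.gov", "/pub/gfs/2024010100", cycle=0,
          hours=range(0, 25, 6), dest="./grib")
```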
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacon, Diana Holford; Locke II, Randall A.; Keating, Elizabeth
The National Risk Assessment Partnership (NRAP) has developed a suite of tools to assess and manage risk at CO2 sequestration sites (1). The NRAP tool suite includes the Aquifer Impact Model (AIM), based on reduced-order models developed using site-specific data from two aquifers (alluvium and carbonate). The models accept aquifer parameters as a range of variable inputs, so they may have broader applicability. Guidelines have been developed for determining the aquifer types for which the ROMs should be applicable. This paper considers the applicability of the aquifer models in AIM to predicting the impact of CO2 or brine leakage were it to occur at the Illinois Basin Decatur Project (IBDP). Based on the results of the sensitivity analysis, the hydraulic parameters and leakage source term magnitude are more sensitive than clay fraction or cation exchange capacity. Sand permeability was the only hydraulic parameter measured at the IBDP site. More information on the other hydraulic parameters, such as sand fraction and sand/clay correlation lengths, could reduce uncertainty in risk estimates. Some non-adjustable parameters, such as the initial pH and TDS and the pH no-impact threshold, are significantly different for the ROM than for the observations at the IBDP site. The reduced-order model could be made more useful to a wider range of sites if the initial conditions and no-impact threshold values were adjustable parameters.
Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics
Scott, S. D.; Mumgaard, R. T.
2016-07-20
A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (mse) diagnostics. The software supports multi-spectral line-polarization mse diagnostics which simultaneously measure emission at the mse σ and π lines as well as at two "background" wavelengths that are displaced from the mse spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the mse photo-elastic modulators (pem's) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to pem retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the mse diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.
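A hedged sketch of the polarization-angle step: with two photo-elastic modulators at different frequencies, the linear polarization angle follows from the ratio of demodulated amplitudes at the PEM second harmonics, and summing several harmonics (the numerical-beat idea) raises the effective signal. The lock-in demodulator, frequencies, and calibration constant below are illustrative, not the suite's actual algorithm.

```python
# Illustrative two-PEM polarimetry: the polarization angle follows from
# the ratio of demodulated second-harmonic amplitudes. Frequencies and
# the calibration constant are assumptions for demonstration only.
import numpy as np

def demodulate(signal, t, freq):
    """Lock-in style amplitude of `signal` at `freq` (Hz)."""
    ref_c = np.cos(2 * np.pi * freq * t)
    ref_s = np.sin(2 * np.pi * freq * t)
    return 2 * np.hypot(np.mean(signal * ref_c), np.mean(signal * ref_s))

def polarization_angle(signal, t, f1, f2, cal=1.0):
    """Angle from the ratio of second-harmonic PEM amplitudes."""
    a1 = demodulate(signal, t, 2 * f1)   # 2nd harmonic of PEM 1
    a2 = demodulate(signal, t, 2 * f2)   # 2nd harmonic of PEM 2
    return 0.5 * np.arctan2(cal * a1, a2)

fs, f1, f2 = 1_000_000, 20_000, 22_000   # sample rate and PEM freqs (Hz)
t = np.arange(0, 0.01, 1 / fs)
gamma_true = np.radians(12.0)
sig = (np.sin(2 * gamma_true) * np.cos(2 * np.pi * 2 * f1 * t)
       + np.cos(2 * gamma_true) * np.cos(2 * np.pi * 2 * f2 * t))
print(np.degrees(polarization_angle(sig, t, f1, f2)))  # ~12 degrees
```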
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Sandra Brown; Ellen Hawes
2002-09-01
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: advanced videography testing; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
Rapid Building Assessment Project
2014-05-01
...ongoing management of commercial energy efficiency. No other company offers all of these proven services on a seamless, integrated Software-as-a-Service... FirstFuel has added a suite of additional Software-as-a-Service analytics capabilities to support the entire energy efficiency lifecycle, including... the client side. In this document, we refer to the service-side software as "BUILDER" and the client software as "BuilderRED," following the Army...
Recent advances in the CRANK software suite for experimental phasing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pannu, Navraj S., E-mail: raj@chem.leidenuniv.nl; Waterreus, Willem-Jan; Skubák, Pavol
2011-04-01
Recent developments in the CRANK software suite for experimental phasing have led to many more structures being built automatically. For its first release in 2004, CRANK was shown to effectively detect and phase anomalous scatterers from single-wavelength anomalous diffraction data. Since then, CRANK has been significantly improved and many more structures can be built automatically with single- or multiple-wavelength anomalous diffraction or single isomorphous replacement with anomalous scattering data. Here, the new algorithms that have led to these substantial improvements are discussed and CRANK's performance on over 100 real data sets is shown. The latest version of CRANK is freely available for download at http://www.bfsc.leidenuniv.nl/software/crank/ and from CCP4 (http://www.ccp4.ac.uk/).
CFEL-ASG Software Suite (CASS): usage for free-electron laser experiments with biological focus.
Foucar, Lutz
2016-08-01
CASS [Foucar et al. (2012). Comput. Phys. Commun. 183, 2207-2213] is a well-established software suite for experiments performed at any sort of light source. It is based on a modular design and can easily be adapted for use at free-electron laser (FEL) experiments that have a biological focus. This article will list all the additional functionality and enhancements of CASS for use with FEL experiments that have been introduced since the first publication. The article will also highlight some advanced experiments with biological aspects that have been performed.
A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit
NASA Technical Reports Server (NTRS)
Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.
2016-01-01
Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is however prohibitively time consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose for a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume, and clearance between the suit and body surface at reduced time and cost.
Automatic discovery of the communication network topology for building a supercomputer model
NASA Astrophysics Data System (ADS)
Sobolev, Sergey; Stefanov, Konstantin; Voevodin, Vadim
2016-10-01
The Research Computing Center of Lomonosov Moscow State University is developing the Octotron software suite for automatic monitoring and mitigation of emergency situations in supercomputers so as to maximize hardware reliability. The suite is based on a software model of the supercomputer. The model uses a graph to describe the computing system components and their interconnections. One of the most complex components of a supercomputer that needs to be included in the model is its communication network. This work describes the proposed approach for automatically discovering the Ethernet communication network topology in a supercomputer and its description in terms of the Octotron model. This suite automatically detects computing nodes and switches, collects information about them and identifies their interconnections. The application of this approach is demonstrated on the "Lomonosov" and "Lomonosov-2" supercomputers.
Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition
2017-01-01
...naval mine countermeasures (MCM) operations by automating a large portion of the data analysis. Successful long-term implementation of ATR requires a... Subject terms: Modular Algorithm Testbed Suite; MATS; Mine Countermeasures Operations
Information Flow Integrity for Systems of Independently-Developed Components
2015-06-22
We also examined three programs (Apache, MySQL, and PHP) in detail to evaluate the efficacy of using the provided package test suites to generate... method are just as effective as hooks that were manually placed over the course of years while greatly reducing the burden on programmers. "Leveraging... to validate optimizations of real-world, mature applications: the Apache software suite, the Mozilla Suite, and the MySQL database. "Validating Library
ERIC Educational Resources Information Center
Byrd, Rob
2008-01-01
Is open source business intelligence (OS BI) software ready for prime time? The author thoroughly investigated each of three OS BI toolsets--Pentaho BI Suite, Jaspersoft BI Suite, and Talend Open Studio--by installing the OS BI tools himself, by interviewing technologists at academic institutions who had implemented these OS BI solutions, and by…
Automation of Military Civil Engineering and Site Design Functions: Software Evaluation
1989-09-01
...promising advantage over manual methods, USACERL is to evaluate available software to determine which, if any, is best suited to the type of civil... moved. Therefore, original surface data were assembled by scaling the northing and easting distances of field elevations and entering them manually into... in the software or requesting an update or addition to the software or manuals. Responses to forms submitted during the test were received at
Campus Energy Model for Control and Performance Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-09-19
The core of the modeling platform is an extensible block library for the MATLAB/Simulink software suite. The platform enables true co-simulation (interaction at each simulation time step) with NREL's state-of-the-art modeling tools and other energy modeling software.
VISUAL PLUMES MIXING ZONE MODELING SOFTWARE
The U.S. Environmental Protection Agency has a long history of both supporting plume model development and providing mixing zone modeling software. The Visual Plumes model is the most recent addition to the suite of public-domain models available through the EPA-Athens Center f...
The Use of Flexible, Interactive, Situation-Focused Software for the E-Learning of Mathematics.
ERIC Educational Resources Information Center
Farnsworth, Ralph Edward
This paper discusses the classroom, home, and distance use of new, flexible, interactive, application-oriented software known as Active Learning Suite. The actual use of the software, not just a controlled experiment, is reported on. Designed for the e-learning of university mathematics, the program was developed by a joint U.S.-Russia team and…
Improvements to the APBS biomolecular solvation software suite.
Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A
2018-01-01
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages that have provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package, including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.
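As an illustration of the kind of continuum-electrostatics problem APBS-class solvers address, the sketch below solves the 1-D linearized Poisson-Boltzmann equation with finite differences and checks it against the known exponential screening solution; it is a toy model under assumed nondimensional units, not APBS's actual numerics.

```python
import numpy as np

# 1-D linearized Poisson-Boltzmann: d2(phi)/dx2 = kappa^2 * phi,
# with phi(0) = phi0 (charged wall) and phi(L) = 0 (bulk), solved by
# a standard second-order finite-difference discretization.
def linear_pb_1d(phi0=1.0, kappa=1.0, L=10.0, n=200):
    x = np.linspace(0.0, L, n)
    h = x[1] - x[0]
    main = -(2.0 / h**2 + kappa**2) * np.ones(n - 2)
    off = (1.0 / h**2) * np.ones(n - 3)
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    b = np.zeros(n - 2)
    b[0] -= phi0 / h**2                 # Dirichlet boundary at x = 0
    phi = np.concatenate(([phi0], np.linalg.solve(A, b), [0.0]))
    return x, phi

x, phi = linear_pb_1d()
# For L >> 1/kappa the solution approaches phi0 * exp(-kappa * x).
print(np.max(np.abs(phi - np.exp(-x))))  # small discretization error
```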
FEBio: finite elements for biomechanics.
Maas, Steve A; Ellis, Benjamin J; Ateshian, Gerard A; Weiss, Jeffrey A
2012-01-01
In the field of computational biomechanics, investigators have primarily used commercial software that is neither geared toward biological applications nor sufficiently flexible to follow the latest developments in the field. This lack of a tailored software environment has hampered research progress, as well as dissemination of models and results. To address these issues, we developed the FEBio software suite (http://mrl.sci.utah.edu/software/febio), a nonlinear implicit finite element (FE) framework, designed specifically for analysis in computational solid biomechanics. This paper provides an overview of the theoretical basis of FEBio and its main features. FEBio offers modeling scenarios, constitutive models, and boundary conditions, which are relevant to numerous applications in biomechanics. The open-source FEBio software is written in C++, with particular attention to scalar and parallel performance on modern computer architectures. Software verification is a large part of the development and maintenance of FEBio, and to demonstrate the general approach, the description and results of several problems from the FEBio Verification Suite are presented and compared to analytical solutions or results from other established and verified FE codes. An additional simulation is described that illustrates the application of FEBio to a research problem in biomechanics. Together with the pre- and postprocessing software PREVIEW and POSTVIEW, FEBio provides a tailored solution for research and development in computational biomechanics.
Reuse Metrics for Object Oriented Software
NASA Technical Reports Server (NTRS)
Bieman, James M.
1998-01-01
One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
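A minimal sketch of the flavor of reuse measurement described here: counting the fraction of reused or modified components in a system. The classification scheme and names are invented for illustration and are far simpler than the metric suite derived in the project.

```python
# One possible reuse-level metric: the fraction of components in a system
# that are reused verbatim or reused with modification.
def reuse_level(components):
    """components: dict mapping name -> 'new' | 'reused' | 'modified'."""
    n = len(components)
    reused = sum(1 for kind in components.values() if kind in ("reused", "modified"))
    return reused / n if n else 0.0

system = {"parser": "reused", "ui": "new", "logger": "reused", "core": "modified"}
print(f"reuse level: {reuse_level(system):.2f}")  # 0.75
```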
SNPversity: A web-based tool for visualizing diversity
USDA-ARS?s Scientific Manuscript database
Background: Many stand-alone desktop software suites exist to visualize single nucleotide polymorphisms (SNP) diversity, but web-based software that can be easily implemented and used for biological databases is absent. SNPversity was created to answer this need by building an open-source visualizat...
NIRP Core Software Suite v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitener, Dustin Heath; Folz, Wesley; Vo, Duong
The NIRP Core Software Suite is a core set of code that supports multiple applications. It includes miscellaneous base code for data objects, mathematic equations, and user interface components; and the framework includes several fully-developed software applications that exist as stand-alone tools to complement other applications. The stand-alone tools are described below. Analyst Manager: An application to manage contact information for people (analysts) that use the software products. This information is often included in generated reports and may be used to identify the owners of calculations. Radionuclide Viewer: An application for viewing the DCFPAK radiological data. Complements the Mixture Manager tool. Mixture Manager: An application to create and manage radionuclide mixtures that are commonly used in other applications. High Explosive Manager: An application to manage explosives and their properties. Chart Viewer: An application to view charts of data (e.g. meteorology charts). Other applications may use this framework to create charts specific to their data needs.
Nema, Vijay; Pal, Sudhir Kumar
2013-01-01
Aim: This study was conducted to find the best-suited freely available software for modelling of proteins by taking a few sample proteins. The proteins used were small to big in size with available crystal structures for the purpose of benchmarking. Key players like Phyre2, Swiss-Model, CPHmodels-3.0, Homer, (PS)2, (PS)2-V2, and Modweb were used for the comparison and model generation. Results: Benchmarking was done for four proteins, Icl, InhA, and KatG of Mycobacterium tuberculosis and RpoB of Thermus thermophilus, to get the most suited software. Parameters compared during analysis gave relatively better values for Phyre2 and Swiss-Model. Conclusion: This comparative study gave the information that Phyre2 and Swiss-Model make good models of small and large proteins as compared to other screened software. The other software packages were also good but often failed to provide full-length and properly folded structures. PMID:24023424
ICC-CLASS: isotopically-coded cleavable crosslinking analysis software suite
2010-01-01
Background Successful application of crosslinking combined with mass spectrometry for studying proteins and protein complexes requires specifically-designed crosslinking reagents, experimental techniques, and data analysis software. Using isotopically-coded ("heavy and light") versions of the crosslinker and cleavable crosslinking reagents is analytically advantageous for mass spectrometric applications and provides a "handle" that can be used to distinguish crosslinked peptides of different types, and to increase the confidence of the identification of the crosslinks. Results Here, we describe a program suite designed for the analysis of mass spectrometric data obtained with isotopically-coded cleavable crosslinkers. The suite contains three programs called: DX, DXDX, and DXMSMS. DX searches the mass spectra for the presence of ion signal doublets resulting from the light and heavy isotopic forms of the isotopically-coded crosslinking reagent used. DXDX searches for possible mass matches between cleaved and uncleaved isotopically-coded crosslinks based on the established chemistry of the cleavage reaction for a given crosslinking reagent. DXMSMS assigns the crosslinks to the known protein sequences, based on the isotopically-coded and un-coded MS/MS fragmentation data of uncleaved and cleaved peptide crosslinks. Conclusion The combination of these three programs, which are tailored to the analytical features of the specific isotopically-coded cleavable crosslinking reagents used, represents a powerful software tool for automated high-accuracy peptide crosslink identification. See: http://www.creativemolecules.com/CM_Software.htm PMID:20109223
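The doublet-search step (the role DX plays) can be illustrated with a short sketch: scan a peak list for pairs whose m/z spacing matches the light/heavy crosslinker mass shift at some charge state. The mass shift, tolerance, and data layout below are assumptions, not the tool's actual parameters.

```python
# Sketch of isotopic-doublet detection: find peak pairs whose m/z spacing
# matches the light/heavy mass difference of the crosslinker divided by an
# assumed charge state, within a tolerance.
def find_doublets(peaks, delta_mass=8.05, max_charge=4, tol=0.01):
    """peaks: list of (mz, intensity); returns (light_mz, heavy_mz, z) tuples."""
    mzs = sorted(mz for mz, _ in peaks)
    hits = []
    for z in range(1, max_charge + 1):
        spacing = delta_mass / z          # isotope-coded shift per charge
        for i, lo in enumerate(mzs):
            for hi in mzs[i + 1:]:
                if hi - lo > spacing + tol:
                    break                 # sorted list: no later match possible
                if abs((hi - lo) - spacing) <= tol:
                    hits.append((lo, hi, z))
    return hits
```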
Assessment of Suited Reach Envelope in an Underwater Environment
NASA Technical Reports Server (NTRS)
Kim, Han; Benson, Elizabeth; Bernal, Yaritza; Jarvis, Sarah; Meginnis, Ian; Rajulu, Sudhakar
2017-01-01
Predicting the performance of a crewmember in an extravehicular activity (EVA) space suit presents unique challenges. The kinematic patterns of suited motions are difficult to reproduce in gravity. Additionally, 3-D suited kinematics have been practically and technically difficult to quantify in an underwater environment, in which crewmembers are commonly trained and assessed for performance. The goal of this study is to develop a hardware and software system to predictively evaluate the kinematic mobility of suited crewmembers, by measuring the 3-D reach envelope of the suit in an underwater environment. This work is ultimately aimed at developing quantitative metrics to compare the mobility of the existing Extravehicular Mobility Unit (EMU) to newly developed space suits, such as the Z-2. The EMU has been extensively used at NASA since 1981 for EVA outside the Space Shuttle and International Space Station. The Z-2 suit is NASA's newest prototype space suit. It comprises new upper-torso and lower-torso architectures, which were designed to improve test subject mobility.
Investigation of an advanced fault tolerant integrated avionics system
NASA Technical Reports Server (NTRS)
Dunn, W. R.; Cottrell, D.; Flanders, J.; Javornik, A.; Rusovick, M.
1986-01-01
Presented is an advanced, fault-tolerant multiprocessor avionics architecture as could be employed in an advanced rotorcraft such as LHX. The processor structure is designed to interface with existing digital avionics systems and concepts including the Army Digital Avionics System (ADAS) cockpit/display system, navaid and communications suites, integrated sensing suite, and the Advanced Digital Optical Control System (ADOCS). The report defines mission, maintenance and safety-of-flight reliability goals as might be expected for an operational LHX aircraft. Based on use of a modular, compact (16-bit) microprocessor card family, results of a preliminary study examining simplex, dual and standby-sparing architectures is presented. Given the stated constraints, it is shown that the dual architecture is best suited to meet reliability goals with minimum hardware and software overhead. The report presents hardware and software design considerations for realizing the architecture including redundancy management requirements and techniques as well as verification and validation needs and methods.
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
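The verification pattern described here, comparing a numerical solution against a closed-form analytical one, can be sketched in a few lines. The example below checks an explicit finite-difference solution of 1-D diffusion into a half-space against the erfc solution; it is a generic illustration under assumed parameters, not one of the PFLOTRAN benchmarks.

```python
import numpy as np
from math import erfc

# Analytical solution for diffusion into a half-space with a fixed
# concentration boundary: c/c0 = erfc(x / (2*sqrt(D*t))).
def analytical(x, t, D=1e-9):
    return np.array([erfc(xi / (2.0 * np.sqrt(D * t))) for xi in x])

def explicit_fd(nx=101, L=0.1, D=1e-9, t_end=1e5):
    x = np.linspace(0.0, L, nx); dx = x[1] - x[0]
    dt = 0.4 * dx**2 / D                  # satisfies the explicit stability limit
    c = np.zeros(nx); c[0] = 1.0          # fixed concentration at x = 0
    t = 0.0
    while t < t_end:
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2*c[1:-1] + c[:-2])
        c[0], t = 1.0, t + dt
    return x, c

x, c_num = explicit_fd()
err = np.max(np.abs(c_num - analytical(x, 1e5)))
print(f"max abs error vs analytical: {err:.2e}")  # pass if below a set tolerance
```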
MARTA: a suite of Java-based tools for assigning taxonomic status to DNA sequences.
Horton, Matthew; Bodenhausen, Natacha; Bergelson, Joy
2010-02-15
We have created a suite of Java-based software to better provide taxonomic assignments to DNA sequences. We anticipate that the program will be useful for protistologists, virologists, mycologists and other microbial ecologists. The program relies on NCBI utilities including the BLAST software and Taxonomy database and is easily manipulated at the command-line to specify a BLAST candidate's query-coverage or percent identity requirements; other options include the ability to set minimal consensus requirements (%) for each of the eight major taxonomic ranks (Domain, Kingdom, Phylum, ...) and whether to consider lower scoring candidates when the top-hit lacks taxonomic classification.
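The rank-wise consensus idea can be sketched as follows; the rank list, thresholds, and data structures are illustrative assumptions rather than MARTA's actual implementation.

```python
# Accept a taxon at each rank only if the fraction of BLAST-hit lineages
# agreeing on it meets that rank's minimal consensus requirement; stop at
# the first rank lacking consensus.
RANKS = ["domain", "kingdom", "phylum", "class", "order", "family", "genus", "species"]

def assign_taxonomy(hit_lineages, min_consensus):
    """hit_lineages: list of dicts rank -> taxon; min_consensus: rank -> fraction."""
    assignment = {}
    for rank in RANKS:
        taxa = [h[rank] for h in hit_lineages if h.get(rank)]
        if not taxa:
            break
        top = max(set(taxa), key=taxa.count)
        if taxa.count(top) / len(taxa) < min_consensus.get(rank, 0.5):
            break
        assignment[rank] = top
    return assignment
```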
Environmental Health Monitor: Advanced Development of Temperature Sensor Suite.
1995-07-30
systems was implemented using program code existing at Veritay. The software, written in Microsoft® QuickBASIC, facilitated program changes for... currently unforeseen reason re-calibration is needed, this can be readily accommodated by a straightforward change in the software program, without... unit. A linear relationship between these differences was obtained using curve-fitting software. The ½-inch globe to 6-inch globe correlation was
2015-09-30
originate from NASA, NOAA, and community modeling efforts, and support for creation of the suite was shared by sponsors from other agencies. ESPS... Framework (ESMF) Software and Application Development. Cecelia Deluca, NESII/CIRES/NOAA Earth System Research Laboratory, 325 Broadway, Boulder, CO... Capability (NUOPC) was established between NOAA and Navy to develop a common software architecture for easy and efficient interoperability. The
ERIC Educational Resources Information Center
Kendall, Leslie R.
2013-01-01
Individuals who have Asperger's Syndrome/High-Functioning Autism, as a group, are chronically underemployed and underutilized. Many in this group have abilities that are well suited for various roles within the practice of software development. Multiple studies have shown that certain organizational and management changes in the software…
NASA Astrophysics Data System (ADS)
Welker, J. M.; Sullivan, P.; Rogers, M.; Sharp, E. D.; Sletten, R.; Burnham, J. L.; Hallet, B.; Hagedorn, B.; Czimiczk, C.
2009-12-01
Greenland is experiencing some of the fastest rates of climate warming across the Arctic, including warmer summers and increases in snowfall. The effects of these new states of Greenland are, however, uncertain, especially for carbon, nitrogen, and water biogeochemical processes, soil traits, vegetation growth patterns, mineral nutrition, and plant ecophysiological processes. Since 2003 we have conducted a suite of observational and experimental measurements designed to understand the fundamental nature of polar desert, polar semi-desert, and fen landscapes in NW Greenland. In addition, we have established a suite of experiments to ascertain ecosystem responses to warming at multiple levels (simulating ~2030 and 2050 conditions), in conjunction with added summer rain; the consequences of added snowfall (ambient, intermediate, and deep); and the effects of increases in nutrient additions (added N, P, and N+P), which represent extreme warming conditions. We find that: a) the soil C pools are 6-fold larger than previously measured; b) extremely old C (up to ~30k bp) that has been buried by frost cracking and frost heaving is reaching the modern atmosphere, but only in trace amounts as measured by respired 14CO2; c) warming that simulates 2030 has only a small effect on net C sequestration, but warming that simulates 2050, when combined with added summer rain, increases C sequestration by 300%; d) increases in N deposition almost immediately and completely change the vegetation composition of polar semi-deserts, shifting NDVI values from 0.2 to 0.5 within 2 years. Our findings depict a system that is poised to contribute stronger feedbacks than previously expected as climates in NW Greenland change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Sandra Brown; Patrick Gonzalez
2004-07-10
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: remote sensing for carbon analysis; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
NASA Astrophysics Data System (ADS)
Elias, P. O.; Faderin, A.
2014-12-01
Urban trees are a component of the urban infrastructure which offers diverse services, including environmental, aesthetic, and economic ones. The accumulation of carbon in the atmosphere resulting from the indiscriminate distribution of human populations and urban activities, with the unsustainable consumption of natural resources, contributes to global environmental change, especially in coastal cities like Lagos. Carbon stocks and sequestration by urban trees are increasingly recognized to play significant roles in mitigating climate change. This paper focuses on the estimation of carbon stock and sequestration through biomass estimation and quantification in Ikeja GRA, Lagos. Ikeja possesses a characteristic feature as a microcosm of Lagos due to the wide range of land uses. A canopy assessment of the tree population was carried out using the i-Tree Canopy software. A GPS survey was used to collect an inventory of all trees showing their location, spatial distribution, and other attributes. The analysis of the carbon storage and sequestration potential of both actual and potential tree planting sites involved biomass estimations from tree allometry equations. Trees were identified at species level, and measurements of their dendrometric values were recorded and integrated into the GIS database to estimate biomass of trees and carbon storage. The trees in the study area were estimated to have a biomass of 441.9 Mg and carbon storage of 221.395 kg/tree. By considering the potential tree planting sites, the estimated carbon stored increased to 11,352.73 kg. Carbon sequestration value in the study area was found to be 1.6790 tonnes for the existing trees and 40.707 tonnes for the potential tree planting sites (PTPS). The estimation of carbon storage and sequestration values of trees is an important incentive for carbon accounting/footprints and monitoring of climate change mitigation, which has implications for evaluation and monitoring of urban ecosystems.
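For readers unfamiliar with allometric carbon estimation, a hedged sketch of the usual chain from stem measurements to biomass, carbon, and CO2 equivalent is shown below; the coefficients follow the widely cited Chave et al. (2014) pantropical fit and are stand-ins, not the equations used in this study.

```python
# Generic allometric chain: AGB = a * (rho * D^2 * H)^b, carbon as a fixed
# fraction of biomass, CO2 equivalent via the molar mass ratio 44/12.
def tree_carbon_kg(dbh_cm, height_m, wood_density=0.6, a=0.0673, b=0.976,
                   carbon_fraction=0.47):
    agb_kg = a * (wood_density * dbh_cm**2 * height_m) ** b  # above-ground biomass
    carbon_kg = carbon_fraction * agb_kg
    co2_kg = carbon_kg * 44.0 / 12.0
    return agb_kg, carbon_kg, co2_kg

print(tree_carbon_kg(dbh_cm=30, height_m=15))  # ~440 kg biomass for this tree
```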
Johnson, Michelle J; Feng, Xin; Johnson, Laura M; Winters, Jack M
2007-03-01
There is a need to improve semi-autonomous stroke therapy in home environments often characterized by low supervision of clinical experts and low extrinsic motivation. Our distributed device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation and outcome assessment is presented and evaluated. Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantify motor performance across subject groups and across device platforms and muscle activation across devices at two positions in the arm workspace. Trends in the assessment metrics were consistent across devices with able-bodied and high functioning strokes subjects being significantly more accurate and quicker in their motor performance than low functioning subjects. Muscle activation patterns were different for shoulder and elbow across different devices and locations. The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy need, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horowitz, Scott; Maguire, Jeff; Tabares-Velasco, Paulo Cesar
2016-08-01
This multiphase study involved comprehensive comparative testing of EnergyPlus and SEEM to determine the differences in energy consumption predictions between these two programs and to reconcile prioritized discrepancies through bug fixes, modeling improvements, and/or consistent inputs and assumptions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peles, Slaven
2016-11-06
GridKit is a software development kit for interfacing power systems and power grid application software with high-performance computing (HPC) libraries developed at national laboratories and in academia. It is also intended as an interoperability layer between different numerical libraries. GridKit is not a standalone application, but comes with a suite of test examples illustrating possible usage.
The open-source movement: an introduction for forestry professionals
Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove
2005-01-01
In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....
Improving Mathematics Learning of Kindergarten Students through Computer-Assisted Instruction
ERIC Educational Resources Information Center
Foster, Matthew E.; Anthony, Jason L.; Clements, Doug H.; Sarama, Julie; Williams, Jeffrey M.
2016-01-01
This study evaluated the effects of a mathematics software program, the Building Blocks software suite, on young children's mathematics performance. Participants included 247 Kindergartners from 37 classrooms in 9 schools located in low-income communities. Children within classrooms were randomly assigned to receive 21 weeks of computer-assisted…
The 2009 DOD Cost Research Workshop: Acquisition Reform
2010-02-01
ACEIT Enhancement, Help-Desk/Training, Consulting (DASA-CE-3); Command, Control, Communications, Computers, Intelligence, Surveillance, and... Management Information System (OSMIS) online interactive relational database (DASA-CE-2). Title: ACEIT Enhancement, Help-Desk/Training, Consulting. Summary: ...support and training for the Automated Cost Estimating Integrated Tools (ACEIT) software suite. ACEIT is the Army standard suite of analytical tools for...
Exoskeletons, Robots and System Software: Tools for the Warfighter
2012-04-24
Exoskeletons, Robots and System Software: Tools for the Warfighter? Paul Flanagan, Tuesday, April 24, 2012, 11:15 am-12:00 pm. "The views... Emerging technologies such as exoskeletons, robots, drones, and the underlying software are and will change the face of the battlefield. Warfighters will... global hub for educating, informing, and connecting Information Age leaders." What is an exoskeleton? An exoskeleton is a wearable robot suit that
Software Design Description for the Tidal Open-boundary Prediction System (TOPS)
2010-05-04
Naval Research Laboratory, Stennis Space Center, MS 39529-5004. Report NRL/MR/7320--10-9209. Approved for public release; distribution is unlimited.
Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite
Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.
2012-01-01
Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance in obtained protein expression data. For demonstrating the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347
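The core ratio-plus-outlier computation can be sketched generically: summarize a protein's peptide heavy/light ratios robustly and flag peptides that deviate strongly in log space. The statistic below (a MAD-based z-score) is a stand-in for whichever tests FindPairs actually implements.

```python
import numpy as np

def protein_ratio(peptide_ratios, k=3.0):
    """Robust protein ratio from peptide heavy/light ratios, with outlier flags."""
    logs = np.log2(np.asarray(peptide_ratios, dtype=float))
    center = np.median(logs)
    mad = np.median(np.abs(logs - center)) or 1e-12
    outliers = np.abs(logs - center) > k * 1.4826 * mad  # ~k-sigma equivalent
    ratio = 2.0 ** np.mean(logs[~outliers])
    return ratio, outliers

ratio, flags = protein_ratio([0.98, 1.05, 1.01, 3.9, 0.96])
print(ratio, flags)  # the 3.9 peptide is flagged as an outlier
```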
The GENIE Neutrino Monte Carlo Generator: Physics and User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreopoulos, Costas; Barry, Christopher; Dytman, Steve
2015-10-20
GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: It presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.
A Software Suite for Testing SpaceWire Devices and Networks
NASA Astrophysics Data System (ADS)
Mills, Stuart; Parkes, Steve
2015-09-01
SpaceWire is a data-handling network for use on-board spacecraft, which connects together instruments, mass-memory, processors, downlink telemetry, and other on-board sub-systems. SpaceWire is simple to implement and has some specific characteristics that help it support data-handling applications in space: high-speed, low-power, simplicity, relatively low implementation cost, and architectural flexibility making it ideal for many space missions. SpaceWire provides high-speed (2 Mbits/s to 200 Mbits/s), bi-directional, full-duplex data-links, which connect together SpaceWire enabled equipment. Data-handling networks can be built to suit particular applications using point-to-point data-links and routing switches. STAR-Dundee’s STAR-System software stack has been designed to meet the needs of engineers designing and developing SpaceWire networks and devices. This paper describes the aims of the software and how those needs were met.
Grid Stability Awareness System (GSAS) Final Scientific/Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feuerborn, Scott; Ma, Jian; Black, Clifton
The project team developed a software suite named Grid Stability Awareness System (GSAS) for power system near real-time stability monitoring and analysis based on synchrophasor measurement. The software suite consists of five analytical tools: an oscillation monitoring tool, a voltage stability monitoring tool, a transient instability monitoring tool, an angle difference monitoring tool, and an event detection tool. These tools have been integrated into one framework to provide power grid operators with real-time or near real-time stability status of a power grid and historical information about system stability status. These tools are being considered for real-time use in the operational environment.
Applicability of aquifer impact models to support decisions at CO2 sequestration sites
Keating, Elizabeth; Bacon, Diana; Carroll, Susan; ...
2016-07-25
The National Risk Assessment Partnership has developed a suite of tools to assess and manage risk at CO2 sequestration sites. This capability includes polynomial or look-up-table based reduced-order models (ROMs) that predict the impact of CO2 and brine leaks on overlying aquifers. The development of these computationally efficient models and the underlying reactive transport simulations they emulate has been documented elsewhere (Carroll et al., 2014a; Carroll et al., 2014b; Dai et al., 2014; Keating et al., 2016). Here we seek to demonstrate the applicability of ROM-based analysis by considering what types of decisions and aquifer types would benefit from the ROM analysis. We present four hypothetical examples where applying ROMs, in ensemble mode, could support decisions during a geologic CO2 sequestration project. These decisions pertain to site selection, site characterization, monitoring network evaluation, and health impacts. In all cases, we consider potential brine/CO2 leak rates at the base of the aquifer to be uncertain. We show that derived probabilities provide information relevant to the decision at hand. Although the ROMs were developed using site-specific data from two aquifers (High Plains and Edwards), the models accept aquifer characteristics as variable inputs and so they may have more broad applicability. We conclude that pH and TDS predictions are the most transferable to other aquifers based on the analysis of the nine water quality metrics (pH, TDS, 4 trace metals, 3 organic compounds). Guidelines are presented for determining the aquifer types for which the ROMs should be applicable.
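A toy version of the ROM-in-ensemble-mode workflow: fit a cheap polynomial surrogate to a few simulator runs, then push an uncertain leak-rate distribution through it to get an exceedance probability. All numbers and the quadratic form are invented for illustration; NRAP's ROMs carry many more inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
leak_rates = np.array([0.1, 0.5, 1.0, 2.0, 5.0])      # kg/s, training designs
d_ph = np.array([-0.02, -0.11, -0.24, -0.51, -1.30])  # simulated pH change

rom = np.poly1d(np.polyfit(leak_rates, d_ph, deg=2))  # cheap polynomial surrogate

samples = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)    # uncertain leak rate
samples = np.clip(samples, leak_rates.min(), leak_rates.max())  # stay in fit range
p_exceed = np.mean(rom(samples) < -0.5)               # P(pH drop exceeds 0.5 units)
print(f"P(delta pH < -0.5) = {p_exceed:.3f}")
```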
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
Proposing a Mathematical Software Tool in Physics Secondary Education
ERIC Educational Resources Information Center
Baltzis, Konstantinos B.
2009-01-01
MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to the educational process. The use of natural mathematical notation…
A UNIMARC Bibliographic Format Database for ABCD
ERIC Educational Resources Information Center
Megnigbeto, Eustache
2012-01-01
Purpose: ABCD is a web-based open and free software suite for library management derived from the UNESCO CDS/ISIS software technology. The first version was launched officially in December 2009 with a MARC 21 bibliographic format database. This paper aims to detail the building of the UNIMARC bibliographic format database for ABCD…
Journal and Wave Bearing Impedance Calculation Software
NASA Technical Reports Server (NTRS)
Hanford, Amanda; Campbell, Robert
2012-01-01
The wave bearing software suite is a MATLAB application that computes bearing properties for user-specified wave bearing conditions, as well as plain journal bearings. Wave bearings are fluid film journal bearings with multi-lobed wave patterns around the circumference of the bearing surface. In this software suite, the dynamic coefficients are outputted in a way for easy implementation in a finite element model used in rotor dynamics analysis. The software has a graphical user interface (GUI) for inputting bearing geometry parameters, and uses MATLAB's structure interface for ease of interpreting data. This innovation was developed to provide the stiffness and damping components of wave bearing impedances. The computational method for computing bearing coefficients was originally designed for plain journal bearings and tilting pad bearings. Modifications to include a wave bearing profile consisted of changing the film thickness profile given by an equation, and writing an algorithm to locate the integration limits for each fluid region. Careful consideration was needed to implement the correct integration limits while computing the dynamic coefficients, depending on the form of the input/output variables specified in the algorithm.
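The wave-bearing film-thickness profile mentioned above, a plain-journal term plus a multi-lobed wave, can be written down directly; the sketch below uses generic textbook symbols, not the tool's input names.

```python
import numpy as np

def film_thickness(theta, c, eps, phi, wave_amp, n_waves):
    """c: radial clearance, eps: eccentricity ratio, phi: attitude angle,
    wave_amp: wave amplitude / clearance, n_waves: number of lobes."""
    return c * (1.0 + eps * np.cos(theta - phi) + wave_amp * np.cos(n_waves * theta))

theta = np.linspace(0.0, 2.0 * np.pi, 361)
h = film_thickness(theta, c=50e-6, eps=0.4, phi=0.6, wave_amp=0.3, n_waves=3)
print(h.min(), h.max())  # film stays positive whenever eps + wave_amp < 1
```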
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandoval, D. M.; Strittmatter, R. B.; Abeyta, J. D.
2004-01-01
The initial objectives of this effort were to provide a hardware and software platform that can address the requirements for the accountability of classified removable electronic media and vault access logging. The Media Tracker system software assists classified media custodians in managing vault access logging and media tracking to prevent the inadvertent violation of rules or policies for access to a restricted area and the movement and use of tracked items. The MediaTracker system includes the software tools to track and account for high consequence security assets and high value items. The overall benefits include: (1) real-time access to the disposition of all Classified Removable Electronic Media (CREM), (2) streamlined security procedures and requirements, (3) removal of ambiguity and managerial inconsistencies, (4) prevention of incidents that can and should be prevented, (5) alignment with the DOE's initiative to achieve improvements in security and facility operations through technology deployment, and (6) enhanced individual responsibility by providing a consistent method of dealing with daily responsibilities. In response to initiatives to enhance the control of classified removable electronic media (CREM), the Media Tracker software suite was developed, piloted and implemented at the Los Alamos National Laboratory beginning in July 2000. The Media Tracker software suite assists in the accountability and tracking of CREM and other high-value assets. One component of the MediaTracker software suite provides a Laboratory-approved media tracking system. Using commercial touch screen and bar code technology, the MediaTracker (MT) component of the MediaTracker software suite provides an efficient and effective means to meet current Laboratory requirements and provides new-engineered controls to help assure compliance with those requirements. It also establishes a computer infrastructure at vault entrances for vault access logging, and can accommodate several methods of positive identification including smart cards and biometrics. Currently, we have three mechanisms that provide added security for accountability and tracking purposes. One mechanism consists of a portable, hand-held inventory scanner, which allows the custodian to physically track the items that are not accessible within a particular area. The second mechanism is a radio frequency identification (RFID) system consisting of a monitoring portal, which tracks and logs in a database all activity of tagged items that pass through the portals. The third mechanism consists of electronic tagging of a flash memory device for automated inventory of CREM in storage. By modifying this USB device the user is provided with added assurance, preventing the data from being obtained from any other computer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, John
2014-11-29
This project was a computer modeling effort to couple reservoir simulation and ED/RSM using Sensitivity Analysis, Uncertainty Analysis, and Optimization Methods, to assess geologic, geochemical, geomechanical, and rock-fluid effects and factors on CO2 injectivity, capacity, and plume migration. The project objective was to develop proxy models to simplify the highly complex coupled geochemical and geomechanical models in the utilization and storage of CO2 in the subsurface. The goals were to investigate and prove the feasibility of the ED/RSM processes and engineering development, and bridge the gaps regarding the uncertainty and unknowns of the many geochemical and geomechanical interacting parameters in the development and operation of anthropogenic CO2 sequestration and storage sites. The bottleneck in this workflow is the high computational effort of reactive transport simulation models and the large number of input variables to optimize with ED/RSM techniques. The project was not to develop the reactive transport, geomechanical, or ED/RSM software, but to use what was commercially and/or publicly available as a proof of concept to generate proxy or surrogate models. A detailed geologic and petrographic mineral assemblage and geologic structure of the doubly plunging anticline was defined using the USDOE RMOTC formations of interest data (e.g., Lower Sundance, Crow Mountain, Alcova Limestone, and Red Peak). The assemblage of 23 minerals was primarily developed from literature data and petrophysical (well log) analysis. The assemblage and structure were input into a commercial reactive transport simulator to predict the effects of CO2 injection and complex reactions with the reservoir rock. Significant impediments were encountered during the execution phase of the project. The only known commercial reactive transport simulator was incapable of simulating the complex geochemistry modeled in this project. Significant effort and project funding were expended to determine the limitations of both the commercial simulator and the Lawrence Berkeley National Laboratory (LBNL) R&D simulator, TOUGHREACT, available to the project. A simplified layer-cake model approximating the volume of the RMOTC targeted reservoirs was defined, with 1-3 minerals eventually modeled with limited success. Modeling reactive transport in porous media requires significant computational power. In this project, up to 24 processors were used to model a limited mineral set of 1-3 minerals. In addition, geomechanical aspects of injecting CO2 into closed, semi-open, and open systems with various well completion methods were simulated. Enhanced Oil Recovery (EOR) as a storage method was not modeled. A robust and stable simulation dataset, or base case, was developed and used to create a master dataset with embedded instructions for input to the ED/RSM software. Little success was achieved toward the objective of the project using the commercial simulator or the LBNL simulator versions available during the time of this project. Several hundred realizations were run with the commercial simulator and ED/RSM software, most having convergence problems and terminating prematurely. A proxy model for full-field CO2 injection sequestration utilization and storage was not capable of being developed with the software available for this project. Though the chemistry is reasonably known and understood, based on the amount of effort and huge computational time required, predicting CO2 sequestration storage capacity in geologic formations to within the program goals of ±30% proved unsuccessful.
2013-01-01
Chemical cross-linking of proteins combined with mass spectrometry provides an attractive and novel method for the analysis of native protein structures and protein complexes. Analysis of the data, however, is complex. Only a small number of cross-linked peptides are produced during sample preparation and must be identified against a background of more abundant native peptides. To facilitate the search and identification of cross-linked peptides, we have developed a novel software suite, named Hekate. Hekate is a suite of tools that address the challenges involved in analyzing protein cross-linking experiments when combined with mass spectrometry. The software is an integrated pipeline for the automation of the data analysis workflow and provides a novel scoring system based on principles of linear peptide analysis. In addition, it provides a tool for the visualization of identified cross-links using three-dimensional models, which is particularly useful when combining chemical cross-linking with other structural techniques. Hekate was validated by the comparative analysis of cytochrome c (bovine heart) against previously reported data [1]. Further validation was carried out on known structural elements of DNA polymerase III, the catalytic α-subunit of the Escherichia coli DNA replisome, along with new insight into the previously uncharacterized C-terminal domain of the protein. PMID:24010795
Razick, Sabry; Močnik, Rok; Thomas, Laurent F.; Ryeng, Einar; Drabløs, Finn; Sætrom, Pål
2014-01-01
Systematic data management and controlled data sharing aim at increasing reproducibility, reducing redundancy in work, and providing a way to efficiently locate complementing or contradicting information. One method of achieving this is collecting data in a central repository or in a location that is part of a federated system and providing interfaces to the data. However, certain data, such as data from biobanks or clinical studies, may, for legal and privacy reasons, often not be stored in public repositories. Instead, we describe a metadata cataloguing system and a software suite for reporting the presence of data from the life sciences domain. The system stores three types of metadata: file information, file provenance and data lineage, and content descriptions. Our software suite includes both graphical and command line interfaces that allow users to report and tag files with these different metadata types. Importantly, the files remain in their original locations with their existing access-control mechanisms in place, while our system provides descriptions of their contents and relationships. Our system and software suite thereby provide a common framework for cataloguing and sharing both public and private data. Database URL: http://bigr.medisin.ntnu.no/data/eGenVar/ PMID:24682735
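A minimal sketch of a record carrying the three metadata types described above (file information, provenance/lineage, content description) while leaving the file in place; the schema and field names are invented for illustration, not eGenVar's actual model.

```python
import hashlib
import time
from dataclasses import dataclass, field

@dataclass
class FileRecord:
    path: str                    # the file stays in place; we only describe it
    sha256: str                  # file information: content fingerprint
    description: str             # content description
    derived_from: list = field(default_factory=list)  # provenance / lineage
    registered: float = field(default_factory=time.time)

def register(path, description, derived_from=()):
    """Build a catalogue record for an existing file without moving it."""
    digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
    return FileRecord(path, digest, description, list(derived_from))
```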
An advanced software suite for the processing and analysis of silicon luminescence images
NASA Astrophysics Data System (ADS)
Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.
2017-06-01
Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly in the field of photovoltaics, where photoluminescence and electroluminescence imaging is routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data to more meaningful results. In this work, a custom-built, MATLAB-based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to, point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration and iron impurity concentration mapping.
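As an example of the kind of processing step such a suite automates, the sketch below implements plain Richardson-Lucy PSF deconvolution with an assumed Gaussian PSF; it is a generic algorithm sketch, not the authors' implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30):
    """Iterative Richardson-Lucy deconvolution with a known PSF."""
    psf = psf / psf.sum()
    psf_flip = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)   # avoid division by zero
        estimate *= fftconvolve(ratio, psf_flip, mode="same")
    return estimate

# Synthetic demo: blur an impulse, then recover it.
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2.0 * 2.0**2))        # sigma = 2 px Gaussian PSF
truth = np.zeros((64, 64)); truth[32, 32] = 1.0
blurred = fftconvolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf, n_iter=50)
```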
Johnson, Michelle J; Feng, Xin; Johnson, Laura M; Winters, Jack M
2007-01-01
Background There is a need to improve semi-autonomous stroke therapy in home environments often characterized by low supervision of clinical experts and low extrinsic motivation. Our distributed device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation, and assessing outcomes is presented and evaluated. Methods Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantify motor performance across subject groups and across device platforms, and muscle activation across devices at two positions in the arm workspace. Results Trends in the assessment metrics were consistent across devices, with able-bodied and high-functioning stroke subjects being significantly more accurate and quicker in their motor performance than low-functioning subjects. Muscle activation patterns were different for shoulder and elbow across different devices and locations. Conclusion The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy needs, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home. PMID:17331243
NASA Technical Reports Server (NTRS)
Ross, Amy
2011-01-01
A NASA spacesuit under the EVA Technology Domain consists of a suit system; a portable life support system (PLSS); and a Power, Avionics, and Software (PAS) system. Ross described the basic functions, components, and interfaces of the PLSS, which consists of oxygen, ventilation, and thermal control subsystems; electronics; and interfaces. Design challenges were reviewed from a packaging perspective. Ross also discussed the development of the PLSS over the last two decades.
UFMulti: A new parallel processing software system for HEP
NASA Astrophysics Data System (ADS)
Avery, Paul; White, Andrew
1989-12-01
UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction, and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines, with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future.
The GenABEL Project for statistical genomics.
Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S
2016-01-01
Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.
Computerized Placement Management Software (CPMS): User Manual, Version 3.0.
ERIC Educational Resources Information Center
College Entrance Examination Board, Princeton, NJ.
This guide is designed to enable the beginner, as well as the advanced user, to understand and use the Computerized Placement Management Software (CPMS). The CPMS is a system for evaluating information about students and recommending their placement into courses best suited for them. It also tracks their progress and maintains their records. The…
Arduino-Based Data Acquisition into Excel, LabVIEW, and MATLAB
ERIC Educational Resources Information Center
Nichols, Daniel
2017-01-01
Data acquisition equipment for physics can be quite expensive. As an alternative, data can be acquired using a low-cost Arduino microcontroller. The Arduino has been used in physics labs where the data are acquired using the Arduino software. The Arduino software, however, does not contain a suite of tools for data fitting and analysis. The data…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Patrick Gonzalez; Sandra Brown
2005-10-01
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is "Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration". The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1, 2005 and June 30, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)
NASA Technical Reports Server (NTRS)
Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.
2003-01-01
A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.
Design Optimization Toolkit: Users' Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro
The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods that are suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
The Effects of Development Team Skill on Software Product Quality
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.
2006-01-01
This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
Lewis, Sheri L.; Feighner, Brian H.; Loschen, Wayne A.; Wojcik, Richard A.; Skora, Joseph F.; Coberly, Jacqueline S.; Blazes, David L.
2011-01-01
Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations. PMID:21572957
Vernick, Kenneth D.
2017-01-01
Metavisitor is a software package that allows biologists and clinicians without specialized bioinformatics expertise to detect and assemble viral genomes from deep sequence datasets. The package is composed of a set of modular bioinformatic tools and workflows that are implemented in the Galaxy framework. Using the graphical Galaxy workflow editor, users with minimal computational skills can use existing Metavisitor workflows or adapt them to suit specific needs by adding or modifying analysis modules. Metavisitor works with DNA, RNA or small RNA sequencing data over a range of read lengths and can use a combination of de novo and guided approaches to assemble genomes from sequencing reads. We show that the software has the potential for quick diagnosis as well as discovery of viruses from a vast array of organisms. Importantly, we provide here executable Metavisitor use cases, which increase the accessibility and transparency of the software, ultimately enabling biologists or clinicians to focus on biological or medical questions. PMID:28045932
An experimental microcomputer controlled system for synchronized pulsating anti-gravity suit.
Moore, T W; Foley, J; Reddy, B R; Kepics, F; Jaron, D
1987-07-01
An experimental system to deliver synchronized external pressure pulsations to the lower body is described in this technical note. The system is designed using a microcomputer with a real time interface and an electro-pneumatic subsystem capable of delivering pressure pulses to a modified anti-G suit at a fast rate. It is versatile, containing many options for synchronizing, phasing and sequencing of the pressure pulsations and controlling the pressure level in the suit bladders. Details of its software and hardware are described along with the results of initial testing in a Dynamic Flight Simulator on human volunteers.
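As a hedged sketch of the synchronization logic such a system needs, the snippet below derives each pressure-pulse window from successive R-wave times, delaying the pulse by a fixed fraction of the current cardiac period. The phase fraction and pulse width are invented for illustration; the paper's actual timing parameters are not given here.

```python
# Illustrative timing calculation for synchronizing suit-bladder pressure
# pulses to the cardiac cycle; parameters are hypothetical, not from the paper.
def pulse_schedule(r_wave_times_s, phase_fraction=0.35, pulse_width_s=0.20):
    """Return (onset, release) times: each pulse starts a fixed fraction
    of the current R-R interval after the preceding R wave."""
    schedule = []
    for t0, t1 in zip(r_wave_times_s, r_wave_times_s[1:]):
        rr = t1 - t0                       # current cardiac period
        onset = t0 + phase_fraction * rr   # delay the pulse into the cycle
        schedule.append((onset, onset + pulse_width_s))
    return schedule

print(pulse_schedule([0.0, 0.8, 1.6, 2.5]))
```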
NASA Astrophysics Data System (ADS)
Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.
2015-12-01
The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application programming interface (API), which will allow other organizations to build their own custom applications and tools. New features such as finer scale aggregation and an online carbon calculator are being added to the LandCarbon web application to continue to make the site interactive, visually compelling, and useful for a wide range of users.
Technical Performance Assessment: Mission Success in Software Acquisition Management
2010-04-27
Design constraints make software acquisition and development extremely critical. Examples: application domain – Operational Flight Program, Air...; environment – used to produce the software; risk management – established and maintained risk management systems; milestone reviews...
Software Quality Assurance and Controls Standard
2010-04-27
Software Quality Assurance and Controls Standard. Sue Carroll, Principal Software Quality Analyst, SAS; John Walz, VP Technology and... • What is in a Software Life Cycle (SLC) process? • What is in a SQA Process? • Where are SQA Controls? • What is the SQA standards history? • What is changing in SQA?
Hackley, Paul C.; Kolak, Jonathan J.
2008-01-01
This report presents vitrinite reflectance and detailed organic composition data for nine high volatile bituminous coal samples. These samples were selected to provide a single, internally consistent set of reflectance and composition analyses to facilitate the study of linkages among coal composition, bitumen generation during thermal maturation, and geochemical characteristics of generated hydrocarbons. Understanding these linkages is important for addressing several issues, including: the role of coal as a source rock within a petroleum system, the potential for conversion of coal resources to liquid hydrocarbon fuels, and the interactions between coal and carbon dioxide during enhanced coalbed methane recovery and(or) carbon dioxide sequestration in coal beds.
Advanced Computational Models for Fabric-Reinforced Composites
2001-10-01
Trans-Science Corporation, 3655 Nobel Drive, Suite 440, San Diego, CA 92122-1005. Tel (858) 459-1240, Fax (858) 459-0210. http://www.compositesolutionsinc.com
Benchmarking hypercube hardware and software
NASA Technical Reports Server (NTRS)
Grunwald, Dirk C.; Reed, Daniel A.
1986-01-01
It has long been a truism in computer systems design that balanced systems achieve the best performance. Message passing parallel processors are no different. To quantify the balance of a hypercube design, an experimental methodology was developed and the associated suite of benchmarks was applied to several existing hypercubes. The benchmark suite includes tests of both processor speed in the absence of internode communication and message transmission speed as a function of communication patterns.
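A minimal ping-pong test in the spirit of the message-transmission benchmarks described above can be written today with mpi4py; this is a modern stand-in for illustration, not the original benchmark code.

```python
# Ping-pong message-latency benchmark between two ranks.
# Run with e.g.: mpiexec -n 2 python pingpong.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
reps = 1000

for nbytes in (8, 1024, 65536):        # vary message size
    buf = np.zeros(nbytes, dtype=np.uint8)
    comm.Barrier()                     # synchronize before timing
    t0 = MPI.Wtime()
    for _ in range(reps):
        if rank == 0:
            comm.Send(buf, dest=1)
            comm.Recv(buf, source=1)
        elif rank == 1:
            comm.Recv(buf, source=0)
            comm.Send(buf, dest=0)
    if rank == 0:
        latency = (MPI.Wtime() - t0) / (2 * reps)   # one-way time per message
        print(f"{nbytes} B: {latency * 1e6:.1f} us/message")
```

Sweeping the message size, and varying which node pairs communicate, is how transmission speed can be characterized as a function of communication pattern.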
BuddySuite: Command-Line Toolkits for Manipulating Sequences, Alignments, and Phylogenetic Trees.
Bond, Stephen R; Keat, Karl E; Barreira, Sofia N; Baxevanis, Andreas D
2017-06-01
The ability to manipulate sequence, alignment, and phylogenetic tree files has become an increasingly important skill in the life sciences, whether to generate summary information or to prepare data for further downstream analysis. The command line can be an extremely powerful environment for interacting with these resources, but only if the user has the appropriate general-purpose tools on hand. BuddySuite is a collection of four independent yet interrelated command-line toolkits that facilitate each step in the workflow of sequence discovery, curation, alignment, and phylogenetic reconstruction. Most common sequence, alignment, and tree file formats are automatically detected and parsed, and over 100 tools have been implemented for manipulating these data. The project has been engineered to easily accommodate the addition of new tools, is written in the popular programming language Python, and is hosted on the Python Package Index and GitHub to maximize accessibility. Documentation for each BuddySuite tool, including usage examples, is available at http://tiny.cc/buddysuite_wiki. All software is open source and freely available through http://research.nhgri.nih.gov/software/BuddySuite. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution 2017. This work is written by US Government employees and is in the public domain in the US.
Diagnostic and Prognostic Models for Generator Step-Up Transformers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vivek Agarwal; Nancy J. Lybeck; Binh T. Pham
In 2014, the online monitoring (OLM) of active components project under the Light Water Reactor Sustainability program at Idaho National Laboratory (INL) focused on diagnostic and prognostic capabilities for generator step-up (GSU) transformers. INL worked with subject matter experts from the Electric Power Research Institute (EPRI) to augment and revise the GSU fault signatures previously implemented in EPRI's Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. Two prognostic models were identified and implemented for GSUs in the FW-PHM Suite software. INL and EPRI demonstrated the use of prognostic capabilities for GSUs. The complete set of fault signatures developed for GSUs in the Asset Fault Signature Database of the FW-PHM Suite is presented in this report. Two prognostic models are described for paper insulation: the Chendong model for degree of polymerization, and an IEEE model that uses a loading profile to calculate life consumption based on hot-spot winding temperatures. Both models are life consumption models, which are examples of type II prognostic models. Use of the models in the FW-PHM Suite was successfully demonstrated at the 2014 August Utility Working Group Meeting, Idaho Falls, Idaho, to representatives from different utilities, EPRI, and the Halden Research Project.
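For orientation, the two insulation models are compact enough to state directly. The sketch below uses the commonly published forms: the Chendong correlation between 2-furfural (2-FAL) concentration and degree of polymerization, and the IEEE C57.91 aging acceleration factor for 65 °C-rise insulation (reference hot-spot temperature 110 °C). It illustrates the model equations only, not the FW-PHM Suite implementation.

```python
import math

def chendong_dp(furfural_ppm):
    """Chendong correlation as commonly published:
    log10(2-FAL, ppm) = 1.51 - 0.0035 * DP, solved for DP."""
    return (1.51 - math.log10(furfural_ppm)) / 0.0035

def ieee_aging_factor(hot_spot_c):
    """IEEE C57.91 aging acceleration factor for 65 C-rise insulation
    (unity at the 110 C reference hot-spot temperature)."""
    return math.exp(15000.0 / 383.0 - 15000.0 / (hot_spot_c + 273.0))

def life_consumed_hours(load_profile):
    """Sum equivalent aging over (hot_spot_c, duration_h) intervals."""
    return sum(ieee_aging_factor(temp) * dt for temp, dt in load_profile)

print(round(chendong_dp(0.5)))                    # DP estimate from a 2-FAL reading
print(life_consumed_hours([(98, 20), (116, 4)]))  # equivalent hours of life consumed
```

Both are life consumption (type II) models in the sense used above: they map a measured or assumed operating history onto accumulated insulation aging.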
Sekiguchi, Yuki; Oroguchi, Tomotaka; Takayama, Yuki; Nakasako, Masayoshi
2014-05-01
Coherent X-ray diffraction imaging is a promising technique for visualizing the structures of non-crystalline particles with dimensions of micrometers to sub-micrometers. Recently, X-ray free-electron laser sources have enabled efficient experiments in the `diffraction before destruction' scheme. Diffraction experiments have been conducted at SPring-8 Angstrom Compact free-electron LAser (SACLA) using the custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors. In the experiments, tens of thousands of single-shot diffraction patterns can be collected within several hours. Then, diffraction patterns with significant levels of intensity suitable for structural analysis must be found, direct-beam positions in diffraction patterns determined, diffraction patterns from the two CCD detectors merged, and phase-retrieval calculations for structural analyses performed. A software suite named SITENNO has been developed to semi-automatically apply the four-step processing to a huge number of diffraction data. Here, details of the algorithm used in the suite are described and the performance for approximately 9000 diffraction patterns collected from cuboid-shaped copper oxide particles reported. Using the SITENNO suite, it is possible to conduct experiments with data processing immediately after the data collection, and to characterize the size distribution and internal structures of the non-crystalline particles.
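The fourth processing step, phase retrieval, is conceptually compact. The NumPy sketch below shows the classic error-reduction iteration (alternating Fourier-modulus and real-space support constraints) as an illustration of the generic technique, not the SITENNO algorithm itself.

```python
import numpy as np

def error_reduction(fourier_magnitude, support, n_iter=200, seed=0):
    """Classic error-reduction phase retrieval: enforce the measured
    Fourier modulus, then the real-space support, and repeat."""
    rng = np.random.default_rng(seed)
    density = rng.random(fourier_magnitude.shape) * support  # random start
    for _ in range(n_iter):
        F = np.fft.fft2(density)
        F = fourier_magnitude * np.exp(1j * np.angle(F))  # modulus constraint
        density = np.fft.ifft2(F).real
        density *= support                                # support constraint
        density[density < 0] = 0                          # positivity
    return density
```

In practice, production pipelines use more robust variants (e.g., hybrid input-output) and average many reconstructions, but the constraint-projection structure is the same.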
Sequestration Options for the West Coast States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myer, Larry
The West Coast Regional Carbon Sequestration Partnership (WESTCARB) is one of seven partnerships that have been established by the U.S. Department of Energy (DOE) to evaluate carbon capture and sequestration (CCS) technologies best suited for different regions of the country. The West Coast Region comprises Arizona, California, Nevada, Oregon, Washington, Alaska, and British Columbia. Led by the California Energy Commission, WESTCARB is a consortium of about 70 organizations, including state natural resource and environmental protection agencies; national laboratories and universities; private companies working on carbon dioxide (CO2) capture, transportation, and storage technologies; utilities; oil and gas companies; nonprofit organizations; and policy/governance coordinating organizations. Both terrestrial and geologic sequestration options were evaluated in the Region during the 18-month Phase I project. A centralized Geographic Information System (GIS) database of stationary source, geologic, and terrestrial sink data was developed. The GIS layer of source locations was attributed with CO2 emissions and other data, and a spreadsheet was developed to estimate capture costs for the sources in the region. Phase I characterization of regional geological sinks shows that geologic storage opportunities exist in the WESTCARB region in each of the major technology areas: saline formations, oil and gas reservoirs, and coal beds. California offers outstanding sequestration opportunities because of its large capacity and the potential of value-added benefits from enhanced oil recovery (EOR) and enhanced gas recovery. The estimate for storage capacity of saline formations in the ten largest basins in California ranges from about 150 to about 500 Gt of CO2, the potential CO2-EOR storage was estimated to be 3.4 Gt, and the cumulative production from gas reservoirs suggests a CO2 storage capacity of 1.7 Gt. A GIS-based method for source-sink matching was implemented and preliminary marginal cost curves developed, which showed that 20, 40, or 80 megatonnes (Mt) of CO2 per year could be sequestered in California at a cost of $31/tonne (t), $35/t, or $50/t, respectively. Phase I also addressed key issues affecting deployment of CCS technologies, including storage-site monitoring, injection regulations, and health and environmental risks. A framework for screening and ranking candidate sites for geologic CO2 storage on the basis of HSE risk was developed. A web-based, state-by-state compilation of current regulations for injection wells, and permits/contracts for land use changes, was developed, and modeling studies were carried out to assess the application of a number of different geophysical techniques for monitoring geologic sequestration. Public outreach activities resulted in heightened awareness of sequestration among state, community, and industry leaders in the Region. Assessment of the changes in carbon stocks in agricultural lands showed that Washington, Oregon, and Arizona were CO2 sources for the period from 1987 to 1997. Over the same period, forest carbon stocks decreased in Washington, but increased in Oregon and Arizona. Results of the terrestrial supply curve analyses showed that afforestation of rangelands and croplands offers major sequestration opportunities; at a price of $20 per t CO2, more than 1,233 MMT could be sequestered over 40 years in Washington and more than 1,813 MMT could be sequestered in Oregon.
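To make the marginal-cost-curve idea concrete, the toy sketch below sorts candidate sources by per-tonne cost and reads off the price needed to reach a target annual tonnage. The costs and capacities are invented for illustration, not WESTCARB data.

```python
import numpy as np

# Toy marginal cost curve from per-source capture + transport + storage costs.
costs_usd_per_t = np.array([28, 31, 33, 35, 41, 50, 62])  # $/t CO2 per source
capacity_mt_yr  = np.array([ 5, 15,  8, 12, 20, 25, 30])  # Mt CO2/yr per source

order = np.argsort(costs_usd_per_t)        # dispatch cheapest sources first
cum_mt = np.cumsum(capacity_mt_yr[order])  # cumulative annual capacity

def marginal_cost(target_mt_per_yr):
    """Cost of the most expensive source needed to reach the target."""
    idx = min(np.searchsorted(cum_mt, target_mt_per_yr), len(order) - 1)
    return costs_usd_per_t[order][idx]

for target in (20, 40, 80):
    print(target, "Mt/yr ->", marginal_cost(target), "$/t")
```

The actual Phase I curves were built the same way in principle, but from GIS-matched source-sink pairs with modeled capture and transport costs.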
ERIC Educational Resources Information Center
Weston, Mark E.; Bain, Alan
2015-01-01
This study reports findings from a matched-comparison, repeated-measure for intact groups design of the mediating effect of a suite of software on the quality of classroom instruction provided to students by teachers. The quality of instruction provided by teachers in the treatment and control groups was documented via observations that were…
Diagnostics Tools Identify Faults Prior to Failure
NASA Technical Reports Server (NTRS)
2013-01-01
Through the SBIR program, Rochester, New York-based Impact Technologies LLC collaborated with Ames Research Center to commercialize the Center's Hybrid Diagnostic Engine, or HyDE, software. The fault detecting program is now incorporated into a software suite that identifies potential faults early in the design phase of systems ranging from printers to vehicles and robots, saving time and money.
NASA Astrophysics Data System (ADS)
Möller, Thomas; Bellin, Knut; Creutzburg, Reiner
2015-03-01
The aim of this paper is to show the recent progress in the design and prototypical development of a software suite Copra Breeder* for semi-automatic generation of test methodologies and security checklists for IT vulnerability assessment in small and medium-sized enterprises.
OASYS (OrAnge SYnchrotron Suite): an open-source graphical environment for x-ray virtual experiments
NASA Astrophysics Data System (ADS)
Rebuffi, Luca; Sanchez del Rio, Manuel
2017-08-01
The evolution of hardware platforms, the modernization of software tools, the access to the codes of a large number of young people, and the popularization of open source software for scientific applications drove us to design OASYS (OrAnge SYnchrotron Suite), a completely new graphical environment for modelling X-ray experiments. The implemented software architecture provides not only an intuitive and very easy-to-use graphical interface, but also high flexibility and rapidity for interactive simulations, making configuration changes quick so that multiple beamline configurations can be compared. Its purpose is to integrate in a synergetic way the most powerful calculation engines available. OASYS integrates different simulation strategies via the implementation of adequate simulation tools for X-ray optics (e.g. ray tracing and wave optics packages). It provides a language to make them communicate by sending and receiving encapsulated data. Python has been chosen as the main programming language because of its universality and popularity in scientific computing. The software Orange, developed at the University of Ljubljana (SLO), is the high-level workflow engine that provides the interaction with the user and the communication mechanisms.
Multicriteria analysis of ontologically represented information
NASA Astrophysics Data System (ADS)
Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.
2014-11-01
Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user in selecting the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriterial decision making, and the key open question is: which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus multicriterial analysis) is to be realized when all information concerning the problem, the hardware, and the software is ontologically represented. Initially, we considered the Analytical Hierarchy Process (AHP), which is well suited to hierarchical data structures (e.g., those formulated in terms of ontologies). However, due to its well-known shortcomings, we decided to extend our search for the multicriterial analysis method best suited for the problem in question. In this paper we report the results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods have not been considered as valuable candidates.
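Of the candidates named, TOPSIS is the most compact to state. The sketch below is the standard textbook formulation (vector normalization, weighted ideal and anti-ideal points, closeness coefficient), with an invented three-alternative example; it is not the AiG implementation.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Textbook TOPSIS: rows = alternatives, columns = criteria.
    benefit[j] is True when larger values of criterion j are better."""
    m = matrix / np.linalg.norm(matrix, axis=0)    # vector normalization
    v = m * weights                                # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)      # distance to ideal point
    d_neg = np.linalg.norm(v - anti, axis=1)       # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                 # closeness coefficient

# Three hypothetical resources scored on speed, cost, reliability
# (cost is a "cost" criterion: smaller is better).
scores = topsis(np.array([[90., 5., 0.90],
                          [70., 2., 0.95],
                          [80., 3., 0.85]]),
                np.array([0.5, 0.3, 0.2]),
                np.array([True, False, True]))
print(scores.argmax(), scores)   # index of the best alternative and all scores
```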
SSAGES: Software Suite for Advanced General Ensemble Simulations.
Sidky, Hythem; Colón, Yamil J; Helfferich, Julian; Sikora, Benjamin J; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S; Reid, Daniel R; Sevgen, Emre; Thapar, Vikram; Webb, Michael A; Whitmer, Jonathan K; de Pablo, Juan J
2018-01-28
Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques-including adaptive biasing force, string methods, and forward flux sampling-that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
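To illustrate the idea behind one of the named methods, the toy below runs adaptive biasing force on a 1-D double-well potential with overdamped Langevin dynamics: the running mean force in each bin is cancelled by an opposing bias, flattening the landscape so the system diffuses across the barrier. It is a conceptual sketch only, unrelated to the SSAGES code base or its API.

```python
import numpy as np

rng = np.random.default_rng(1)
nbins, kT, dt, gamma = 60, 1.0, 1e-3, 1.0
edges = np.linspace(-2, 2, nbins + 1)
force_sum = np.zeros(nbins)   # accumulated instantaneous force per bin
counts = np.zeros(nbins)

def force(x):                 # -dU/dx for the double well U(x) = (x^2 - 1)^2
    return -4 * x * (x**2 - 1)

x = -1.0
for step in range(100000):
    b = int(np.clip(np.searchsorted(edges, x) - 1, 0, nbins - 1))
    force_sum[b] += force(x)
    counts[b] += 1
    bias = -force_sum[b] / max(counts[b], 1.0)  # cancel the running mean force
    noise = np.sqrt(2 * kT * gamma / dt) * rng.standard_normal()
    x += dt / gamma * (force(x) + bias + noise)  # overdamped Langevin step

# Free energy profile from the accumulated mean force: A(x) = -integral <F> dx
mean_force = force_sum / np.maximum(counts, 1)
free_energy = -np.cumsum(mean_force * np.diff(edges))
print(free_energy[::10].round(2))
```

In SSAGES the same bookkeeping is done on collective variables of real MD trajectories, with the bias fed back into the underlying simulation engine.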
dfnWorks: A discrete fracture network framework for modeling subsurface flow and transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyman, Jeffrey D.; Karra, Satish; Makedonska, Nataliia
DFNWORKS is a parallelized computational suite to generate three-dimensional discrete fracture networks (DFN) and simulate flow and transport. Developed at Los Alamos National Laboratory over the past five years, it has been used to study flow and transport in fractured media at scales ranging from millimeters to kilometers. The networks are created and meshed using DFNGEN, which combines FRAM (the feature rejection algorithm for meshing) methodology to stochastically generate three-dimensional DFNs with the LaGriT meshing toolbox to create a high-quality computational mesh representation. The representation produces a conforming Delaunay triangulation suitable for high performance computing finite volume solvers in an intrinsically parallel fashion. Flow through the network is simulated in dfnFlow, which utilizes the massively parallel subsurface flow and reactive transport finite volume code PFLOTRAN. A Lagrangian approach to simulating transport through the DFN is adopted within DFNTRANS to determine pathlines and solute transport through the DFN. Example applications of this suite in the areas of nuclear waste repository science, hydraulic fracturing, and CO2 sequestration are also included.
Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L
2018-01-01
Background Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.” Objective We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Methods Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. Results We used test–driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the “executable requirements” are shown prior to building the CDS alert, during build, and after successful build. Conclusions Automated acceptance test–driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test–driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. PMID:29653922
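The FitNesse tables themselves are wiki-format artifacts and are not reproduced here; as a hedged analogue of the same table-driven pattern, the pytest sketch below encodes expected advisory behavior as rows against a hypothetical stand-in for the CDS rule. The rule function and the rows are illustrative, not the paper's actual configuration.

```python
import pytest

def swallow_screen_advisory(department, suspected_stroke, med_route, screen_done):
    """Hypothetical CDS rule: advise a swallowing assessment before giving
    oral medication to a suspected-stroke patient in the ED."""
    return (department == "ED" and suspected_stroke
            and med_route == "oral" and not screen_done)

# Each row is one acceptance-test case, mirroring a decision-table row.
@pytest.mark.parametrize("dept,stroke,route,screened,expected", [
    ("ED",  True, "oral", False, True),    # should fire
    ("ED",  True, "oral", True,  False),   # already screened
    ("ED",  True, "IV",   False, False),   # non-oral route
    ("ICU", True, "oral", False, False),   # outside the applicable setting
])
def test_advisory_rule(dept, stroke, route, screened, expected):
    assert swallow_screen_advisory(dept, stroke, route, screened) == expected
```

The key property carries over directly: the same table serves as requirements before the build, acceptance test on completion, and regression test once live.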
Practical Issues in Implementing Software Reliability Measurement
NASA Technical Reports Server (NTRS)
Nikora, Allen P.; Schneidewind, Norman F.; Everett, William W.; Munson, John C.; Vouk, Mladen A.; Musa, John D.
1999-01-01
Many ways of estimating software systems' reliability, or reliability-related quantities, have been developed over the past several years. Of particular interest are methods that can be used to estimate a software system's fault content prior to test, or to discriminate between components that are fault-prone and those that are not. The results of these methods can be used to: 1) More accurately focus scarce fault identification resources on those portions of a software system most in need of it. 2) Estimate and forecast the risk of exposure to residual faults in a software system during operation, and develop risk and safety criteria to guide the release of a software system to fielded use. 3) Estimate the efficiency of test suites in detecting residual faults. 4) Estimate the stability of the software maintenance process.
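As one concrete instance of the estimation methods surveyed, the sketch below fits the classic Goel-Okumoto reliability-growth model m(t) = a(1 - e^(-bt)) to cumulative fault counts and reports the expected residual fault content; the data points are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.array([1, 2, 4, 8, 16, 32, 64.0])          # test effort (e.g., weeks)
faults = np.array([9, 15, 24, 34, 44, 52, 56.0])  # cumulative faults found

def goel_okumoto(t, a, b):
    """Mean value function: a = total fault content, b = detection rate."""
    return a * (1 - np.exp(-b * t))

(a, b), _ = curve_fit(goel_okumoto, t, faults, p0=(60, 0.05))
residual = a - goel_okumoto(t[-1], a, b)          # faults expected to remain
print(f"total fault content ~{a:.1f}, expected residual ~{residual:.1f}")
```

An estimate of residual fault content of this kind is exactly what feeds the release-risk and test-efficiency uses listed above.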
NASA Technical Reports Server (NTRS)
Kuznetz, Lawrence; Nguen, Dan; Jones, Jeffrey; Lee, Pascal; Merrell, Ronald; Rafiq, Azhar
2008-01-01
Initial planetary explorations with the Apollo program had a veritable ground support army monitoring the safety and health of the 12 astronauts who performed lunar surface extravehicular activities (EVAs). Given the distances involved, this will not be possible on Mars. A spacesuit for Mars must be smart enough to replace that army. The next generation suits can do so using 2 software systems serving as virtual companions, LEGACI (Life support, Exploration Guidance Algorithm and Consumable Interrogator) and VIOLET (Voice Initiated Operator for Life support and Exploration Tracking). The system presented in this study integrates data inputs from a suite of sensors into the MIII suit's communications, avionics, and informatics hardware for distribution to remote managers and data analysis. If successful, the system has application not only for Mars but for nearer term missions to the Moon, and the next generation suits used on ISS as well. Field tests are conducted to assess capabilities for next generation spacesuits at Johnson Space Center (JSC) as well as the Mars and Lunar analog (Devon Island, Canada). LEGACI integrates data inputs from a suite of noninvasive biosensors in the suit and the astronaut (heart rate, suit inlet/outlet LCG temperature and flow rate, suit outlet gas and dewpoint temperature, pCO2, suit O2 pressure, state vector (accelerometry), and others). In the Integrated Walkback Suit Tests held at NASA-JSC and the HMP tests at Devon Island, communication and informatics capabilities were tested (including routing by satellite from the suit at Devon Island to JSC in Houston via secure servers at VCU in Richmond, VA). Results: The input from all the sensors enables LEGACI to compute multiple independent assessments of metabolic rate, from which a "best" met rate is chosen based on statistical methods. This rate can be used to compute detailed information about the suit, crew, and EVA performance using test-derived algorithms. VIOLET gives LEGACI voice activation capability, allowing the crew to query the suit and receive feedback and alerts that will lead to corrective action. LEGACI and VIOLET can also automatically control the astronaut's cooling and consumable use rate without crew input if desired. These findings suggest that noninvasive physiological and environmental sensors supported with data analysis can allow for more effective management of mission task performance during EVA. Integrated remote and local views of data metrics allow crewmembers to receive real-time feedback in sync with mission control, preventing performance shortcomings during EVAs on exploration missions.
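The abstract does not specify LEGACI's statistical method for choosing the "best" rate; one standard option for fusing several independent estimates is inverse-variance weighting, sketched below with invented numbers.

```python
import numpy as np

# Fuse independent metabolic-rate estimates into a single "best" value by
# inverse-variance weighting. All numbers here are illustrative only.
rates_w = np.array([310.0, 295.0, 330.0])  # estimates (W), e.g. from CO2,
                                           # heart rate, and LCG heat balance
sigmas  = np.array([25.0, 40.0, 30.0])     # assumed uncertainty of each

w = 1.0 / sigmas**2                        # weight = inverse variance
best = np.sum(w * rates_w) / np.sum(w)
best_sigma = np.sqrt(1.0 / np.sum(w))      # uncertainty of the fused estimate
print(f"best estimate: {best:.0f} +/- {best_sigma:.0f} W")
```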
The contribution of China's Grain to Green Program to carbon and water cycles
NASA Astrophysics Data System (ADS)
Yuan, W.
2017-12-01
The Chinese government started implementation of the Grain for Green Project (GGP) in 1999, aiming to convert cropland to forestland to mitigate soil erosion problems in areas across the country. Although the project has generated substantial environmental benefits, such as erosion reduction, carbon sequestration, and water quality improvements, the magnitude of these benefits has not yet been well quantified due to the lack of location-specific data describing the afforestation efforts. Remote sensing is well suited to detect afforestation locations, a prerequisite for estimating the impacts of the project on carbon and water cycles. In this study, we first examined the practicability of using the Moderate Resolution Imaging Spectroradiometer (MODIS) land cover product to detect afforestation locations; however, the results showed that the MODIS product failed to distinguish the afforestation areas of the GGP. Then, we used a normalized difference vegetation index (NDVI) time series analysis approach for detecting afforestation locations, applying statistical data to determine the NDVI threshold of converted croplands. The technique provided the necessary information for locating afforestation implemented under the GGP, explaining 85% of conversion from cropland to forestlands across all provinces. Second, we estimated the changes in carbon fluxes and stocks caused by forests converted from croplands under the GGP using a process-based ecosystem model (i.e., IBIS). Our results showed that the converted areas from croplands to forests under the GGP program could sequester 110.45 Tg C by 2020, and 524.36 Tg C by the end of this century. The sequestration capacity showed substantial spatial variations, with large sequestration in southern China. The economic benefits of carbon sequestration from the GGP were also estimated according to the current carbon price. The estimated economic benefits ranged from $8.84 billion to $44.20 billion from 2000 through 2100, which may exceed the current total investment ($38.99 billion) in the program. As the GGP program continues and forests grow, the impact of this program will be even larger in the future, making a more considerable contribution to China's carbon sink over the upcoming decades.
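As a toy version of the NDVI threshold test described above, conversion can be flagged by comparing pre- and post-program NDVI means per pixel. The 0.15 threshold and the synthetic arrays below are invented for illustration; the study derived its thresholds from statistical data.

```python
import numpy as np

years = np.arange(1995, 2011)
# Synthetic NDVI stack (year, row, col) with an "afforested" signal after 1999.
ndvi = np.random.default_rng(0).uniform(0.2, 0.3, (len(years), 100, 100))
ndvi[years >= 2000] += 0.2

before = ndvi[years < 1999].mean(axis=0)   # pre-program growing-season mean
after = ndvi[years >= 2003].mean(axis=0)   # allow young stands a few years to grow
converted = (after - before) > 0.15        # boolean afforestation mask
print(converted.mean())                    # fraction of pixels flagged
```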
Implementation and Simulation Results using Autonomous Aerobraking Development Software
NASA Technical Reports Server (NTRS)
Maddock, Robert W.; DwyerCianciolo, Alicia M.; Bowes, Angela; Prince, Jill L. H.; Powell, Richard W.
2011-01-01
An Autonomous Aerobraking software system is currently under development, with support from the NASA Engineering and Safety Center (NESC), that would move operations functions typically performed on the ground onboard an aerobraking spacecraft, reducing mission risk and mission cost. The suite of software that will enable autonomous aerobraking is the Autonomous Aerobraking Development Software (AADS), consisting of an ephemeris model, an onboard atmosphere estimator, temperature and loads prediction, and a maneuver calculation. The software calculates the maneuver time, magnitude, and direction commands needed to keep the spacecraft's periapsis parameters within design structural load and/or thermal constraints. AADS is currently being tested in simulations at Mars, with plans to also evaluate feasibility and performance at Venus and Titan.
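AADS's actual maneuver logic is not detailed in the abstract; as a sketch of the underlying orbital mechanics, the snippet below computes the small apoapsis burn needed to raise periapsis by a given amount, using two-body vis-viva only. All numbers are illustrative assumptions for Mars, not mission values.

import math

MU_MARS = 4.2828e13      # Mars gravitational parameter, m^3/s^2
R_MARS = 3396.2e3        # Mars equatorial radius, m

def apoapsis_burn(hp_km, ha_km, dhp_km):
    """Delta-v at apoapsis (m/s) to raise periapsis altitude by dhp_km."""
    rp, ra = R_MARS + hp_km * 1e3, R_MARS + ha_km * 1e3
    rp2 = rp + dhp_km * 1e3
    # vis-viva with semi-major axis a = (rp + ra) / 2, evaluated at apoapsis
    v1 = math.sqrt(MU_MARS * (2.0 / ra - 2.0 / (rp + ra)))
    v2 = math.sqrt(MU_MARS * (2.0 / ra - 2.0 / (rp2 + ra)))
    return v2 - v1

# Example: 100 km x 30,000 km aerobraking orbit, raise periapsis by 5 km.
print(f"dv = {apoapsis_burn(100.0, 30000.0, 5.0):.3f} m/s")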
The GenABEL Project for statistical genomics
Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.
2016-01-01
Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381
NASA Astrophysics Data System (ADS)
Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.
Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements for our methodology. SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
Simple Parametric Model for Airfoil Shape Description
NASA Astrophysics Data System (ADS)
Ziemkiewicz, David
2017-12-01
We show a simple, analytic equation describing a class of two-dimensional shapes well suited for representation of aircraft airfoil profiles. Our goal was to create a description characterized by a small number of parameters with easily understandable meaning, providing a tool to alter the shape with optimization procedures as well as manual tweaks by the designer. The generated shapes are well suited for numerical analysis with 2D flow solving software such as XFOIL.
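Ziemkiewicz's specific equation is not reproduced in the abstract. To illustrate the same idea of a few-parameter analytic profile, the sketch below evaluates the classic NACA 4-digit thickness distribution, a different, well-known parametrization shown here only as an example of the approach.

import numpy as np

def naca4_thickness(x, t=0.12):
    """Half-thickness of a symmetric NACA 4-digit airfoil at chordwise x in [0, 1]."""
    return 5.0 * t * (0.2969 * np.sqrt(x) - 0.1260 * x - 0.3516 * x**2
                      + 0.2843 * x**3 - 0.1015 * x**4)

x = np.linspace(0.0, 1.0, 101)
yt = naca4_thickness(x)       # upper surface is +yt, lower surface is -yt
print(f"max half-thickness: {yt.max():.4f} at x = {x[yt.argmax()]:.2f}")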
Ada Programming Support Environment (APSE) Evaluation and Validation (E&V) Team
1991-12-31
standards. The purpose of the team was to assist the project in several ways. Raymond Szymanski of Wright Research and Development Center (WRDC, now...debuggers, program library systems, and compiler diagnostics. The test suite does not include explicit tests for the existence of language features. The...support software is a set of tools and procedures which assist in preparing and executing the test suite, in extracting data from the results of
Li, Xiao-jun; Yi, Eugene C; Kemp, Christopher J; Zhang, Hui; Aebersold, Ruedi
2005-09-01
There is an increasing interest in the quantitative proteomic measurement of the protein contents of substantially similar biological samples, e.g. for the analysis of cellular response to perturbations over time or for the discovery of protein biomarkers from clinical samples. Technical limitations of current proteomic platforms, such as limited reproducibility and low throughput, make this a challenging task. A new LC-MS-based platform is able to generate complex peptide patterns from the analysis of proteolyzed protein samples at high throughput and represents a promising approach for quantitative proteomics. A crucial component of the LC-MS approach is the accurate evaluation of the abundance of detected peptides over many samples and the identification of peptide features that can stratify samples with respect to their genetic, physiological, or environmental origins. We present here a new software suite, SpecArray, that generates a peptide-versus-sample array from a set of LC-MS data. A peptide array stores the relative abundance of thousands of peptide features in many samples and is in a format identical to that of a gene expression microarray. A peptide array can be subjected to an unsupervised clustering analysis to stratify samples, or to a discriminant analysis to identify discriminatory peptide features. We applied SpecArray to analyze two sets of LC-MS data: one was from four repeat LC-MS analyses of the same glycopeptide sample, and another was from LC-MS analysis of serum samples of five male and five female mice. We demonstrate through these two case studies that the SpecArray software suite can serve as an effective software platform in the LC-MS approach for quantitative proteomics.
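SpecArray's own file formats and APIs are not documented in the abstract; the sketch below only illustrates the downstream step the peptide array enables: unsupervised hierarchical clustering of a peptide-by-sample abundance matrix. The matrix here is synthetic, standing in for real LC-MS feature intensities.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Synthetic peptide array: rows = peptide features, columns = samples.
rng = np.random.default_rng(1)
male = rng.normal(10.0, 1.0, size=(500, 5))
female = rng.normal(10.0, 1.0, size=(500, 5))
female[:50] += 3.0                       # 50 peptides differ between the groups
peptide_array = np.hstack([male, female])

# Cluster the 10 samples by their peptide-abundance profiles.
dist = pdist(peptide_array.T, metric="correlation")
tree = linkage(dist, method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
print("sample cluster labels:", labels)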
Crux: Rapid Open Source Protein Tandem Mass Spectrometry Analysis
2015-01-01
Efficiently and accurately analyzing big protein tandem mass spectrometry data sets requires robust software that incorporates state-of-the-art computational, machine learning, and statistical methods. The Crux mass spectrometry analysis software toolkit (http://cruxtoolkit.sourceforge.net) is an open source project that aims to provide users with a cross-platform suite of analysis tools for interpreting protein mass spectrometry data. PMID:25182276
What is Microsoft EMET and Why Should I Care?
2014-10-22
...with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center sponsored by...
Research of TREETOPS Structural Dynamics Controls Simulation Upgrade
NASA Technical Reports Server (NTRS)
Yates, Rose M.
1996-01-01
Under the provisions of contract number NAS8-40194, which was entitled 'TREETOPS Structural Dynamics and Controls Simulation System Upgrade', Oakwood College contracted to produce an upgrade to the existing TREETOPS suite of analysis tools. This suite includes the main simulation program, TREETOPS, two interactive preprocessors, TREESET and TREEFLX, an interactive post processor, TREEPLOT, and an adjunct program, TREESEL. A 'Software Design Document', which provides descriptions of the argument lists and internal variables for each subroutine in the TREETOPS suite, was established. Additionally, installation guides for both DOS and UNIX platforms were developed. Finally, updated User's Manuals, as well as a Theory Manual, were generated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Patrick Gonzalez; Sandra Brown
2006-06-30
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is 'Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration'. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st and July 30th 2006. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool. Work is being carried out in Brazil, Belize, Chile, Peru and the USA.
A controlled experiment on the impact of software structure on maintainability
NASA Technical Reports Server (NTRS)
Rombach, Dieter H.
1987-01-01
The impact of software structure on maintainability aspects, including comprehensibility, locality, modifiability, and reusability, in a distributed system environment is studied in a controlled maintenance experiment involving six medium-size distributed software systems implemented in LADY (language for distributed systems) and six in an extended version of sequential PASCAL. For all maintenance aspects except reusability, the results were quantified in terms of complexity metrics whose computation could be automated. The results showed LADY to be better suited than the extension of sequential PASCAL to the development of maintainable software. Strong typing combined with a high degree of unit parametrization is suggested to improve the reusability of units in LADY.
pyPcazip: A PCA-based toolkit for compression and analysis of molecular simulation data
NASA Astrophysics Data System (ADS)
Shkurti, Ardita; Goni, Ramon; Andrio, Pau; Breitmoser, Elena; Bethune, Iain; Orozco, Modesto; Laughton, Charles A.
The biomolecular simulation community is currently in need of novel and optimised software tools that can analyse and process, in reasonable timescales, the large amounts of molecular simulation data being generated. In light of this, we have developed and present here pyPcazip: a suite of software tools for compression and analysis of molecular dynamics (MD) simulation data. The software is compatible with trajectory file formats generated by most contemporary MD engines, such as AMBER, CHARMM, GROMACS and NAMD, and is MPI-parallelised to permit the efficient processing of very large datasets. pyPcazip is Unix-based, open-source software (BSD licensed) written in Python.
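pyPcazip's own command-line interface is not shown in the abstract; the sketch below illustrates the PCA-compression idea it is built on, reducing a trajectory of Cartesian coordinate frames to a few principal components and reconstructing an approximation. The data are synthetic and this is not pyPcazip's actual API.

import numpy as np

# Synthetic "trajectory": 1000 frames of 300 coordinates (100 atoms x 3),
# with variance concentrated in the leading directions.
rng = np.random.default_rng(2)
traj = rng.standard_normal((1000, 300)) @ np.diag(np.linspace(3.0, 0.1, 300))

mean = traj.mean(axis=0)
centered = traj - mean

# PCA via SVD; keep enough components to explain ~90% of the variance.
u, s, vt = np.linalg.svd(centered, full_matrices=False)
var = s**2 / np.sum(s**2)
k = int(np.searchsorted(np.cumsum(var), 0.90)) + 1

scores = centered @ vt[:k].T             # compressed representation
recon = scores @ vt[:k] + mean           # approximate reconstruction
print(f"kept {k} of {len(s)} components; "
      f"RMS error = {np.sqrt(np.mean((recon - traj)**2)):.3f}")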
DSPSR: Digital Signal Processing Software for Pulsar Astronomy
NASA Astrophysics Data System (ADS)
van Straten, W.; Bailes, M.
2010-10-01
DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.
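DSPSR implements coherent dedispersion, which deconvolves interstellar dispersion from the raw voltage data. The simpler channel-shift (incoherent) variant sketched below shows the underlying cold-plasma delay that both methods correct; the band, dispersion measure, and sample time are example values, not defaults of DSPSR.

import numpy as np

D_CONST = 4.148808e3     # dispersion constant, s MHz^2 pc^-1 cm^3

def dispersion_delay(f_mhz, f_ref_mhz, dm):
    """Cold-plasma arrival delay (s) at f_mhz relative to f_ref_mhz."""
    return D_CONST * dm * (f_mhz**-2 - f_ref_mhz**-2)

# Example: shift each filterbank channel to align with the top of the band.
freqs = np.linspace(1200.0, 1500.0, 256)     # channel centres, MHz
dm = 56.77                                   # example dispersion measure, pc cm^-3
t_samp = 64e-6                               # sample time, s
shifts = np.round(dispersion_delay(freqs, freqs[-1], dm) / t_samp).astype(int)
print(f"max shift: {shifts.max()} samples ({shifts.max() * t_samp * 1e3:.1f} ms)")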
IRACproc: IRAC Post-BCD Processing
NASA Astrophysics Data System (ADS)
Schuster, Mike; Marengo, Massimo; Patten, Brian
2012-09-01
IRACproc is a software suite that facilitates the co-addition of dithered or mapped Spitzer/IRAC data to make them ready for further analysis, with application to a wide variety of IRAC observing programs. The software runs within PDL, a numeric extension for Perl available from pdl.perl.org, and as stand-alone Perl scripts. In acting as a wrapper for the Spitzer Science Center's MOPEX software, IRACproc improves the rejection of cosmic rays and other transients in the co-added data. In addition, IRACproc performs (optional) Point Spread Function (PSF) fitting, subtraction, and masking of saturated stars.
Big Sky Carbon Sequestration Partnership
DOE Office of Scientific and Technical Information (OSTI.GOV)
Susan Capalbo
2005-12-31
The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) development of a GIS-based reporting framework that links with national networks; (3) design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), one that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. Overall, every sedimentary formation investigated has significant potential to sequester large amounts of CO{sub 2}. Simulations conducted to evaluate the mineral trapping potential of mafic volcanic rock formations located in the Idaho province suggest that supercritical CO{sub 2} is converted to solid carbonate mineral within a few hundred years, permanently entombing the carbon. Although MMV for this rock type may be challenging, a carefully chosen combination of geophysical and geochemical techniques should allow assessment of the fate of CO{sub 2} in deep basalt-hosted aquifers. Terrestrial carbon sequestration relies on land management practices and technologies to remove atmospheric CO{sub 2}, storing it in trees, plants, and soil. This indirect sequestration can be implemented today and is on the front line of voluntary, market-based approaches to reduce CO{sub 2} emissions. Initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil carbon (C) on rangelands and on forested, agricultural, and reclaimed lands. Rangelands can store up to an additional 0.05 mt C/ha/yr, while croplands average four times that amount. Estimates of the technical potential for soil sequestration within the region's croplands are in the range of 2.0 M mt C/yr over a 20-year time horizon. This is equivalent to approximately 7.0 M mt CO{sub 2}e/yr. The forestry sinks are well documented, and the potential in the Big Sky region ranges from 9-15 M mt CO{sub 2} equivalent per year.
Value-added benefits include enhanced yields, reduced erosion, and increased wildlife habitat. Thus the terrestrial sinks provide a viable, environmentally beneficial, and relatively low-cost sink that is available to sequester C in the current time frame. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological and terrestrial sequestration reflect this concern. Research in Phase I has identified and validated best management practices for soil C in the Partnership region, and outlined a risk/cost-effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long-term viability. This is the basis for the integrative analysis that will be undertaken in Phase II to work with industry, state and local governments and with the pilot demonstration projects to quantify the economic costs and risks associated with all opportunities for carbon storage in the Big Sky region. Scientifically sound MMV is critical for public acceptance of these technologies.
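A quick consistency check on the report's carbon-to-CO{sub 2}e figures (an added note; the conversion factor is the standard molar-mass ratio of CO2 to C):

2.0 M mt C/yr x (44/12) ≈ 7.3 M mt CO{sub 2}e/yr,

which agrees with the approximately 7.0 M mt CO{sub 2}e/yr quoted above.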
The vTAS suite: A simulator for classical and multiplexed three-axis neutron spectrometers
NASA Astrophysics Data System (ADS)
Boehm, M.; Filhol, A.; Raoul, Y.; Kulda, J.; Schmidt, W.; Schmalzl, K.; Farhi, E.
2013-01-01
The vTAS suite provides graphical assistance in preparing and performing inelastic neutron scattering experiments on a TAS instrument, including the latest multiplexed instrument configurations such as FlatCone, IMPS and UFO. The interactive display allows for flexible translation between instrument positions in real space and neutron scattering conditions represented in reciprocal space. It is a platform-independent, public-domain software tool, available for download from the website of the Institut Laue-Langevin (ILL).
Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens
2013-01-01
We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin 54-1255. The model was validated in a bibliomic study of 440 references in total, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production. PMID:23555215
Freud: a software suite for high-throughput simulation analysis
NASA Astrophysics Data System (ADS)
Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon
Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
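Freud's own API is not quoted in the abstract; the sketch below shows the standard radial distribution function calculation it provides, written with plain NumPy for an ideal-gas box (so g(r) should hover near 1). Box size, particle count, and binning are arbitrary choices.

import numpy as np

rng = np.random.default_rng(3)
L, N = 10.0, 1000                           # cubic box side and particle count
pos = rng.uniform(0.0, L, size=(N, 3))

# Minimum-image pair distances under periodic boundary conditions.
diff = pos[:, None, :] - pos[None, :, :]
diff -= L * np.round(diff / L)
r = np.sqrt((diff**2).sum(axis=-1))[np.triu_indices(N, k=1)]

bins = np.linspace(0.0, L / 2.0, 51)
counts, edges = np.histogram(r, bins=bins)
rho = N / L**3
shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
g = counts / (0.5 * N * rho * shell)        # normalize by ideal-gas pair count
print("g(r) in first few bins:", np.round(g[:5], 2))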
Managing Critical Infrastructures C.I.M. Suite
Dudenhoeffer, Donald
2018-05-23
See how a new software package developed by INL researchers could help protect infrastructure during natural disasters, terrorist attacks and electrical outages. For more information about INL research, visit http://www.facebook.com/idahonationallaboratory.
Sandia Engineering Analysis Code Access System v. 2.0.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sjaardema, Gregory D.
The Sandia Engineering Analysis Code Access System (SEACAS) is a suite of preprocessing, postprocessing, translation, visualization, and utility applications supporting finite element analysis software using the Exodus database file format.
NASA Astrophysics Data System (ADS)
Wu, S.; Romanak, K.; Yang, C.
2009-12-01
We report the development of two methods for subsurface monitoring of CO2 in both air and water phases at sequestration sites. The first method is based on line-of-sight (LOS) tunable laser spectroscopy. Funded by DOE, we demonstrated Phase-Insensitive Two-Tone Frequency Modulation spectroscopy (PITTFM). FM reduces low-frequency noise in the beam path due to scintillation, while the PI design eases installation. We demonstrated measurement over a 1-mile distance with an accuracy of 3 ppm of CO2 in normal air. Built-in switches shoot the laser beam in multiple directions, thus forming a cellular monitoring network covering 10 km^2. The system cost is under $100K, and COTS telecom components guarantee reliability in the field over decades. Software will log the data and translate it into a 2D CO2 profile. When coupled with other parameters, it will be able to locate the point and rate of leakages. Field tests at the SECARB sequestration site are proposed. The system also monitors other greenhouse gases (GHGs), e.g. CH4, which is also needed where EOR is pursued along with CO2 sequestration. Figures 1 and 2 give the results of this method. The second method is based on the latest technology advances in quantum cascade lasers (QCLs). The current state-of-the-art technology for measuring Total/Dissolved Inorganic Carbon (TIC/DIC) in water is the manometer. Manometry is both time-consuming and costly, and cannot be used underground, i.e. at high pressure and temperature. We propose to use high-brightness QC lasers to extend the current mid-IR optical path from 30 microns to over 500 microns, thus providing the possibility to measure dissolved CO2 (aqueous phase) with an accuracy of 0.2 mg/liter. Preliminary results will be presented.
Reverse engineering of integrated circuits
Chisholm, Gregory H.; Eckmann, Steven T.; Lain, Christopher M.; Veroff, Robert L.
2003-01-01
Software and a method therein to analyze circuits. The software comprises several tools, each of which performs particular functions in the reverse engineering process. The analyst, through a standard interface, directs each tool to the portion of the task to which it is most well suited, rendering previously intractable problems solvable. The tools are generally used iteratively to produce a successively more abstract picture of a circuit, about which incomplete a priori knowledge exists.
ERIC Educational Resources Information Center
Careless, James
2007-01-01
Enterprise resource planning (ERP) software does what school leaders have always wanted their computer systems to do: It sees all. By integrating every IT application an organization has--from purchasing and inventory control to payroll--ERPs create a single unified system. Not only does this give IT managers a holistic view to what is happening…
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice; Baggs, Rhoda
2007-01-01
Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides software safety measurements for legacy systems, which may or may not have the suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.
Free software for performing physical analysis of systems for digital radiography and mammography.
Donini, Bruno; Rivetti, Stefano; Lanconelli, Nico; Bertolini, Marco
2014-05-01
In this paper, the authors present free software for assisting users in achieving the physical characterization of x-ray digital systems and image quality checks. The program was developed as a plugin for the well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. The software was made available in 2009 and has been used during the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested for achieving the physical characterization of several clinical systems for digital radiography and mammography. Various published papers have made use of the outcomes of the plugin. This software is potentially beneficial to a variety of users: physicists working in hospitals and staff working in radiological departments, such as medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available and can be found online (www.medphys.it/downloads.htm). With our plugin, users can estimate all three of the most important parameters used for physical characterization (MTF, NPS, and DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.
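The plugin's internals are not given in the abstract; the sketch below only shows the standard relation (in the style of IEC 62220-1) that ties the three measured quantities together, DQE(f) = MTF(f)^2 / (q * NNPS(f)), evaluated on made-up curves. The MTF shape, NNPS values, and fluence q are assumptions.

import numpy as np

# Hypothetical measured curves on a common spatial-frequency axis (cycles/mm).
f = np.linspace(0.05, 3.5, 70)
mtf = np.exp(-f / 2.0)                 # assumed MTF
nnps = 8.0e-6 * (1.0 + 0.1 * f)        # assumed normalized NPS, mm^2
q = 2.5e5                              # assumed photon fluence, photons/mm^2

# Standard detective quantum efficiency relation:
dqe = mtf**2 / (q * nnps)
print(f"DQE at the lowest frequency: {dqe[0]:.2f}")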
NASA Astrophysics Data System (ADS)
Percy Plasencia Linares, Milton; Russi, Marino; Pesaresi, Damiano; Cravos, Claudio
2010-05-01
The Italian National Institute for Oceanography and Experimental Geophysics (Istituto Nazionale di Oceanografia e di Geofisica Sperimentale, OGS) is running the Antarctic Seismographic Argentinean Italian Network (ASAIN), made up of 7 seismic stations located in the Scotia Sea region in Antarctica and in Tierra del Fuego, Argentina: data from these stations are transferred in real time to the OGS headquarters in Trieste (Italy) via satellite links provided by the Instituto Antártico Argentino (IAA). Data are collected and archived primarily in Güralp Compressed Format (GCF) through the Scream! software at OGS and IAA, and transmitted also in real time to the Observatories and Research Facilities for European Seismology (ORFEUS). The main real-time seismic data acquisition and processing system of the ASAIN network is based on the Earthworm 7.3 (open-source) software suite installed on a Linux server at the OGS headquarters in Trieste. It runs several software modules for data collection, data archiving, and data publication on dedicated web servers (wave_serverV, Winston Wave Server), with data analysis and real-time monitoring through the Swarm program. OGS is also running, in close cooperation with the Friuli-Venezia Giulia Civil Defense, the North East (NI) Italy seismic network, making use of the Antelope commercial software suite from BRTT as the main acquisition system. As a test of the global capabilities of the Antelope software suite, we also set up an instance of Antelope acquiring data in real time from both the regional ASAIN seismic network in Antarctica and a subset of the Global Seismographic Network (GSN) funded by the Incorporated Research Institutions for Seismology (IRIS). The facilities of the IRIS Data Management System, and specifically the IRIS Data Management Center, were used for real-time access to the waveforms required in this study. The first tests indicated that more than 80% of the earthquakes with magnitude M>5.0 listed in the Preliminary Determination of Epicenters (PDE) catalogue of the National Earthquake Information Center (NEIC) of the United States Geological Survey (USGS) were also correctly automatically detected by Antelope, with an average location error of 0.05 degrees and an average body-wave magnitude Mb estimation error below 0.1. The average time difference between event origin time and the actual time of event determination by Antelope was about 45 minutes: comparison with the 20-minute IASPEI91 P-wave travel time for a 180-degree distance and the estimated 25-minute data latency of our test system indicates that Antelope is a serious candidate for regional and global early warning systems.
INL Generic Robot Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
2005-03-30
The INL Generic Robot Architecture is a generic, extensible software framework that can be applied across a variety of different robot geometries, sensor suites and low-level proprietary control application programming interfaces (e.g. mobility, aria, aware, player, etc.).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Sandra Brown; Zoe Kant
2009-01-07
The Nature Conservancy participated in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project was 'Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration'. The objectives of the project were to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Final Technical Report discusses the results of the six tasks that The Nature Conservancy undertook to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between July 1st 2001 and July 10th 2008. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool. The project occurred in two phases. The first was a focused exploration of specific carbon measurement and monitoring methodologies and pre-selected carbon sequestration opportunities. The second was a more systematic and comprehensive approach, comparing various competing measurement and monitoring methodologies and assessing a variety of carbon sequestration opportunities in order to find those with the lowest cost and the greatest combined carbon and other environmental benefits. In the first phase we worked in the U.S., Brazil, Belize, Bolivia, Peru, and Chile to develop and refine specific carbon inventory methods, pioneering a new remote-sensing method for cost-effectively measuring and monitoring terrestrial carbon sequestration and a system for developing carbon baselines for both avoided deforestation and afforestation/reforestation projects. We evaluated the costs and carbon benefits of a number of specific terrestrial carbon sequestration activities throughout the U.S., including reforestation of abandoned mined lands in southwest Virginia, grassland restoration in Arizona and Indiana, and reforestation in the Mississippi Alluvial Delta. The most cost-effective U.S. terrestrial sequestration opportunity we found through these studies was reforestation in the Mississippi Alluvial Delta. In Phase II we conducted a more systematic assessment and comparison of several different measurement and monitoring approaches in the Northern Cascades of California, and a broad 11-state Northeast regional assessment of terrestrial sequestration costs and benefits, rather than a pre-selected and targeted analysis. Work was carried out in Brazil, Belize, Chile, Peru and the USA.
Partners include the Winrock International Institute for Agricultural Development, The Sampson Group, Programme for Belize, Society for Wildlife Conservation (SPVS), Universidad Austral de Chile, Michael Lefsky, Colorado State University, UC Berkeley, the Carnegie Institution of Washington, ProNaturaleza, Ohio State University, Stephen F. Austin University, Geographical Modeling Services, Inc., WestWater, Los Alamos National Laboratory, Century Ecosystem Services, Mirant Corporation, General Motors, American Electric Power, Salt River Project, Applied Energy Systems, KeySpan, NiSource, and PSEG. This project, 'Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration', has resulted in over 50 presentations and reports, available publicly through the Department of Energy or by visiting the links listed in Appendix 1. More important than the reports, the project has helped to lead to the development of on-the-ground projects in Southwestern Virginia, Louisiana, and Chile while informing policy development in Virginia, the Regional Greenhouse Gas Initiative, the California Climate Action Registry, and U.S. and international programs.
Integrating open-source software applications to build molecular dynamics systems.
Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej
2014-04-05
Three open-source applications, NanoEngineer-1, packmol, and msi2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-A and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate the density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability.
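The abstract does not detail how the glass transition temperature was extracted from the simulation data; a common approach, sketched below with synthetic data, fits two straight lines to density versus temperature and takes their intersection as Tg. The data, kink location, and noise level are all assumptions.

import numpy as np

# Synthetic cooling data: density (g/cm^3) vs temperature (K), kink at 430 K.
T = np.linspace(300.0, 560.0, 27)
rho = np.where(T < 430.0, 1.16 - 2.0e-4 * (T - 300.0), 1.134 - 6.0e-4 * (T - 430.0))
rho = rho + np.random.default_rng(4).normal(0.0, 5e-4, T.size)

def bilinear_tg(T, rho):
    """Fit lines below/above each candidate break point; keep the best split."""
    best = (np.inf, None)
    for i in range(4, T.size - 4):
        p1 = np.polyfit(T[:i], rho[:i], 1)
        p2 = np.polyfit(T[i:], rho[i:], 1)
        sse = (np.sum((np.polyval(p1, T[:i]) - rho[:i])**2)
               + np.sum((np.polyval(p2, T[i:]) - rho[i:])**2))
        if sse < best[0]:
            tg = (p2[1] - p1[1]) / (p1[0] - p2[0])   # intersection of the lines
            best = (sse, tg)
    return best[1]

print(f"estimated Tg: {bilinear_tg(T, rho):.0f} K")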
Mobile Vehicle Teleoperated Over Wireless IP
2007-06-13
VideoLAN software suite. The VLC media player portion of this suite handles network streaming of video, as well as the receipt and display of the video...is found in appendix C.7. Video Display The video feed is displayed for the operator using VLC opened independently from the control sending program...This gives the operator the most choice in how to configure the display. To connect VLC to the feed all you need is the IP address from the Java
Implementation and Testing of VLBI Software Correlation at the USNO
NASA Technical Reports Server (NTRS)
Fey, Alan; Ojha, Roopesh; Boboltz, Dave; Geiger, Nicole; Kingham, Kerry; Hall, David; Gaume, Ralph; Johnston, Ken
2010-01-01
The Washington Correlator (WACO) at the U.S. Naval Observatory (USNO) is a VLBI processor based on dedicated hardware of ASIC design. The WACO is now over 10 years old and is nearing the end of its expected lifetime. Plans for implementation and testing of software correlation at the USNO are currently being considered. The VLBI correlation process is, by its very nature, well suited to a parallelized computing environment. Commercial off-the-shelf computer hardware has advanced in processing power to the point where software correlation is now both economically and technologically feasible. The advantages of software correlation are manifold and include flexibility, scalability, and easy adaptability to changing environments and requirements. We discuss our experience with and plans for use of software correlation at USNO, with emphasis on the use of the DiFX software correlator.
Value Addition to Cartosat-I Imagery
NASA Astrophysics Data System (ADS)
Mohan, M.
2014-11-01
In the remote sensing applications sector, the use of stereo data is on a steady rise. An attempt is made here to develop a software suite specifically for the exploitation of Cartosat-I data. A few algorithms to enhance the quality of basic Cartosat-I products are presented. The algorithms heavily exploit the Rational Function Coefficients (RPCs) associated with the image. They include improving geometric positioning through Bundle Block Adjustment to produce refined RPCs; generating portable stereo views autonomously using raw/refined RPCs; orthorectification and mosaicking; and rapidly registering a monoscopic image with a single seed point. The outputs of these modules (including the refined RPCs) are in standard formats for further exploitation in third-party software. The design focus has been on minimizing user interaction and on heavy customization to suit the Indian context. The core libraries are in C/C++, and some of the applications come with a user-friendly GUI. Further customization to suit a specific workflow is feasible as the requisite photogrammetric tools are in place and are continuously upgraded. The paper discusses the algorithms and the design considerations behind the tools. The value-added products produced using these tools will also be presented.
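The RPC model itself is standard even though the paper's code is not shown: image line and sample are ratios of cubic polynomials in normalized ground coordinates. The sketch below evaluates one such ratio with a toy, truncated coefficient set (NOT real Cartosat-I RPCs; the full model uses 20 cubic terms per polynomial).

import numpy as np

def rpc_project(lat, lon, h, num, den, offsets, scales):
    """Map ground (lat, lon, h) to one normalized image coordinate via an RPC ratio."""
    P, L, H = [(v - o) / s for v, o, s in zip((lat, lon, h), offsets, scales)]
    # A truncated term list keeps the sketch short; real RPCs use 20 terms.
    terms = np.array([1.0, L, P, H, L * P, L * H, P * H, L**2, P**2, H**2])
    return terms @ num / (terms @ den)

# Toy coefficients: numerator picks out P, denominator is constant 1.
num = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
den = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])
row_norm = rpc_project(28.61, 77.21, 216.0, num, den,
                       offsets=(28.6, 77.2, 200.0), scales=(0.1, 0.1, 500.0))
print(f"normalized row coordinate: {row_norm:.4f}")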
RSAT 2018: regulatory sequence analysis tools 20th anniversary.
Nguyen, Nga Thi Thuy; Contreras-Moreira, Bruno; Castro-Mondragon, Jaime A; Santana-Garcia, Walter; Ossio, Raul; Robles-Espinoza, Carla Daniela; Bahin, Mathieu; Collombet, Samuel; Vincens, Pierre; Thieffry, Denis; van Helden, Jacques; Medina-Rivera, Alejandra; Thomas-Chollier, Morgane
2018-05-02
RSAT (Regulatory Sequence Analysis Tools) is a suite of modular tools for the detection and the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, including from genome-wide datasets like ChIP-seq/ATAC-seq, (ii) motif scanning, (iii) motif analysis (quality assessment, comparisons and clustering), (iv) analysis of regulatory variations, (v) comparative genomics. Six public servers jointly support 10 000 genomes from all kingdoms. Six novel or refactored programs have been added since the 2015 NAR Web Software Issue, including updated programs to analyse regulatory variants (retrieve-variation-seq, variation-scan, convert-variations), along with tools to extract sequences from a list of coordinates (retrieve-seq-bed), to select motifs from motif collections (retrieve-matrix), and to extract orthologs based on Ensembl Compara (get-orthologs-compara). Three use cases illustrate the integration of new and refactored tools to the suite. This Anniversary update gives a 20-year perspective on the software suite. RSAT is well-documented and available through Web sites, SOAP/WSDL (Simple Object Access Protocol/Web Services Description Language) web services, virtual machines and stand-alone programs at http://www.rsat.eu/.
SBSI: an extensible distributed software infrastructure for parameter estimation in systems biology.
Adams, Richard; Clark, Allan; Yamaguchi, Azusa; Hanlon, Neil; Tsorman, Nikos; Ali, Shakir; Lebedeva, Galina; Goltsov, Alexey; Sorokin, Anatoly; Akman, Ozgur E; Troein, Carl; Millar, Andrew J; Goryanin, Igor; Gilmore, Stephen
2013-03-01
Complex computational experiments in Systems Biology, such as fitting model parameters to experimental data, can be challenging to perform. Not only do they frequently require a high level of computational power, but the software needed to run the experiment needs to be usable by scientists with varying levels of computational expertise, and modellers need to be able to obtain up-to-date experimental data resources easily. We have developed a software suite, the Systems Biology Software Infrastructure (SBSI), to facilitate the parameter-fitting process. SBSI is a modular software suite composed of three major components: SBSINumerics, a high-performance library containing parallelized algorithms for performing parameter fitting; SBSIDispatcher, a middleware application to track experiments and submit jobs to back-end servers; and SBSIVisual, an extensible client application used to configure optimization experiments and view results. Furthermore, we have created a plugin infrastructure to enable project-specific modules to be easily installed. Plugin developers can take advantage of the existing user-interface and application framework to customize SBSI for their own uses, facilitated by SBSI's use of standard data formats. All SBSI binaries and source-code are freely available from http://sourceforge.net/projects/sbsi under an Apache 2 open-source license. The server-side SBSINumerics runs on any Unix-based operating system; both SBSIVisual and SBSIDispatcher are written in Java and are platform independent, allowing use on Windows, Linux and Mac OS X. The SBSI project website at http://www.sbsi.ed.ac.uk provides documentation and tutorials.
PSC, a Programmable Software Controller for a Multiple Bladder, Sequentially Inflatable G-Suit.
1983-12-01
Valves. For inflation and deflation, industrial solenoid pilot valves provide filling and dumping via a manually thrown three-position switch...medicine with a tool for performing that research. This research concerns itself with developing a programmable valve actuation controller generic to g...Subsystem 2 - Software Controller; Subsystem 3 - Cromemco D/7A S-100 Bus System Conversion Board; Subsystem 4 - Computer/Valve
Framework for ReSTful Web Services in OSGi
NASA Technical Reports Server (NTRS)
Shams, Khawaja S.; Norris, Jeffrey S.; Powell, Mark W.; Crockett, Thomas M.; Mittman, David S.; Fox, Jason M.; Joswig, Joseph C.; Wallick, Michael N.; Torres, Recaredo J.; Rabe, Kenneth
2009-01-01
Ensemble ReST is a software system that eases the development, deployment, and maintenance of server-side application programs to perform functions that would otherwise be performed by client software. Ensemble ReST takes advantage of the proven disciplines of ReST (Representational State Transfer). ReST leverages the standardized HTTP protocol to enable developers to offer services to a diverse variety of clients: from shell scripts to sophisticated Java application suites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P-Mart was designed specifically to allow cancer researchers to perform robust statistical processing of publicly available cancer proteomic datasets. To date, no other online statistical processing suite for proteomics exists. The P-Mart software is designed to allow statistical programmers to utilize these algorithms through packages in the R programming language, and it also offers a web-based interface using Azure cloud technology. The Azure cloud technology also allows the release of the software via Docker containers.
Assessment of CTAS ETA prediction capabilities
NASA Astrophysics Data System (ADS)
Bolender, Michael A.
1994-11-01
This report summarizes the work done to date in assessing the trajectory fidelity and estimated time of arrival (ETA) prediction capability of the NASA Ames Center TRACON Automation System (CTAS) software. The CTAS software suite is a series of computer programs designed to aid air traffic controllers in their task of safely scheduling the landing sequence of approaching aircraft. In particular, this report concerns the accuracy of the available measurements (e.g., position, altitude, etc.) that are input to the software, as well as the accuracy of the final data that is made available to the air traffic controllers.
Aviation Environmental Design Tool (AEDT) System Architecture
DOT National Transportation Integrated Search
2007-01-29
The Federal Aviation Administration's Office of Environment and Energy (FAA-AEE) is developing a comprehensive suite of software tools that will allow for thorough assessment of the environmental effects of aviation. The main goal of the effort is ...
Neutron probes for the Construction and Resource Utilization eXplorer (CRUX)
NASA Technical Reports Server (NTRS)
Elphic, R. C.; Hahn, S.; Lawrence, D. J.; Feldman, W. C.; Johnson, J. B.; Haldemann, A. F. C.
2006-01-01
The Construction and Resource Utilization eXplorer (CRUX) project is developing a flexible integrated suite of instruments with data fusion software and an executive controller for in situ regolith resource assessment and characterization.
Part of the CAMEO suite, MARPLOT® is a mapping application that people can use to quickly create, view, and modify maps. Users can create their own objects in MARPLOT (e.g., facilities, schools, response assets) and display them on top of a basemap.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopes, M. L.
2014-07-01
SolCalc is a software suite that computes and displays magnetic fields generated by a three-dimensional (3D) solenoid system. Examples of such systems are the Mu2e magnet system and Helical Solenoids for muon cooling systems. SolCalc was originally coded in Matlab, and later upgraded to a compiled version (called MEX) to improve solving speed. Matlab was chosen because its graphical capabilities represent an attractive feature over other computer languages. Solenoid geometries can be created using any text editor or spreadsheet and can be displayed dynamically in 3D. Fields are computed from any given list of coordinates. The field distribution on the surfaces of the coils can be displayed as well. SolCalc was benchmarked against well-known commercial software for speed and accuracy, and the results compared favorably.
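SolCalc's Matlab source is not shown here; for orientation, the sketch below evaluates the textbook on-axis field of a single finite solenoid, a special case of what a full 3D code like SolCalc computes off-axis as well. The solenoid dimensions and current are arbitrary example values.

import math

MU0 = 4.0e-7 * math.pi   # vacuum permeability, T m / A

def solenoid_bz_on_axis(z, length, radius, n_turns_per_m, current):
    """On-axis axial field (T) of a finite solenoid centered at z = 0."""
    zp, zm = z + length / 2.0, z - length / 2.0
    return 0.5 * MU0 * n_turns_per_m * current * (
        zp / math.sqrt(radius**2 + zp**2) - zm / math.sqrt(radius**2 + zm**2))

# Example: 1 m long, 0.2 m radius, 2000 turns/m, 100 A -> about 0.23 T at center.
print(f"B(0) = {solenoid_bz_on_axis(0.0, 1.0, 0.2, 2000.0, 100.0):.3f} T")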
An overview of suite for automated global electronic biosurveillance (SAGES)
NASA Astrophysics Data System (ADS)
Lewis, Sheri L.; Feighner, Brian H.; Loschen, Wayne A.; Wojcik, Richard A.; Skora, Joseph F.; Coberly, Jacqueline S.; Blazes, David L.
2012-06-01
Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.
Big Sky Carbon Sequestration Partnership
DOE Office of Scientific and Technical Information (OSTI.GOV)
Susan M. Capalbo
2005-11-01
The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), one that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the Partnership region, and to design a risk/cost-effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long-term viability. Scientifically sound MMV is critical for public acceptance of these technologies. Deliverables for the 7th Quarter reporting period include (1) for the geological efforts: Reports on Technology Needs and an Action Plan on the Evaluation of Geological Sinks and Pilot Project Deployment (Deliverables 2 and 3), and a Report on the Feasibility of Mineralization Trapping in the Snake River Plain Basin (Deliverable 14); and (2) for the terrestrial efforts: a Report on the Evaluation of Terrestrial Sinks and a Report on the Best Production Practices for Soil C Sequestration (Deliverables 8 and 15). In addition, the 7th Quarter activities for the Partnership included further development of the proposed activities for the deployment and demonstration phase of the carbon sequestration pilots, including geological and terrestrial pilots; expansion of the Partnership to encompass regions and institutions that are complementary to the steps we have identified; building greater collaborations with industry and stakeholders in the region; contributions to outreach efforts that spanned all partnerships; co-authorship of the Carbon Capture and Separation report; and development of a regional basis for addressing future energy opportunities in the region.
The deliverables and activities are discussed in the following sections and appended to this report. The education and outreach efforts have resulted in a comprehensive plan which serves as a guide for implementing the outreach activities under Phase I. The public website has been expanded and integrated with the GIS carbon atlas. We have made presentations to stakeholders and policy makers, including two tribal sequestration workshops, and made connections to other federal and state agencies concerned with GHG emissions, climate change, and efficient and environmentally friendly energy production. In addition, the Partnership has plans for integration of our outreach efforts with students, especially at the tribal colleges and at the universities involved in our Partnership. This includes collaboration with MSU and with the U.S.-Norway Summer School, extended outreach efforts at LANL and INEEL, and work with the student section of the ASME. Finally, the Big Sky Partnership was involved in key meetings and symposia in the 7th quarter, including the USDOE Wye Institute Conference on Carbon Sequestration and Capture (April 2005); the DOE/NETL Fourth Annual Conference on Carbon Capture and Sequestration (May 2005); the Coal Power Development Conference (Denver, June 2005); and meetings with our Phase II industry partners and Governor's staff.
Integration and validation of a data grid software
NASA Astrophysics Data System (ADS)
Carenton-Madiec, Nicolas; Berger, Katharina; Cofino, Antonio
2014-05-01
The Earth System Grid Federation (ESGF) Peer-to-Peer (P2P) system is a software infrastructure for the management, dissemination, and analysis of model output and observational data. The ESGF grid is composed of several types of nodes with different roles. About 40 data nodes host model outputs and datasets using THREDDS catalogs. About 25 compute nodes offer remote visualization and analysis tools. About 15 index nodes crawl data node catalogs and implement faceted and federated search in a web interface. About 15 identity provider nodes manage accounts, authentication, and authorization. Here we present a full-scale test federation spread across different institutes in different countries and a Python test suite, both started in December 2013. The first objective of the test suite is to provide a simple tool that helps test and validate a single data node and its closest index, compute, and identity provider peers. The next objective is to run this test suite on every data node of the federation and thereby test and validate every single node of the whole federation. The suite already uses the nosetests, requests, myproxy-logon, subprocess, selenium, and fabric Python libraries in order to test web front ends, back ends, and security services. The goal of this project is to improve the quality of deliverables in the context of a small team of developers who are widely spread around the world, working collaboratively and without hierarchy. This working context highlighted the need for a federated integration, test, and validation process.
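As a rough sketch of the kind of per-node check such a suite can run (the hostnames and sanity assertions below are hypothetical; the real suite also drives selenium, myproxy-logon, and fabric for front-end and security tests):

    import requests

    DATA_NODE = "esgf-data.example.org"    # hypothetical data node hostname
    INDEX_NODE = "esgf-index.example.org"  # hypothetical index node hostname

    def test_thredds_catalog_reachable():
        # Data nodes publish their datasets through a THREDDS catalog.
        url = f"https://{DATA_NODE}/thredds/catalog/catalog.xml"
        resp = requests.get(url, timeout=30)
        assert resp.status_code == 200
        assert b"catalog" in resp.content  # crude sanity check on the XML payload

    def test_index_node_search():
        # Index nodes expose the federated faceted search endpoint.
        url = f"https://{INDEX_NODE}/esg-search/search"
        resp = requests.get(url, params={"format": "application/solr+json",
                                         "limit": 1}, timeout=30)
        assert resp.status_code == 200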
Extraction and Analysis of Display Data
NASA Technical Reports Server (NTRS)
Land, Chris; Moye, Kathryn
2008-01-01
The Display Audit Suite is an integrated package of software tools that partly automates the detection of Portable Computer System (PCS) display errors. [PCS is a laptop computer used onboard the International Space Station (ISS).] The need for automation stems from the large quantity of PCS displays (6,000+, with 1,000,000+ lines of command and telemetry data). The Display Audit Suite includes data-extraction tools, automatic error detection tools, and database tools for generating analysis spreadsheets. These spreadsheets allow engineers to more easily identify many different kinds of possible errors. The Suite supports over 40 independent analyses, and it complements formal testing by being comprehensive (all displays can be checked) and by revealing errors that are difficult to detect via testing. In addition, the Suite can be run early in the development cycle to find and correct errors in advance of testing.
The Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD) Tool
Provides quantal response models, which are also used in the U.S. EPA benchmark dose software suite, and generates a model-averaged dose-response model to produce benchmark dose (BMD) and benchmark dose lower bound (BMDL) estimates.
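As a rough illustration of the model-averaging idea (not necessarily this tool's exact statistical procedure), one common approach weights each fitted quantal model by its Akaike weight; the model names, BMD estimates, and AIC scores below are hypothetical:

    import numpy as np

    # Hypothetical fitted quantal models: (name, BMD estimate, AIC score).
    fits = [("logistic", 12.3, 104.2), ("probit", 10.8, 103.5), ("weibull", 14.1, 107.9)]

    aics = np.array([aic for _, _, aic in fits])
    weights = np.exp(-0.5 * (aics - aics.min()))
    weights /= weights.sum()  # Akaike weights sum to 1
    bmd_avg = float(np.dot(weights, [bmd for _, bmd, _ in fits]))
    print(dict(zip([name for name, _, _ in fits], weights.round(3))), round(bmd_avg, 2))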
TTCI's Scientific Software Suite and NUCARS Overview
DOT National Transportation Integrated Search
2015-06-30
On June 30-July 1 of 2015 the FRA held the Best Practices Workshop on VTI Simulation at the Volpe Center in Cambridge, Massachusetts. The two day workshop was attended by representatives from the government, code developers, researchers, academia, an...
General purpose nonlinear system solver based on Newton-Krylov method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2013-12-01
KINSOL is part of a software family called SUNDIALS: SUite of Nonlinear and Differential/Algebraic equation Solvers [1]. KINSOL is a general-purpose nonlinear system solver based on Newton-Krylov and fixed-point solver technologies [2].
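SciPy ships a comparable Newton-Krylov solver, which gives a feel for how such a solver is called; the small system below is illustrative, not a KINSOL example:

    import numpy as np
    from scipy.optimize import newton_krylov

    def residual(u):
        # A small nonlinear system F(u) = 0: u - 0.1*cos(u) - b = 0.
        return u - 0.1 * np.cos(u) - np.linspace(0.0, 1.0, u.size)

    u0 = np.zeros(50)                                  # initial guess
    sol = newton_krylov(residual, u0, method="lgmres", f_tol=1e-10)
    print(np.abs(residual(sol)).max())                 # residual norm near zero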
Human Engineering Modeling and Performance Lab Study Project
NASA Technical Reports Server (NTRS)
Oliva-Buisson, Yvette J.
2014-01-01
The HEMAP (Human Engineering Modeling and Performance) Lab is a joint effort between the Industrial and Human Engineering group and the KAVE (Kennedy Advanced Visualization Environment) group. The lab consists of a sixteen-camera system that is used to capture human motions and operational tasks through the use of a Velcro suit equipped with sensors; these tasks are then simulated in an ergonomic software package known as Jack. The Jack software is able to identify potential risk hazards.
eXtended CASA Line Analysis Software Suite (XCLASS)
NASA Astrophysics Data System (ADS)
Möller, T.; Endres, C.; Schilke, P.
2017-02-01
The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single-dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL, accessed using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
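For intuition, the textbook solution for an isothermal slab (the physical core of such a calculation, though not XCLASS's actual implementation) can be written in a few lines; the frequency and temperatures in the usage line are illustrative:

    import numpy as np

    H = 6.626e-34   # Planck constant, J s
    K = 1.381e-23   # Boltzmann constant, J/K

    def j_nu(temp_k, freq_hz):
        # Radiation temperature J_nu(T) = (h*nu/k) / (exp(h*nu/(k*T)) - 1).
        x = H * freq_hz / K
        return x / np.expm1(x / temp_k)

    def slab_brightness(t_ex, tau, freq_hz, t_bg=2.73):
        # Brightness temperature of an isothermal slab seen against the CMB.
        return (j_nu(t_ex, freq_hz) - j_nu(t_bg, freq_hz)) * (1.0 - np.exp(-tau))

    print(slab_brightness(t_ex=50.0, tau=1.0, freq_hz=230.538e9))  # CO J=2-1 line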
COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA
Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.
2011-01-01
Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
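As a sketch of the target-decoy idea behind peptide false discovery rate (FDR) filtering (COMPASS's exact procedure may differ in detail), one walks down the score-sorted matches and finds the score cutoff at which the decoy/target ratio stays within the desired FDR; the tiny match list below is a toy:

    def fdr_threshold(psms, fdr=0.01):
        # psms: iterable of (score, is_decoy) peptide-spectrum matches.
        targets = decoys = 0
        threshold = None
        for score, is_decoy in sorted(psms, key=lambda p: p[0], reverse=True):
            if is_decoy:
                decoys += 1
            else:
                targets += 1
            if targets and decoys / targets <= fdr:
                threshold = score  # lowest score still within the FDR bound
        return threshold

    matches = [(9.1, False), (8.7, False), (8.2, True), (7.9, False), (7.5, True)]
    print(fdr_threshold(matches, fdr=0.5))  # loose FDR so the toy example passes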
Gendermetrics.NET: a novel software for analyzing the gender representation in scientific authoring.
Bendels, Michael H K; Brüggmann, Dörthe; Schöffel, Norman; Groneberg, David A
2016-01-01
Imbalances in female career promotion are believed to be strong in the field of academic science. A primary parameter for analyzing gender inequalities is the gender distribution of authors in scientific publications. Since the presently available data on gender distribution are largely limited to underpowered studies, we here develop a new approach to analyze authors' genders in large bibliometric databases. A SQL-Server-based multiuser software suite was developed that serves as an integrative tool for analyzing bibliometric data with a special emphasis on gender and topographical analysis. The presented system allows seamless integration, inspection, modification, evaluation and visualization of bibliometric data. By providing an adaptive and almost fully automatic integration and analysis process, the inter-individual variability of analysis is kept at a low level. Depending on the scientific question, the system enables the user to perform a scientometric analysis, including its visualization, within a short period of time. In summary, a new software suite for analyzing gender representation in scientific articles was established. The system is suitable for the comparative analysis of scientific structures at the level of continents, countries, cities, city regions, institutions, research fields and journals.
SkZpipe: A Python3 module to produce efficiently PSF-fitting photometry with DAOPHOT, and much more
NASA Astrophysics Data System (ADS)
Mauro, F.
2017-07-01
In an era characterized by large sky surveys and the availability of large amounts of photometric data, it is important for astronomers to have tools to process their data in an efficient, accurate and easy way, minimizing reduction time. We present SkZpipe, a Python3 module designed mainly to process generic data, performing point-spread function (PSF) fitting photometry with the DAOPHOT suite (Stetson 1987). The software has already demonstrated its accuracy and efficiency through its adaptation VVV-SkZ_pipeline (Mauro et al. 2013) for the "VISTA Variables in the Vía Láctea" ESO survey, showing how it can stand in for the user, avoiding repetitive interaction in all operations while retaining the full power and accuracy of the DAOPHOT suite and relieving users of the burden of data processing. This software provides not only a pipeline, but also all the tools needed to easily run each atomic step of the photometric procedure, to match the results, and to retrieve information from FITS headers and the internal instrumental database. We plan to add support for other photometric software packages in the future.
An Improved Suite of Object Oriented Software Measures
NASA Technical Reports Server (NTRS)
Neal, Ralph D.; Weistroffer, H. Roland; Coppins, Richard J.
1997-01-01
In the pursuit of ever increasing productivity, the need to be able to measure specific aspects of software is generally agreed upon. As object oriented programming languages are becoming more and more widely used, metrics specifically designed for object oriented software are required. In recent years there has been an explosion of new, object oriented software metrics proposed in the literature. Unfortunately, many or most of these proposed metrics have not been validated to measure what they claim to measure. In fact, an analysis of many of these metrics shows that they do not satisfy basic properties of measurement theory, and thus their application has to be suspect. In this paper ten improved metrics are proposed and are validated using measurement theory.
ARROWSMITH-P: A prototype expert system for software engineering management
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Ramsey, Connie Loggia
1985-01-01
Although the field of software engineering is relatively new, it can benefit from the use of expert systems. Two prototype expert systems were developed to aid in software engineering management. Given the values for certain metrics, these systems will provide interpretations which explain any abnormal patterns of these values during the development of a software project. The two systems, which solve the same problem, were built using different methods, rule-based deduction and frame-based abduction. A comparison was done to see which method was better suited to the needs of this field. It was found that both systems performed moderately well, but the rule-based deduction system using simple rules provided more complete solutions than did the frame-based abduction system.
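To make the rule-based deduction approach concrete, a minimal sketch might encode each interpretation rule as a predicate over metric values; the metric names and thresholds here are hypothetical, not those of ARROWSMITH-P:

    # Toy rule base: each rule maps a pattern of (normalized) metric values
    # to a candidate interpretation of the project's state.
    RULES = [
        (lambda m: m["defect_rate"] > 0.8 and m["churn"] > 0.7,
         "unstable requirements suspected"),
        (lambda m: m["effort"] < 0.2 and m["loc_growth"] > 0.9,
         "possible reuse of existing code"),
    ]

    def interpret(metrics):
        # Evaluate every rule against the metrics, collecting all that fire.
        return [finding for condition, finding in RULES if condition(metrics)]

    print(interpret({"defect_rate": 0.9, "churn": 0.8, "effort": 0.5, "loc_growth": 0.1}))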
Extravehicular Activity (EVA) Power, Avionics, and Software (PAS) 101
NASA Technical Reports Server (NTRS)
Irimies, David
2011-01-01
EVA systems consist of a spacesuit or garment, a PLSS, a PAS system, and spacesuit interface hardware. The PAS system is responsible for providing power for the suit, communication of several types of data between the suit and other mission assets, avionics hardware to perform numerous data display and processing functions, and information systems that provide crewmembers data to perform their tasks with more autonomy and efficiency. Irimies discussed how technology development efforts have advanced the state-of-the-art in these areas and shared technology development challenges.
Unraveling transcriptional control and cis-regulatory codes using the software suite GeneACT
Cheung, Tom Hiu; Kwan, Yin Lam; Hamady, Micah; Liu, Xuedong
2006-01-01
Deciphering gene regulatory networks requires the systematic identification of functional cis-acting regulatory elements. We present a suite of web-based bioinformatics tools, called GeneACT, that can rapidly detect evolutionarily conserved transcription factor binding sites or microRNA target sites that are either unique or over-represented in differentially expressed genes from DNA microarray data. GeneACT provides graphic visualization and extraction of common regulatory sequence elements in the promoters and 3'-untranslated regions that are conserved across multiple mammalian species. PMID:17064417
Simplified Deployment of Health Informatics Applications by Providing Docker Images.
Löbe, Matthias; Ganslandt, Thomas; Lotzmann, Lydia; Mate, Sebastian; Christoph, Jan; Baum, Benjamin; Sariyar, Murat; Wu, Jie; Stäubert, Sebastian
2016-01-01
Due to the specific needs of biomedical researchers, in-house development of software is widespread. A common problem is to maintain and enhance software after the funded project has ended. Even if many tools are made open source, only a couple of projects manage to attract a user base large enough to ensure sustainability. Reasons for this include complex installation and configuration of biomedical software as well as an ambiguous terminology for the features provided, all of which make evaluation of software laborious. Docker is a para-virtualization technology based on Linux containers that eases deployment of applications and facilitates evaluation. We investigated a suite of software developments funded by a large umbrella organization for networked medical research within the last 10 years and created Docker containers for a number of applications to support utilization and dissemination.
NASA Technical Reports Server (NTRS)
1989-01-01
001 is an integrated tool suite for automatically developing ultrareliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.
Analytical Tools for Space Suit Design
NASA Technical Reports Server (NTRS)
Aitchison, Lindsay
2011-01-01
As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.
Echelle Data Reduction Cookbook
NASA Astrophysics Data System (ADS)
Clayton, Martin
This document is the first version of the Starlink Echelle Data Reduction Cookbook. It contains scripts and procedures developed by regular or heavy users of the existing software packages. These scripts are generally of two types; templates which readers may be able to modify to suit their particular needs and utilities which carry out a particular common task and can probably be used `off-the-shelf'. In the nature of this subject the recipes given are quite strongly tied to the software packages, rather than being science-data led. The major part of this document is divided into two sections dealing with scripts to be used with IRAF and with Starlink software (SUN/1).
ORBS: A reduction software for SITELLE and SpiOMM data
NASA Astrophysics Data System (ADS)
Martin, Thomas
2014-09-01
ORBS merges, corrects, transforms and calibrates interferometric data cubes and produces a spectral cube of the observed region for analysis. It is a fully automatic data reduction software package for use with SITELLE (installed at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont Mégantic); these imaging Fourier transform spectrometers obtain a hyperspectral data cube which samples a 12-arcminute field of view into 4 million visible spectra. ORBS is highly parallelized; its core classes (ORB) have been designed to be used in a suite of software tools for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS).
Analysis of High-Throughput ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Zangar, Richard C.
Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
An online database for plant image analysis software tools.
Lobet, Guillaume; Draye, Xavier; Périlleux, Claire
2013-10-09
Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is best suited for their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software solution in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers a way to exchange and communicate about their work.
BIG SKY CARBON SEQUESTRATION PARTNERSHIP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Susan M. Capalbo
2004-10-31
The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts were underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification (MMV) technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the Partnership region, and to design a risk/cost-effectiveness framework to make comparative assessments of each viable sink, taking into account economic costs, offsetting benefits, scale of sequestration opportunities, spatial and time dimensions, environmental risks, and long-term viability. Scientifically sound information on MMV is critical for public acceptance of these technologies. Two key deliverables were completed in the second quarter: a literature review/database to assess the soil carbon on rangelands, and the draft protocols and contracting options for soil carbon trading. The protocols developed for soil carbon trading are unique and provide a key component of the mechanisms that might be used to efficiently sequester GHG and reduce CO2 concentrations. While no key deliverables were due during the third quarter, progress on other deliverables is noted in the PowerPoint presentations and in this report.
A series of meetings held during the second and third quarters have laid the foundations for assessing the issues surrounding carbon sequestration in this region, the need for a holistic approach to meeting energy demands and economic development potential, and the implementation of government programs or a market-based setting for soil C credits. These meetings provide a connection to stakeholders in the region and a basis on which to draw for the DOE PEIS hearings. In the fourth quarter, three deliverables have been completed, some in draft form to be revised and updated to include Wyoming. This is due primarily to some delays in funding to LANL and INEEL and the approval of a supplemental proposal to include Wyoming in much of the GIS data sets, analysis, and related materials. The deliverables are discussed in the following sections and greater details are provided in the materials that are attached to this report. In August 2004, a presentation was made to Pioneer Hi-Bred, discussing the Partnership and the synergies with terrestrial sequestration, agricultural industries, and ongoing, complementary USDA efforts. The Partnership organized a carbon session at the INRA 2004 Environmental and Subsurface Science Symposium in September 2004; also in September, a presentation was made to the Wyoming Carbon Sequestration Advisory Committee, followed up with a roundtable discussion.
1986-05-01
...offering the course is a company. Name and address of offeror: Tachyon Corporation, 2725 Congress Street, Suite 2H, San Diego, CA 92110. Offeror's background: Tachyon Corporation specializes in Ada software quality assurance, computer-hosted instruction and information retrieval systems, authoring tools... easy to use (on-line help) and can look up or search for terms.
Programs for Testing an SSME-Monitoring System
NASA Technical Reports Server (NTRS)
Lang, Andre; Cecil, Jimmie; Heusinger, Ralph; Freestone, Kathleen; Blue, Lisa; Wilkerson, DeLisa; McMahon, Leigh Anne; Hall, Richard B.; Varnavas, Kosta; Smith, Keary;
2007-01-01
A suite of computer programs has been developed for special test equipment (STE) that is used in verification testing of the Health Management Computer Integrated Rack Assembly (HMCIRA), a ground-based system of analog and digital electronic hardware and software for "flight-like" testing during development of components of an advanced health-management system for the space shuttle main engine (SSME). The STE software enables the STE to simulate the analog input and the data flow of an SSME test firing from start to finish.
"HIP" new software: The Hydroecological Integrity Assessment Process
Henriksen, Jim; Wilson, Juliette T.
2006-01-01
Researchers at the U.S. Geological Survey Fort Collins Science Center (FORT) have developed the Hydroecological Integrity Assessment Process (HIP) and a suite of software tools for conducting a hydrologic classification of streams, addressing instream flow needs, and assessing past and proposed hydrologic alterations on streamflow and other ecosystem components. The HIP recognizes that streamflow is strongly related to many critical physicochemical components of rivers, such as dissolved oxygen, channel geomorphology, and habitats. Streamflow is considered a “master variable” that limits the distribution, abundance, and diversity of many aquatic plant and animal species.
The Software Distribution for Gemini Observatory's Science Operations Group
NASA Astrophysics Data System (ADS)
Hoenig, M. D.; Clarke, M.; Pohlen, M.; Hirst, P.
2014-05-01
Gemini Observatory consists of two telescopes in different hemispheres. It also operates mostly on a queue observing model, meaning observations are performed by staff working shifts as opposed to PIs. For these two reasons alone, maintaining and distributing a diverse software suite is not a trivial matter. We present a way to make the appropriate tools available to staff at Gemini North and South, whether they are working on the summit or from our base facility offices in Hilo, Hawai'i and La Serena, Chile.
Livermore Compiler Analysis Loop Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hornung, R. D.
2013-03-01
LCALS is designed to evaluate compiler optimizations and performance of a variety of loop kernels and loop traversal software constructs. Some of the loop kernels are pulled directly from "Livermore Loops Coded in C", developed at LLNL (see item 11 below for details of earlier code versions). The older suites were used to evaluate the floating-point performance of hardware platforms prior to porting larger application codes. The LCALS suite is geared toward assessing C++ compiler optimizations and platform performance related to SIMD vectorization, OpenMP threading, and advanced C++ language features. LCALS contains 20 of 24 loop kernels from the older Livermore Loop suites, plus various others representative of loops found in current production application codes at LLNL. The latter loops emphasize more diverse loop constructs and data access patterns than the others, such as multi-dimensional difference stencils. The loops are included in a configurable framework, which allows control of compilation, loop sampling for execution timing, and which loops are run and their lengths. It generates timing statistics for analyzing and comparing variants of individual loops. Also, it is easy to add loops to the suite as desired.
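A toy analogue of such a loop-sampling harness (LCALS itself is C++; this Python sketch only mirrors the repeat-and-time idea, with a made-up kernel):

    import time

    def sample_kernel(kernel, n_samples=20):
        # Run the kernel repeatedly and collect per-run wall-clock timings.
        times = []
        for _ in range(n_samples):
            t0 = time.perf_counter()
            kernel()
            times.append(time.perf_counter() - t0)
        return min(times), sum(times) / len(times)

    data = list(range(100_000))
    best, mean = sample_kernel(lambda: [3.0 * x + 1.0 for x in data])  # a*x + b kernel
    print(f"best {best:.6f}s  mean {mean:.6f}s")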
NASA Astrophysics Data System (ADS)
Di Vittorio, A. V.; Simmonds, M.; Nico, P. S.
2017-12-01
Land-based carbon sequestration and greenhouse gas (GHG) reduction strategies are often implemented in small patches and evaluated independently from each other, which poses several challenges to determining their potential benefits at the regional scales at which carbon/GHG targets are defined. These challenges include inconsistent methods, uncertain scalability to larger areas, and lack of constraints such as land ownership and competition among multiple strategies. To address such challenges we have developed an integrated carbon and GHG budget model of California's entire landscape, delineated by geographic region, land type, and ownership. This empirical model has annual time steps and includes net ecosystem carbon exchange, wildfire, multiple forest management practices including wood and bioenergy production, cropland and rangeland soil management, various land type restoration activities, and land cover change. While the absolute estimates vary considerably due to uncertainties in initial carbon densities and ecosystem carbon exchange rates, the estimated effects of particular management activities with respect to baseline are robust across these uncertainties. Uncertainty in land use/cover change data is also critical, as different rates of shrubland to grassland conversion can switch the system from a carbon source to a sink. The results indicate that reducing urban area expansion has substantial and consistent benefits, while the effects of direct land management practices vary and depend largely on the available management area. Increasing forest fuel reduction extent over the baseline contributes to annual GHG costs during increased management, and annual benefits after increased management ceases. Cumulatively, it could take decades to recover the cost of 14 years of increased fuel reduction. However, forest carbon losses can be completely offset within 20 years through increases in urban forest fraction and marsh restoration. Additionally, highly uncertain black carbon estimates dominate the overall GHG budget due to wildfire, forest management, and bioenergy production. Overall, this tool is well suited for exploring suites of management options and extents throughout California in order to quantify potential regional carbon sequestration and GHG emission benefits.
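A minimal sketch of the annual-time-step bookkeeping such an empirical budget model performs (the land class, rates, and magnitudes below are hypothetical placeholders, not the California model's values):

    def step_year(stock_mgc, area_ha, nee_mgc_per_ha, fire_frac, mgmt_flux_mgc):
        # Advance one land-type carbon pool by one year.
        stock_mgc += nee_mgc_per_ha * area_ha  # net ecosystem carbon exchange
        stock_mgc -= fire_frac * stock_mgc     # wildfire losses
        stock_mgc += mgmt_flux_mgc             # management: fuel reduction, restoration, ...
        return stock_mgc

    stock = 1.0e6  # MgC in a hypothetical forest ownership class
    for year in range(2020, 2040):
        stock = step_year(stock, area_ha=50_000, nee_mgc_per_ha=0.5,
                          fire_frac=0.002, mgmt_flux_mgc=-1_000.0)
    print(round(stock))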
NASA Technical Reports Server (NTRS)
Easterly, Jill
1993-01-01
This software package does ergonomic human modeling for maintenance tasks. Technician models can be configured to represent actual work-environment situations, the strengths and capabilities of the individual, particular limitations (such as the constraining characteristics of a particular space suit), tools required, and procedures or tasks to be performed.
Onboard Monitoring and Reporting for Commercial Motor Vehicle Safety Final Report
DOT National Transportation Integrated Search
2008-02-01
This Final Report describes the process and product from the project, Onboard Monitoring and Reporting for Commercial Motor Vehicle Safety (OBMS), in which a prototypical suite of hardware and software on a class 8 truck was developed and tested. The...
Hiraki, Sakiko; Okada, Yohei; Arai, Yusuke; Ishii, Wataru; Iiduka, Ryoji
2017-08-01
Pulmonary sequestration is a congenital malformation characterized by nonfunctioning tissue not communicating with the tracheobronchial tree. As the blood pressure in the artery feeding the sequestrated lung tissue is higher than that in the normal pulmonary artery, the risk of massive hemorrhage in pulmonary sequestration is high. We herein present the first case of a severe blunt trauma patient with unstable pulmonary sequestration injury. The mechanism of pulmonary sequestration injury is vastly different than that of injury to normal lung. We suggest that proximal feeding artery embolization should be performed before surgical intervention in patients with massive hemorrhage of pulmonary sequestration due to severe chest trauma.
A custom multi-modal sensor suite and data analysis pipeline for aerial field phenotyping
NASA Astrophysics Data System (ADS)
Bartlett, Paul W.; Coblenz, Lauren; Sherwin, Gary; Stambler, Adam; van der Meer, Andries
2017-05-01
Our group has developed a custom, multi-modal sensor suite and data analysis pipeline to phenotype crops in the field using unpiloted aircraft systems (UAS). This approach to high-throughput field phenotyping is part of a research initiative intending to markedly accelerate the breeding process for refined energy sorghum varieties. To date, single rotor and multirotor helicopters, roughly 14 kg in total weight, are being employed to provide sensor coverage over multiple hectare-sized fields in tens of minutes. The quick, autonomous operations allow for complete field coverage at consistent plant and lighting conditions, with low operating costs. The sensor suite collects data simultaneously from six sensors and registers it for fusion and analysis. High resolution color imagery targets color and geometric phenotypes, along with lidar measurements. Long-wave infrared imagery targets temperature phenomena and plant stress. Hyperspectral visible and near-infrared imagery targets phenotypes such as biomass and chlorophyll content, as well as novel, predictive spectral signatures. Onboard spectrometers and careful laboratory and in-field calibration techniques aim to increase the physical validity of the sensor data throughout and across growing seasons. Off-line processing of data creates basic products such as image maps and digital elevation models. Derived data products include phenotype charts, statistics, and trends. The outcome of this work is a set of commercially available phenotyping technologies, including sensor suites, a fully integrated phenotyping UAS, and data analysis software. Effort is also underway to transition these technologies to farm management users by way of streamlined, lower cost sensor packages and intuitive software interfaces.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sentis, Manuel Lorenzo; Gable, Carl W.
Furthermore, there are many applications in science and engineering modeling where an accurate representation of a complex model geometry in the form of a mesh is important. In applications of flow and transport in subsurface porous media, this is manifest in models that must capture complex geologic stratigraphy, structure (faults, folds, erosion, deposition) and infrastructure (tunnels, boreholes, excavations). Model setup, defined as the activities of geometry definition, mesh generation (creation, optimization, modification, refinement, de-refinement, smoothing), and assignment of material properties, initial conditions and boundary conditions, requires specialized software tools to automate and streamline the process. In addition, some model setup tools will provide more utility if they are designed to interface with and meet the needs of a particular flow and transport software suite. A control-volume discretization that uses a two-point flux approximation is, for example, most accurate when the underlying control volumes are 2D or 3D Voronoi tessellations. In this paper we present the coupling of LaGriT, a mesh generation and model setup software suite, and TOUGH2 to model subsurface flow problems, and we show an example of how LaGriT can be used as a model setup tool for the generation of a Voronoi mesh for the simulation program TOUGH2. To generate the MESH file for TOUGH2 from the LaGriT output, a standalone module, Lagrit2Tough2, was developed; it is presented here and will be included in a future release of LaGriT. Thanks to the modular and command-based structure of LaGriT, the method presented here for generating a Voronoi mesh for TOUGH2 is well suited to meshing complex models.
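For orientation, a simplified writer for the two blocks a TOUGH2 MESH file contains (ELEME for cells, CONNE for connections) might look like the sketch below; the fixed-format field layout follows the common convention, but this is an illustration, not the Lagrit2Tough2 module:

    def write_tough2_mesh(cells, connections, path="MESH"):
        # cells: (name, material, volume, x, y, z); names must be 5 characters.
        # connections: (name1, name2, isot, d1, d2, area).
        with open(path, "w") as f:
            f.write("ELEME\n")
            for name, mat, vol, x, y, z in cells:
                f.write(f"{name:<5}{'':10}{mat:>5}{vol:10.4E}{'':20}"
                        f"{x:10.3E}{y:10.3E}{z:10.3E}\n")
            f.write("\nCONNE\n")
            for n1, n2, isot, d1, d2, area in connections:
                f.write(f"{n1:<5}{n2:<5}{'':15}{isot:5d}"
                        f"{d1:10.4E}{d2:10.4E}{area:10.4E}\n")
            f.write("\n")

    write_tough2_mesh([("A11 0", "ROCK1", 1.0, 0.0, 0.0, -0.5)],
                      [("A11 0", "A11 1", 3, 0.5, 0.5, 1.0)])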
Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)
NASA Astrophysics Data System (ADS)
Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David
2018-01-01
Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life. The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet, whether from emission, scattering, or in transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation. The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools and inter-model comparisons, but it will include a "run-on-demand" portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or “checking in” new model simulations that are in accordance with the community-derived standards. Additionally, the results of inter-model comparisons will be used to produce open source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties on the climates of various planetary targets.
Web-Enabled Systems for Student Access.
ERIC Educational Resources Information Center
Harris, Chad S.; Herring, Tom
1999-01-01
California State University, Fullerton is developing a suite of server-based, Web-enabled applications that distribute the functionality of its student information system software to external customers without modifying the mainframe applications or databases. The cost-effective, secure, and rapidly deployable business solution involves using the…
The Use of AMET and Automated Scripts for Model Evaluation
The Atmospheric Model Evaluation Tool (AMET) is a suite of software designed to facilitate the analysis and evaluation of meteorological and air quality models. AMET matches the model output for particular locations to the corresponding observed values from one or more networks ...
Development of an Aeromedical Scientific Information System for Aviation Safety
2008-01-01
mathematics, engineering, computer hardware, software, and networking, was assembled to glean the most knowledge from the complicated aeromedical...9, SPlus Enterprise Developer 8, and Insightful Miner version 7. Process flow charts were done with SmartDraw Suite Edition version 7. Static and
Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike
2017-07-07
Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git ), and a live demo is available at http://democlmsvault.tyerslab.com/ .
Tomar, Navneet; Mishra, Akhilesh; Mrinal, Nirotpal; Jayaram, B.
2016-01-01
Transcription factors (TFs) bind at multiple sites in the genome and regulate the expression of many genes. Regulating TF binding in a gene-specific manner remains a formidable challenge in drug discovery because the same binding motif may be present at multiple locations in the genome. Here, we present Onco-Regulon (http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm), an integrated database of regulatory motifs of cancer genes clubbed with Unique Sequence-Predictor (USP), a software suite that identifies unique sequences for each of these regulatory DNA motifs at the specified position in the genome. USP works by extending a given DNA motif in the 5′→3′, 3′→5′ or both directions, adding one nucleotide at each step, and calculating the frequency of each extended motif in the genome with the Frequency Counter program. This step is iterated until the frequency of the extended motif becomes unity in the genome. Thus, for each given motif, we get three possible unique sequences. The Closest Sequence Finder program predicts off-target drug binding in the genome. Inclusion of DNA-protein structural information further makes Onco-Regulon a highly informative repository for gene-specific drug development. We believe that Onco-Regulon will help researchers to design drugs which will bind to an exclusive site in the genome with, theoretically, no off-target effects. Database URL: http://www.scfbio-iitd.res.in/software/onco/NavSite/index.htm PMID:27515825
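The extension idea is easy to sketch; below is a toy 5′→3′ version over a genome given as a plain string (real genomes and the actual USP implementation are far more involved, and str.count only counts non-overlapping matches):

    def unique_extension(genome, motif, start):
        # start: index of the occurrence of `motif` we want to make unique.
        seq, end = motif, start + len(motif)
        while genome.count(seq) > 1 and end < len(genome):
            seq += genome[end]  # extend 5'->3' by one nucleotide
            end += 1
        return seq if genome.count(seq) == 1 else None

    genome = "ACGTACGTTACGTAGGACGTT"
    print(unique_extension(genome, "ACGT", 9))  # -> "ACGTAG"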
NASA Astrophysics Data System (ADS)
Joyce, M.; Ramirez, P.; Boustani, M.; Mattmann, C. A.; Khudikyan, S.; McGibbney, L. J.; Whitehall, K. D.
2014-12-01
Apache Open Climate Workbench (OCW; https://climate.apache.org/) is a Top-Level Project at the Apache Software Foundation that aims to provide a suite of tools for performing climate science evaluations using model outputs from a multitude of different sources (ESGF, CORDEX, U.S. NCA, NARCCAP) with remote sensing data from NASA, NOAA, and other agencies. Apache OCW is the second NASA project to become a Top-Level Project at the Apache Software Foundation. It grew out of the Jet Propulsion Laboratory's (JPL) Regional Climate Model Evaluation System (RCMES) project, a collaboration between JPL and the University of California, Los Angeles' Joint Institute for Regional Earth System Science and Engineering (JIFRESSE). Apache OCW provides scientists and developers with tools for data manipulation, metrics for dataset comparisons, and a visualization suite. In addition to a powerful low-level API, Apache OCW also supports a web application for quick, browser-controlled evaluations, a command line application for local evaluations, and a virtual machine for isolated experimentation with minimal setup. This talk will look at the difficulties and successes of moving a closed community research project out into the wild world of open source. We'll explore the growing pains Apache OCW went through to become a Top-Level Project at the Apache Software Foundation as well as the benefits gained by opening up development to the broader climate and computer science communities.
Importance and effects of altered workplace ergonomics in modern radiology suites.
Harisinghani, Mukesh G; Blake, Michael A; Saksena, Mansi; Hahn, Peter F; Gervais, Debra; Zalis, Michael; da Silva Dias Fernandes, Leonor; Mueller, Peter R
2004-01-01
The transition from a film-based to a filmless soft-copy picture archiving and communication system (PACS)-based environment has resulted in improved work flow as well as increased productivity, diagnostic accuracy, and job satisfaction. Adapting to this filmless environment in an efficient manner requires seamless integration of various components such as PACS workstations, the Internet and hospital intranet, speech recognition software, paperless electronic hospital medical records, e-mail, office software, and telecommunications. However, the importance of optimizing workplace ergonomics has received little attention. Factors such as the position of the work chair, workstation table, keyboard, mouse, and monitors, along with monitor refresh rates and ambient room lighting, have become secondary considerations. Paying close attention to the basics of workplace ergonomics can go a long way in increasing productivity and reducing fatigue, thus allowing full realization of the potential benefits of a PACS. Optimization of workplace ergonomics should be considered in the basic design of any modern radiology suite. Copyright RSNA, 2004
Engineering Software Suite Validates System Design
NASA Technical Reports Server (NTRS)
2007-01-01
EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.
DOT National Transportation Integrated Search
2010-05-01
The Federal Highway Administration (FHWA) established the Carbon Sequestration Pilot Program (CSPP) in 2008 to assess whether a roadside carbon sequestration effort through modified maintenance and management practices is appropriate and feasible for...
Software Schedules Missions, Aids Project Management
NASA Technical Reports Server (NTRS)
2008-01-01
NASA missions require advanced planning, scheduling, and management, and the Space Agency has worked extensively to develop the programs and software suites necessary to facilitate these complex missions. These enormously intricate undertakings have hundreds of active components that need constant management and monitoring. It is no surprise, then, that the software developed for these tasks is often applicable in other high-stress, complex environments, like in government or industrial settings. NASA work over the past few years has resulted in a handful of new scheduling, knowledge-management, and research tools developed under contract with one of NASA s partners. These tools have the unique responsibility of supporting NASA missions, but they are also finding uses outside of the Space Program.
SBSI: an extensible distributed software infrastructure for parameter estimation in systems biology
Adams, Richard; Clark, Allan; Yamaguchi, Azusa; Hanlon, Neil; Tsorman, Nikos; Ali, Shakir; Lebedeva, Galina; Goltsov, Alexey; Sorokin, Anatoly; Akman, Ozgur E.; Troein, Carl; Millar, Andrew J.; Goryanin, Igor; Gilmore, Stephen
2013-01-01
Summary: Complex computational experiments in Systems Biology, such as fitting model parameters to experimental data, can be challenging to perform. Not only do they frequently require a high level of computational power, but the software needed to run the experiment needs to be usable by scientists with varying levels of computational expertise, and modellers need to be able to obtain up-to-date experimental data resources easily. We have developed a software suite, the Systems Biology Software Infrastructure (SBSI), to facilitate the parameter-fitting process. SBSI is a modular software suite composed of three major components: SBSINumerics, a high-performance library containing parallelized algorithms for performing parameter fitting; SBSIDispatcher, a middleware application to track experiments and submit jobs to back-end servers; and SBSIVisual, an extensible client application used to configure optimization experiments and view results. Furthermore, we have created a plugin infrastructure to enable project-specific modules to be easily installed. Plugin developers can take advantage of the existing user-interface and application framework to customize SBSI for their own uses, facilitated by SBSI’s use of standard data formats. Availability and implementation: All SBSI binaries and source-code are freely available from http://sourceforge.net/projects/sbsi under an Apache 2 open-source license. The server-side SBSINumerics runs on any Unix-based operating system; both SBSIVisual and SBSIDispatcher are written in Java and are platform independent, allowing use on Windows, Linux and Mac OS X. The SBSI project website at http://www.sbsi.ed.ac.uk provides documentation and tutorials. Contact: stg@inf.ed.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23329415
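As a toy stand-in for the kind of optimization job SBSINumerics distributes (the decaying-exponential model and synthetic data below are hypothetical, not an SBSI workflow):

    import numpy as np
    from scipy.optimize import least_squares

    # Synthetic time-course data from y = a*exp(-k*t) plus noise.
    t_obs = np.linspace(0.0, 10.0, 20)
    y_obs = 3.0 * np.exp(-0.4 * t_obs) + np.random.default_rng(0).normal(0.0, 0.05, t_obs.size)

    def residuals(params):
        a, k = params
        return a * np.exp(-k * t_obs) - y_obs

    fit = least_squares(residuals, x0=[1.0, 1.0])
    print(fit.x)  # recovered (a, k), close to (3.0, 0.4)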
Web-Based Real-Time Emergency Monitoring
NASA Technical Reports Server (NTRS)
Harvey, Craig A.; Lawhead, Joel
2007-01-01
The Web-based Real-Time Asset Monitoring (RAM) module for emergency operations and facility management enables emergency personnel in federal agencies and local and state governments to monitor and analyze data in the event of a natural disaster or other crisis that threatens a large number of people and property. The software can manage many disparate sources of data within a facility, city, or county. It was developed on industry-standard geospatial software and is compliant with open GIS standards. RAM View can function as a standalone system, or as an integrated plugin module to Emergency Operations Center (EOC) software suites such as REACT (Real-time Emergency Action Coordination Tool), thus ensuring the widest possible distribution among potential users. RAM has the ability to monitor various data sources, including streaming data. Many disparate systems are included in the initial suite of supported hardware systems, such as mobile GPS units, ambient measurements of temperature, moisture and chemical agents, flow meters, air quality, asset location, and meteorological conditions. RAM View displays real-time data streams such as gauge heights from the U.S. Geological Survey gauging stations, flood crests from the National Weather Service, and meteorological data from numerous sources. Data points are clearly visible on the map interface, and attributes as specified in the user requirements can be viewed and queried.
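For a flavor of one such feed, the sketch below polls a stream-gauge reading, assuming the public USGS Instantaneous Values web service (the site code is illustrative, and RAM's actual ingestion layer is not described in the abstract):

    import requests

    def latest_gauge_height(site="07022000"):
        # Query the USGS Instantaneous Values service for gage height (00065).
        resp = requests.get(
            "https://waterservices.usgs.gov/nwis/iv/",
            params={"format": "json", "sites": site, "parameterCd": "00065"},
            timeout=30,
        )
        resp.raise_for_status()
        series = resp.json()["value"]["timeSeries"][0]
        return series["values"][0]["value"][-1]["value"]  # most recent reading, feet

    print(latest_gauge_height())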
Surface and borehole neutron probes for the Construction and Resource Utilization eXplorer (CRUX)
NASA Technical Reports Server (NTRS)
Elphic, Richard C.; Hahn, Sangkoo; Lawrence, David J.; Feldman, William C.; Johnson, Jerome B.; Haldemann, Albert F. C.
2006-01-01
The Construction and Resource Utilization eXplorer (CRUX) project aims to develop an integrated, flexible suite of instruments with data fusion software and an executive controller for the purpose of in situ resource assessment and characterization for future space exploration.
UNCERTAINTY AND THE JOHNSON-ETTINGER MODEL FOR VAPOR INTRUSION CALCULATIONS
The Johnson-Ettinger Model is widely used for assessing the impacts of contaminated vapors on residential air quality. Typical use of this model relies on a suite of estimated data, with few site-specific measurements. Software was developed to provide the public with automate...
Software Tools for Weed Seed Germination Modeling
USDA-ARS?s Scientific Manuscript database
The next generation of weed seed germination models will need to account for variable soil microclimate conditions. In order to predict this microclimate environment we have developed a suite of individual tools (models) that can be used in conjunction with the next generation of weed seed germinati...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grigg, Reid; McPherson, Brian; Lee, Rober
The Southwest Regional Partnership on Carbon Sequestration (SWP), one of seven regional partnerships sponsored by the U.S. Department of Energy (USDOE), carried out five field pilot tests in its Phase II Carbon Sequestration Demonstration effort to validate the most promising sequestration technologies and infrastructure concepts, including three geologic pilot tests and two terrestrial pilot programs. This field testing demonstrated the efficacy of proposed sequestration technologies to reduce or offset greenhouse gas emissions in the region. Risk mitigation; optimization of monitoring, verification, and accounting (MVA) protocols; and effective outreach and communication were additional critical goals of these field validation tests. The program included geologic pilot tests located in Utah, New Mexico, and Texas, and a region-wide terrestrial analysis. Each geologic sequestration test site was intended to include injection of a minimum of ~75,000 tons/year of CO2, with a minimum injection duration of one year. These pilots represent medium-scale validation tests in sinks that host capacity for possible larger-scale sequestration operations in the future. These validation tests also demonstrated a broad variety of carbon sink targets and multiple value-added benefits, including testing of enhanced oil recovery and sequestration, enhanced coalbed methane production, and a geologic sequestration test combined with a local terrestrial sequestration pilot. A regional terrestrial sequestration demonstration was also carried out, with a focus on improved terrestrial MVA methods and reporting approaches specific to the Southwest region.
Formal Verification Toolkit for Requirements and Early Design Stages
NASA Technical Reports Server (NTRS)
Badger, Julia M.; Miller, Sheena Judson
2011-01-01
Efficient flight software development from natural language requirements needs an effective way to test designs earlier in the software design cycle. A method to automatically derive logical safety constraints and the design state space from natural language requirements is described. The constraints can then be checked using a logical consistency checker and also be used in a symbolic model checker to verify the early design of the system. This method was used to verify a hybrid control design for the suit ports on NASA Johnson Space Center's Space Exploration Vehicle against safety requirements.
Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.
2010-01-01
Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software’s framework and provide details to guide users with development of this and similar software. PMID:21258475
Peirano, Daniel J; Pasamontes, Alberto; Davis, Cristina E
2016-09-01
Modern differential mobility spectrometers (DMS) produce complex and multi-dimensional data streams that allow for near-real-time or post-hoc chemical detection for a variety of applications. An active area of interest for this technology is metabolite monitoring for biological applications, and these data sets regularly have unique technical and data analysis end user requirements. While there are initial publications on how investigators have individually processed and analyzed their DMS metabolomic data, there are no user-ready commercial or open source software packages that are easily used for this purpose. We have created custom software uniquely suited to analyze gas chromatograph / differential mobility spectrometry (GC/DMS) data from biological sources. Here we explain the implementation of the software, describe the user features that are available, and provide an example of how this software functions using a previously-published data set. The software is compatible with many commercial or home-made DMS systems. Because the software is versatile, it can also potentially be used for other similarly structured data sets, such as GC/GC and other IMS modalities.
Calypso: a user-friendly web-server for mining and visualizing microbiome-environment interactions.
Zakrzewski, Martha; Proietti, Carla; Ellis, Jonathan J; Hasan, Shihab; Brion, Marie-Jo; Berger, Bernard; Krause, Lutz
2017-03-01
Calypso is an easy-to-use online software suite that allows non-expert users to mine, interpret and compare taxonomic information from metagenomic or 16S rDNA datasets. Calypso has a focus on multivariate statistical approaches that can identify complex environment-microbiome associations. The software enables quantitative visualizations, statistical testing, multivariate analysis, supervised learning, factor analysis, multivariable regression, network analysis and diversity estimates. Comprehensive help pages, tutorials and videos are provided via a wiki page. The web-interface is accessible via http://cgenome.net/calypso/ . The software is programmed in Java, PERL and R and the source code is available from Zenodo ( https://zenodo.org/record/50931 ). The software is freely available for non-commercial users. l.krause@uq.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Development of a customizable software application for medical imaging analysis and visualization.
Martinez-Escobar, Marisol; Peloquin, Catherine; Juhnke, Bethany; Peddicord, Joanna; Jose, Sonia; Noon, Christian; Foo, Jung Leng; Winer, Eliot
2011-01-01
Graphics technology has extended medical imaging tools to the hands of surgeons and doctors, beyond the radiology suite. However, a common issue in most medical imaging software is the added complexity for non-radiologists. This paper presents the development of a unique software toolset that is highly customizable and targeted at general physicians as well as medical specialists. The core functionality includes features such as viewing medical images in two- and three-dimensional representations, clipping, tissue windowing, and coloring. Additional features can be loaded in the form of 'plug-ins' such as tumor segmentation, tissue deformation, and surgical planning. This allows the software to be lightweight and easy to use while still giving the user the flexibility of adding the necessary features, thus catering to a wide range of users.
78 FR 10003 - Proposed Collection; Comment Request for Notice 2009-XX (NOT-151370-08)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-12
... comments concerning Notice 2009-XX, Credit for Carbon Dioxide Sequestration under Section 45Q. [email protected] . SUPPLEMENTARY INFORMATION: Title: Credit for Carbon Dioxide Sequestration under Section... carbon dioxide sequestration (CO 2 sequestration credit) under Sec. 45Q of the Internal Revenue Code...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.; McCorkle, D.; Yang, C.
Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator wrapped by the CASI library developed by Reaction Engineering International to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.
BARTTest: Community-Standard Atmospheric Radiative-Transfer and Retrieval Tests
NASA Astrophysics Data System (ADS)
Harrington, Joseph; Himes, Michael D.; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.
2018-01-01
Atmospheric radiative transfer (RT) codes are used both to predict planetary and brown-dwarf spectra and in retrieval algorithms to infer atmospheric chemistry, clouds, and thermal structure from observations. Observational plans, theoretical models, and scientific results depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. The community needs a suite of test calculations with analytically, numerically, or at least community-verified results. We therefore present the Bayesian Atmospheric Radiative Transfer Test Suite, or BARTTest. BARTTest has four categories of tests: analytically verified RT tests of simple atmospheres (single line in single layer, line blends, saturation, isothermal, multiple line-list combination, etc.), community-verified RT tests of complex atmospheres, synthetic retrieval tests on simulated data with known answers, and community-verified real-data retrieval tests. BARTTest is open-source software intended for community use and further development. It is available at https://github.com/ExOSPORTS/BARTTest. We propose this test suite as a standard for verifying atmospheric RT and retrieval codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G, NASA Astrophysics Data Analysis Program grant NNX13AF38G, and NASA Exoplanets Research Program grant NNX17AB62G.
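As an illustration of what an analytically verified RT test can look like (a minimal sketch, not BARTTest code): for a single line in a single homogeneous layer, the transmitted intensity must follow Beer-Lambert, I(nu) = I0 exp(-tau(nu)), so a solver can be checked against the closed form:

    # Minimal sketch of an analytic RT verification test (illustrative, not BARTTest):
    # a single Gaussian absorption line in one isothermal layer, checked against
    # the Beer-Lambert closed form I(nu) = I0 * exp(-tau(nu)).
    import numpy as np

    def gaussian_tau(nu, nu0, tau0, sigma):
        """Optical-depth profile of a single Gaussian line."""
        return tau0 * np.exp(-0.5 * ((nu - nu0) / sigma) ** 2)

    def test_single_line_single_layer(rt_code):
        """Compare an RT solver's spectrum against the analytic solution.
        `rt_code(nu, tau)` is a stand-in for the solver under test."""
        nu = np.linspace(990.0, 1010.0, 2001)            # wavenumber grid, cm^-1
        tau = gaussian_tau(nu, nu0=1000.0, tau0=2.0, sigma=0.5)
        analytic = 1.0 * np.exp(-tau)                    # Beer-Lambert, I0 = 1
        modeled = rt_code(nu, tau)                       # spectrum from code under test
        assert np.allclose(modeled, analytic, rtol=1e-6)

    # Trivial check: a "solver" that implements Beer-Lambert exactly passes.
    test_single_line_single_layer(lambda nu, tau: np.exp(-tau))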
Electrical Resistance Tomography Field Trials to Image CO2 Sequestration
NASA Astrophysics Data System (ADS)
Newmark, R.
2003-12-01
If geologic formations are used to sequester or store carbon dioxide (CO2) for long periods of time, it will be necessary to verify the containment of injected CO2 by assessing leaks and flow paths, and by understanding the geophysical and geochemical interactions between the CO2 and the geologic minerals and fluids. Remote monitoring methods are preferred, to minimize cost and impact to the integrity of the disposal reservoir. Electrical methods are especially well suited for monitoring processes involving fluids, as electrical properties are most sensitive to the presence and nature of the fluids contained in the medium. High-resolution tomographs of electrical properties have been used with success for site characterization, monitoring subsurface migration of fluids in instances of leaking underground tanks, water infiltration events, subsurface steam floods, contaminant movement, and assessing the integrity of subsurface barriers. These surveys are commonly conducted utilizing vertical arrays of point electrodes in a crosswell configuration. Alternative ways of monitoring the reservoir are desirable due to the high costs of drilling the required monitoring boreholes. Recent field results obtained using steel well casings as long electrodes are also promising. We have conducted field trials to evaluate the effectiveness of long electrode ERT as a potential monitoring approach for CO2 sequestration. In these trials, CO2 is not being sequestered but rather is being used as a solvent for enhanced oil recovery. This setting offers the same conditions expected during sequestration, so monitoring secondary oil recovery allows a test of the method under realistic physical conditions and operational constraints. Field experience has confirmed the challenges identified during model studies. The principal difficulty is the very small signal level, because formation changes occur over only a small segment of the 5000-foot length of the electrodes. In addition, telluric noise can be comparable to the signal levels during periods of geomagnetic activity. Finally, instrumentation stability over long periods is necessary to follow trends in reservoir behavior for several years. Solutions to these and other problems will be presented along with results from the first two years of work at a producing field undergoing CO2 flood. If electrical resistance tomography (ERT) imaging can be performed using existing well casings as long electrodes, it will substantially reduce the cost to monitor CO2 sequestration. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.
Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vivek Agarwal; Nancy J. Lybeck; Randall Bickford
Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
NASGRO 3.0: A Software for Analyzing Aging Aircraft
NASA Technical Reports Server (NTRS)
Mettu, S. R.; Shivakumar, V.; Beek, J. M.; Yeh, F.; Williams, L. C.; Forman, R. G.; McMahon, J. J.; Newman, J. C., Jr.
1999-01-01
Structural integrity analysis of aging aircraft is a critical necessity in view of the increasing numbers of such aircraft in general aviation, the airlines and the military. Efforts are in progress by NASA, the FAA and the DoD to focus attention on aging aircraft safety. The present paper describes the NASGRO software which is well-suited for effectively analyzing the behavior of defects that may be found in aging aircraft. The newly revised Version 3.0 has many features specifically implemented to suit the needs of the aircraft community. The fatigue crack growth computer program NASA/FLAGRO 2.0 was originally developed to analyze space hardware such as the Space Shuttle, the International Space Station and the associated payloads. Due to popular demand, the software was enhanced to suit the needs of the aircraft industry. Major improvements in Version 3.0 are the incorporation of the ability to read aircraft spectra of unlimited size, generation of common aircraft fatigue load blocks, and the incorporation of crack-growth models which include load-interaction effects such as retardation due to overloads and acceleration due to underloads. Five new crack-growth models, viz., generalized Willenborg, modified generalized Willenborg, constant closure model, Walker-Chang model and the deKoning-Newman strip-yield model, have been implemented. To facilitate easier input of geometry, material properties and load spectra, a Windows-style graphical user interface has been developed. Features to quickly change the input and rerun the problem as well as examine the output are incorporated. NASGRO has been organized into three modules, the crack-growth module being the primary one. The other two modules are the boundary element module and the material properties module. The boundary-element module provides the ability to model and analyze complex two-dimensional problems to obtain stresses and stress-intensity factors. The material properties module allows users to store and curve-fit fatigue-crack growth data. On-line help and documentation are provided for each of the modules. In addition to the popular PC windows version, a unix-based X-windows version of NASGRO is also available. A portable C++ class library called WxWindows was used to facilitate cross-platform availability of the software.
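As context for the crack-growth module (a much-simplified sketch, not NASGRO's models): fatigue crack growth is integrated cycle by cycle from a law such as Paris' da/dN = C (ΔK)^m, and load-interaction effects such as overload retardation enter as modifications to the per-cycle increment. The constants and the retardation rule below are illustrative only:

    # Sketch of cycle-by-cycle crack-growth integration in the spirit of (but far
    # simpler than) NASGRO: Paris law da/dN = C * (dK)^m with a crude retardation
    # factor after overloads. Constants and units are illustrative.
    import math

    C, m = 1e-11, 3.0     # Paris constants (illustrative: m/cycle, MPa*sqrt(m))
    Y = 1.12              # geometry factor, assumed constant here

    def delta_K(delta_sigma, a):
        """Stress-intensity-factor range for crack length a (meters)."""
        return Y * delta_sigma * math.sqrt(math.pi * a)

    def grow_crack(a0, spectrum, a_crit):
        """Integrate growth over a load spectrum (list of stress ranges, MPa),
        slowing growth by a fixed factor once an overload is seen (illustrative)."""
        a, retard = a0, 1.0
        for n, ds in enumerate(spectrum):
            if ds > 1.5 * spectrum[0]:   # crude overload detection
                retard = 0.5             # not a Willenborg model, just the idea
            a += retard * C * delta_K(ds, a) ** m
            if a >= a_crit:
                return n + 1, a          # cycles to reach critical crack length
        return len(spectrum), a

    cycles, a_final = grow_crack(a0=1e-3, spectrum=[100.0] * 200000, a_crit=0.02)
    print(cycles, a_final)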
PH5: HDF5 Based Format for Integrating and Archiving Seismic Data
NASA Astrophysics Data System (ADS)
Hess, D.; Azevedo, S.; Falco, N.; Beaudoin, B. C.
2017-12-01
PH5 is a seismic data format created by IRIS PASSCAL using HDF5. Building PH5 on HDF5 allows for portability and extensibility on a scale that is unavailable in older seismic data formats. PH5 is designed to evolve to accept new data types as they become available in the future and to operate on a variety of platforms (i.e. Mac, Linux, Windows). Exemplifying PH5's flexibility is the evolution from just handling active source seismic data to now including passive source, onshore-offshore, OBS and mixed source seismic data sets. In PH5, metadata is separated from the time series data and stored in a size and performance efficient manner that also allows for easy user interaction and output of the metadata in a format appropriate for the data set. PH5's full-fledged "Kitchen Software Suite" comprises tools for data ingestion (e.g. RefTek, SEG-Y, SEG-D, SEG-2, MSEED), meta-data management, QC, waveform viewing, and data output. This software suite not only includes command line and GUI tools for interacting with PH5, it is also a comprehensive Python package to support the creation of software tools by the community to further enhance PH5. The PH5 software suite is currently being used in multiple capacities, including in-field for creating archive ready data sets as well as by the IRIS Data Management Center (DMC) to offer an FDSN compliant set of web services for serving PH5 data to the community in a variety of standard data and meta-data formats (i.e. StationXML, QuakeML, EventXML, SAC + Poles and Zeroes, MiniSEED, and SEG-Y) as well as StationTXT and ShotText formats. These web services can be accessed via standard FDSN clients such as ObsPy, irisFetch.m, FetchData, and FetchMetadata. This presentation will highlight and demonstrate the benefits of PH5 as a next generation adaptable and extensible data format for use in both archiving and working with seismic data.
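The metadata/time-series separation can be illustrated with plain HDF5 (the group and attribute names here are hypothetical, not PH5's actual schema):

    # Sketch of the HDF5 pattern PH5 uses: waveform arrays stored apart from
    # queryable metadata. Group/attribute names are invented for illustration.
    import h5py
    import numpy as np

    with h5py.File("experiment.ph5-like.h5", "w") as f:
        # Time-series data: chunked, compressed datasets, one per trace.
        f.create_dataset("waveforms/station_001/DAS_1",
                         data=np.random.randn(100000),
                         chunks=True, compression="gzip")
        # Metadata kept separately; cheap to scan without touching the samples.
        st = f.create_group("metadata/station_001")
        st.attrs["sample_rate_hz"] = 500.0
        st.attrs["lat"], st.attrs["lon"] = 34.05, -106.9
        st.attrs["sensor"] = "geophone"

    with h5py.File("experiment.ph5-like.h5", "r") as f:
        st = f["metadata/station_001"]
        print(st.attrs["sample_rate_hz"], f["waveforms/station_001/DAS_1"].shape)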
Scenario Educational Software: Design and Development of Discovery Learning.
ERIC Educational Resources Information Center
Keegan, Mark
This book shows how and why the computer is so well suited to producing discovery learning environments. An examination of the literature outlines four basic modes of instruction: didactic, Socratic, inquiry, and discovery. Research from the fields of education, psychology, and physiology is presented to demonstrate the many strengths of…
Open Source, Meet "User-Generated Science"
ERIC Educational Resources Information Center
Huwe, Terence K.
2009-01-01
This article discusses Research Blogging, a community-run nonprofit organization that is promoting a suite of blogging software to scholars. Research Blogging itself does two things. First, it extends an invitation to a community, and it is open to anyone. Second, it requires its users to follow guidelines. The combination of rigorous guidelines…
USDA-ARS?s Scientific Manuscript database
We describe a suite of software tools for identifying possible functional changes in gene structure that may result from sequence variants. ACE (“Assessing Changes to Exons”) converts phased genotype calls to a collection of explicit haplotype sequences, maps transcript annotations onto them, detect...
Multimedia Madness: Creating with a Purpose
ERIC Educational Resources Information Center
Bodley, Barb; Bremer, Janet
2004-01-01
High school students working in a project-driven environment create "projects with a purpose" that give younger students technology-based activities to help them practice skills in reading, math, spelling and science. An elective semester-long course using the Macromedia suite of programs with the objective of learning the software skills of…
A Management Information System for Bare Base Civil Engineering Commanders
1988-09-01
initial beddown stage. The purpose of this research was to determine the feasibility of developing a microcomputer-based management information system (MIS)...the software best suited to synthesize four of the categories into a prototype field MIS. Keywords: Management information system, Bare bases, Civil engineering, Data bases, Information retrieval.
Status and potential of terrestrial carbon sequestration in West Virginia
Benktesh D. Sharma; Jingxin Wang
2011-01-01
Terrestrial ecosystem management offers cost-effective ways to enhance carbon (C) sequestration. This study utilized C stock and C sequestration in forest and agricultural lands, abandoned mine lands, and harvested wood products to estimate the net current annual C sequestration in West Virginia. Several management options within these components were simulated using a...
Simulating carbon sequestration using cellular automata and land use assessment for Karaj, Iran
NASA Astrophysics Data System (ADS)
Khatibi, Ali; Pourebrahim, Sharareh; Mokhtar, Mazlin Bin
2018-06-01
Carbon sequestration has been proposed as a means of slowing the atmospheric and marine accumulation of greenhouse gases. This study used observed and simulated land use/cover changes to investigate and predict carbon sequestration rates in the city of Karaj. Karaj, a metropolis of Iran, has undergone rapid population expansion and associated changes in recent years, and these changes make it suitable for use as a case study for rapidly expanding urban areas. In particular, high quality agricultural space, green space and gardens have rapidly transformed into industrial, residential and urban service areas. Five classes of land use/cover (residential, agricultural, rangeland, forest and barren areas) were considered in the study; vegetation and soil samples were taken from 20 randomly selected locations. The level of carbon sequestration was determined for the vegetation samples by calculating the amount of organic carbon present using the dry plant weight method, and for soil samples by using the method of Walkley and Black. For each area class, average values of carbon sequestration in vegetation and soil samples were calculated to give a carbon sequestration index. A cellular automata approach was used to simulate changes in the classes. Finally, the carbon sequestration indices were combined with simulation results to calculate changes in carbon sequestration for each class. It is predicted that, in the 15-year period from 2014 to 2029, much agricultural land will be transformed into residential land, resulting in a severe reduction in the level of carbon sequestration. Results from this study indicate that expansion of forest areas in urban counties would be an effective means of increasing the levels of carbon sequestration. Finally, future opportunities to incorporate carbon sequestration into the simulation of land use/cover changes are outlined.
Oliver, Edward R; DeBari, Suzanne E; Giannone, Mariann M; Pogoriler, Jennifer E; Johnson, Ann M; Horii, Steven C; Gebb, Juliana S; Howell, Lori J; Adzick, N Scott; Coleman, Beverly G
2018-02-01
To assess the ability of prenatal ultrasound (US) to identify systemic feeding arteries in bronchopulmonary sequestrations and hybrid lesions, and to report the ability of US to classify bronchopulmonary sequestrations as intralobar or extralobar. Institutional Review Board-approved radiology and clinical database searches from 2008 to 2015 were performed for prenatal lung lesions with final diagnoses of bronchopulmonary sequestrations or hybrid lesions. All patients had detailed US examinations, and most patients had ultrafast magnetic resonance imaging (MRI). Lesion location, size, and identification of systemic feeding arteries and draining veins were assessed with US. The study consisted of 102 bronchopulmonary sequestrations and 86 hybrid lesions. The median maternal age was 30 years. The median gestational age was 22 weeks 5 days. Of bronchopulmonary sequestrations, 66 had surgical pathologic confirmation, and 100 had postnatal imaging. Bronchopulmonary sequestration locations were intrathoracic (n = 77), intra-abdominal (n = 19), and transdiaphragmatic (n = 6). Of hybrid lesions, 84 had surgical pathologic confirmation, and 83 had postnatal imaging. Hybrid lesion locations were intrathoracic (n = 84) and transdiaphragmatic (n = 2). Ultrasound correctly identified systemic feeding arteries in 86 of 102 bronchopulmonary sequestrations and 79 of 86 hybrid lesions. Of patients who underwent MRI, systemic feeding arteries were reported in 62 of 92 bronchopulmonary sequestrations and 56 of 81 hybrid lesions. Ultrasound identified more systemic feeding arteries than MRI in both bronchopulmonary sequestrations and hybrid lesions (P < .01). Magnetic resonance imaging identified systemic feeding arteries that US did not in only 2 cases. In cases in which both systemic feeding arteries and draining veins were identified, US could correctly predict intrathoracic lesions as intralobar or extralobar in 44 of 49 bronchopulmonary sequestrations and 68 of 73 hybrid lesions. Ultrasound is most accurate for systemic feeding artery detection in bronchopulmonary sequestrations and hybrid lesions and can also type the lesions as intralobar or extralobar when draining veins are evaluated. © 2017 by the American Institute of Ultrasound in Medicine.
Energy efficiency and reduction of CO2 emissions from campsites management in a protected area.
Del Moretto, Deny; Branca, Teresa Annunziata; Colla, Valentina
2018-09-15
Campsites can be a pollution source, mainly due to energy consumption. In addition, green areas, through direct CO2 sequestration and shading, indirectly prevent the CO2 emissions related to energy consumption. The methodology presented in this paper enabled assessment of the annual CO2 emissions directly related to campsite management, and of the consequent environmental impact, in campsite clusters in Tuscany. The software i-Tree Canopy was used to evaluate, in terms of canopy cover, the tonnes of CO2 sequestered by the vegetation within each campsite. Energy and water consumption from 2012 to 2015 was assessed for each campsite. The ranking of campsites by sequestered CO2 followed their size. According to the "T-Tree" (canopy cover) indicator, a larger canopy cover area means that less additional tree-covered outdoor area is needed to sequester the remaining pollutants. The analysis shows that the campsites considered, which are located in a highly naturalistic park, present significant positive aspects both in terms of CO2 emission reductions and of energy efficiency. However, significant margins for improvement remain, and these are analysed in the paper. Copyright © 2018 Elsevier Ltd. All rights reserved.
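The bookkeeping implied above can be sketched as follows (all numbers and names are illustrative, not the paper's data): net CO2 balance per campsite is management emissions minus canopy sequestration, with sequestration estimated from canopy area times a per-hectare rate:

    # Illustrative sketch of the campsite CO2 bookkeeping (invented numbers):
    # net balance = management emissions - canopy sequestration.
    SEQ_RATE_T_PER_HA = 3.0   # assumed tonnes CO2 sequestered per ha canopy per year

    campsites = {
        # name: (canopy cover in ha, annual CO2 emissions from energy use in t)
        "campsite_A": (12.0, 55.0),
        "campsite_B": (4.5, 38.0),
    }

    for name, (canopy_ha, emissions_t) in campsites.items():
        sequestered = canopy_ha * SEQ_RATE_T_PER_HA
        net = emissions_t - sequestered
        print(f"{name}: sequestered={sequestered:.1f} t, net={net:+.1f} t CO2/yr")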
APPLICATION AND DEVELOPMENT OF APPROPRIATE TOOLS AND TECHNOLOGIES FOR COST-EFFECTIVE CARBON
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Sandra Brown; Ellen Hawes
2003-09-01
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas impacts. The research described in this report occurred between July 1, 2002 and June 30, 2003. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: advanced videography testing; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
A self-referential HOWTO on release engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galassi, Mark C.
Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.
NASA Astrophysics Data System (ADS)
Bellerive, Nathalie
The research project hypothesis is that CO2 capture and sequestration technologies (CSC) lead to a significant decrease in global warming, but increase the impacts in all other categories considered in the study. This is because the processes used for CO2 capture and sequestration require additional quantities of raw materials and energy. Two other objectives are described in this project. The first is the modeling of an Integrated Gasification Combined Cycle power plant for which there is no known generic data. The second is to select the right hypotheses regarding electrical production technologies, CO2 capture, compression and transportation by pipeline and finally sequestration. "Life Cycle Assessment" (LCA) analyses were chosen for this research project. LCA is an exhaustive quantitative method used to evaluate potential environmental impacts associated with a product, a service or an activity from resource extraction to waste elimination. This tool is governed by ISO 14040 through ISO 14049 and is supported by the Society of Environmental Toxicology and Chemistry (SETAC) and the United Nations Environment Program (UNEP). Two power plants were studied, the Integrated Gasification Combined Cycle (IGCC) power plant and the Natural Gas Combined Cycle (NGCC) power plant. In order to sequester CO2 in geological formations, it is necessary to extract CO2 from emission flows. For the IGCC power plant, CO2 was captured before the combustion phase; for the NGCC power plant, capture was done in the post-combustion phase. Once the CO2 was isolated, it was compressed and directed through a transportation pipeline 1,000 km in length, running over land and under the sea. It is hypothesized that the power plant is 300 km from the shore and the sequestration platform 700 km from France's shore, in the North Sea. The IGCC power plant modeling and data selection regarding CO2 capture and sequestration were done using primary data from industry and the Ecoinvent generic database (Version 1.2). This database was selected due to its European source. Finally, technical calculations and literature were used to complete the data inventory, which was validated by electrical experts in order to increase data and modeling precision. Results were similar for the IGCC and NGCC power plants using Impact 2002+, an impact assessment method. Global warming potential decreased by 67% with the implementation of CO2 capture and sequestration compared to systems without CSC. Results for all other impact categories showed increases of 16% to 116% in relative proportions compared to systems without CSC. The main contributor was the additional quantity of energy required to operate the CO2 capture and compression facilities. This additional energy penalized the power plant's overall efficiency because of the increase in the quantity of fossil fuel that needed to be extracted and consumed. The increase in other impacts was mainly due to additional electricity, fossil fuel (for extraction, treatment and transportation) and additional emissions generated during power plant operations. A scenario analysis was done to study the sensitivity and variability of uncertain data during the software modeling of a power plant. Data on power plant efficiency are the most variable and sensitive during modeling, followed by the length of the transportation pipeline and the leakage rate during CO2 sequestration.
This analysis is noteworthy because the maximum-efficiency scenario with capture (with a short CO2 transportation distance and a low leakage rate) obtained better results on all impact category indicators than the minimum-efficiency scenario without capture. In fact, positive results on all category indicators were possible in the comparison between the two cases (with and without capture). (Abstract shortened by UMI.)
Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard Strydom
2010-06-01
The need for a defendable and systematic Uncertainty and Sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.
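The GRS-style statistical approach that SUSA implements can be sketched as follows (a stand-in model, not SUSA itself): sample the uncertain inputs, run the code once per sample, bound the output using Wilks' formula (59 runs support a one-sided 95%/95% tolerance limit via the sample maximum), and rank input importance with Spearman correlations:

    # Sketch of GRS-style uncertainty/sensitivity analysis (stand-in simulator).
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    n_runs = 59   # Wilks: 59 runs -> one-sided 95%/95% bound from the maximum

    # Uncertain inputs with illustrative distributions.
    power = rng.normal(100.0, 2.0, n_runs)         # % of nominal power
    conductivity = rng.uniform(0.9, 1.1, n_runs)   # conductivity multiplier

    def simulator(p, k):
        """Stand-in for a coupled core analysis code."""
        return 800.0 + 4.0 * p / k + rng.normal(0.0, 5.0)

    peak_temp = np.array([simulator(p, k) for p, k in zip(power, conductivity)])

    print("95/95 upper bound on peak temperature:", peak_temp.max())
    for name, x in [("power", power), ("conductivity", conductivity)]:
        rho, _ = spearmanr(x, peak_temp)   # rank correlation as importance measure
        print(f"Spearman({name}, peak_temp) = {rho:+.2f}")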
Policy Process Editor for P3BM Software
NASA Technical Reports Server (NTRS)
James, Mark; Chang, Hsin-Ping; Chow, Edward T.; Crichton, Gerald A.
2010-01-01
A computer program enables generation, in the form of graphical representations of process flows with embedded natural-language policy statements, input to a suite of policy-, process-, and performance-based management (P3BM) software. This program (1) serves as an interface between users and the Hunter software, which translates the input into machine-readable form; and (2) enables users to initialize and monitor the policy-implementation process. This program provides an intuitive graphical interface for incorporating natural-language policy statements into business-process flow diagrams. Thus, the program enables users who dictate policies to intuitively embed their intended process flows as they state the policies, reducing the likelihood of errors and reducing the time between declaration and execution of policy.
Carpeggiani, Clara; Paterni, Marco; Caramella, Davide; Vano, Eliseo; Semelka, Richard C; Picano, Eugenio
2012-11-01
Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient's cumulative lifetime radiation exposure would facilitate provider-patient communication. The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors as a part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from American Heart Association Radiological Imaging 2009 guidelines and UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risks were derived from the Biological Effects of Ionising Radiation VII Committee, 2006. With simple input functions (demographics, age, gender), the user selects, from a predetermined menu, variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and graphic (cumulative temporal trends of exposure, cancer cases out of 100 exposed persons) display. A simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (in extra % lifetime cancer risk), with simple numbers quantifying lifetime extra cancer risk. Pictorial display of radiation risk may be valuable for increasing radiological awareness in cardiologists. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Brady, J. J.; Tweedie, C. E.; Escapita, I. J.
2009-12-01
There is a fundamental need to improve capacities for monitoring environmental change using remote sensing technologies. Recently, researchers have begun using Unmanned Aerial Vehicles (UAVs) to expand and improve upon remote sensing capabilities. Limitations to most non-military and relatively small-scale Unmanned Aircraft Systems (UASs) include a need to develop more reliable communications between ground and aircraft, tools to optimize flight control, real-time data processing, and means of visually ascertaining the quantity of data collected while in the air. Here we present a prototype software system that has enhanced communication between ground and the vehicle, can synthesize near-real-time data acquired from sensors on board, can log operation data during flights, and can visually demonstrate the amount and quality of data for a sampling area. This software has the capacity to greatly improve the utilization of UAS in the environmental sciences. The software system is being designed for use on a paraglider UAV that has a suite of sensors suitable for characterizing the footprints of eddy covariance towers situated in the Chihuahuan Desert and in the Arctic. Sensors on board relay operational flight data (airspeed, ground speed, latitude, longitude, pitch, yaw, roll, acceleration, and video) as well as data from a suite of customized sensors. Additional sensors can be added to an on-board laptop or a CR1000 data logger, thereby allowing data from these sensors to be visualized in the prototype software. This poster will describe the development, use, and customization of our UAS, and multimedia will be available during AGU to illustrate the system in use. [Figures: UAV on workbench in the lab; UAV in flight]
100% Solids Polyurethane Sequestration Coating
2014-04-11
100% Solids Polyurethane Sequestration Coating: Final Technical Report (Topic #CBD13-101, Contract #W911NF-13-P-0010, Proposal #63958CHSB1). Distribution Unlimited. The views, opinions and/or findings contained in this report are those of the...
Calibrating LOFAR using the Black Board Selfcal System
NASA Astrophysics Data System (ADS)
Pandey, V. N.; van Zwieten, J. E.; de Bruyn, A. G.; Nijboer, R.
2009-09-01
The Black Board SelfCal (BBS) system is designed as the final processing system to carry out the calibration of LOFAR in an efficient way. In this paper we give a brief description of its architectural and software design, including its distributed computing approach. A confusion-limited deep all-sky image (38-62 MHz), obtained by calibrating LOFAR test data with the BBS suite, is shown as a sample result. The present status and future directions of development of the BBS suite are also touched upon. Although BBS is mainly developed for LOFAR, it may also be used to calibrate other instruments once their specific algorithms are plugged in.
The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.
Adolf-Bryfogle, Jared; Dunbrack, Roland L
2013-01-01
The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
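For context, the scripting layer that the PyRosetta Toolkit GUI wraps can be exercised in a few lines; the sketch below shows a minimal PyRosetta session (function names follow the commonly documented PyRosetta API, and input.pdb is a placeholder for a local structure file):

    # Minimal PyRosetta session sketch; requires a local PDB file.
    import pyrosetta

    pyrosetta.init()                              # start Rosetta with default options
    pose = pyrosetta.pose_from_pdb("input.pdb")   # load a structure into a Pose
    scorefxn = pyrosetta.get_fa_scorefxn()        # default full-atom score function
    print("total score:", scorefxn(pose))         # evaluate the pose

The GUI's value is in packaging protocols like this, together with setup and results analysis, behind menus for users with little programming experience.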
A discontinuous Galerkin method for gravity-driven viscous fingering instabilities in porous media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scovazzi, G.; Gerstenberger, A.; Collis, S. S.
2013-01-01
We present a new approach to the simulation of gravity-driven viscous fingering instabilities in porous media flow. These instabilities play a very important role during carbon sequestration processes in brine aquifers. Our approach is based on a nonlinear implementation of the discontinuous Galerkin method, and possesses a number of key features. First, the method developed is inherently high order, and is therefore well suited to study unstable flow mechanisms. Secondly, it maintains high-order accuracy on completely unstructured meshes. The combination of these two features makes it a very appealing strategy in simulating the challenging flow patterns and very complex geometries of actual reservoirs and aquifers. This article includes an extensive set of verification studies on the stability and accuracy of the method, and also features a number of computations with unstructured grids and non-standard geometries.
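For context, a common mathematical model for this class of problem (assumed here for illustration; the paper's exact system may differ) couples incompressible Darcy flow with transport of the dissolved CO2 concentration c, whose effect on brine density drives the fingering:

    \begin{align}
    \nabla\cdot\mathbf{u} &= 0, \\
    \mathbf{u} &= -\frac{\mathbf{K}}{\mu(c)}\left(\nabla p - \rho(c)\,\mathbf{g}\right), \\
    \phi\,\frac{\partial c}{\partial t} + \nabla\cdot(c\,\mathbf{u}) &= \nabla\cdot\left(\phi\,\mathbf{D}\,\nabla c\right),
    \end{align}

where K is the permeability tensor, phi the porosity, and D the dispersion tensor. Because rho(c) increases as CO2 dissolves into brine, denser fluid sits above lighter fluid and gravity-driven fingers form; high-order discontinuous Galerkin discretizations of the transport equation resolve these sharp fronts with little numerical diffusion.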
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. D. White; B. P. McGrail; S. K. Wurstner
Displacing natural gas and petroleum with carbon dioxide is a proven technology for producing conventional geologic hydrocarbon reservoirs, and producing additional yields from abandoned or partially produced petroleum reservoirs. Extending this concept to natural gas hydrate production offers the potential to enhance gas hydrate recovery with concomitant permanent geologic sequestration. Numerical simulation was used to assess a suite of carbon dioxide injection techniques for producing gas hydrates from a variety of geologic deposit types. Secondary hydrate formation was found to inhibit contact of the injected CO2 regardless of injectate phase state, thus diminishing the exchange rate due to pore clogging and hydrate zone bypass of the injected fluids. Additional work is needed to develop methods of artificially introducing high-permeability pathways in gas hydrate zones if injection of CO2 in either gas, liquid, or micro-emulsion form is to be more effective in enhancing gas hydrate production rates.
Altszyler, Edgar; Ventura, Alejandra C; Colman-Lerner, Alejandro; Chernomoretz, Ariel
2017-01-01
Ultrasensitive response motifs, capable of converting graded stimuli into binary responses, are well-conserved in signal transduction networks. Although it has been shown that a cascade arrangement of multiple ultrasensitive modules can enhance the system's ultrasensitivity, how a given combination of layers affects a cascade's ultrasensitivity remains an open question for the general case. Here, we introduce a methodology that allows us to determine the presence of sequestration effects and to quantify the relative contribution of each module to the overall cascade's ultrasensitivity. The proposed analysis framework provides a natural link between global and local ultrasensitivity descriptors and it is particularly well-suited to characterize and understand mathematical models used to study real biological systems. As a case study, we have considered three mathematical models introduced by O'Shaughnessy et al. to study a tunable synthetic MAPK cascade, and we show how our methodology can help modelers better understand alternative models.
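As a sketch of the quantities involved (illustrative Hill parameters, not the cited models): local ultrasensitivity can be measured as the log-log slope of a module's response, and for a cascade the slopes of the layers, evaluated at their operating points, multiply:

    # Sketch: two-layer cascade of Hill modules and local ultrasensitivity
    # (log-log slope). Parameters are illustrative only.
    import numpy as np

    def hill(x, K, n):
        return x ** n / (K ** n + x ** n)

    def loglog_slope(f, x, eps=1e-6):
        """Local ultrasensitivity: d ln f / d ln x, by finite differences."""
        return (np.log(f(x * (1 + eps))) - np.log(f(x))) / np.log(1 + eps)

    layer1 = lambda s: hill(s, K=1.0, n=2.0)
    layer2 = lambda y: hill(y, K=0.3, n=2.0)
    cascade = lambda s: layer2(layer1(s))

    s = 0.5
    s1 = loglog_slope(layer1, s)
    s2 = loglog_slope(layer2, layer1(s))     # layer 2 at its operating point
    # By the chain rule, the cascade slope is the product s1 * s2.
    print(s1, s2, loglog_slope(cascade, s))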
NASA Astrophysics Data System (ADS)
Laracuente, Nicholas; Grossman, Carl
2013-03-01
We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphical processor units (GPUs). Recent developments in hardware and software have allowed for general purpose computing with inexpensive GPU hardware. These devices are better suited for emulating hardware autocorrelators than traditional CPU-based software applications because they emphasize parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core PCI graphics card in a computer with a 3.2 GHz Intel i5 CPU running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College.
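The multi-tau binning idea mentioned above can be sketched on the CPU as follows (a simplified NumPy version of the scheme, not the GPU implementation):

    # Simplified multi-tau autocorrelation: correlate at fine lags, then halve
    # the time resolution by rebinning and repeat, so lag spacing grows
    # geometrically. Unnormalized, for illustration only.
    import numpy as np

    def multi_tau_autocorr(x, points_per_level=8, levels=6):
        lags, values = [], []
        data = np.asarray(x, dtype=float)
        dt = 1
        for _ in range(levels):
            n = len(data)
            for k in range(1, points_per_level + 1):
                if k >= n:
                    return np.array(lags), np.array(values)
                lags.append(k * dt)
                values.append(np.mean(data[:-k] * data[k:]))
            # Rebin: average adjacent samples, doubling the bin width.
            m = (n // 2) * 2
            data = 0.5 * (data[0:m:2] + data[1:m:2])
            dt *= 2
        return np.array(lags), np.array(values)

    rng = np.random.default_rng(1)
    lags, g = multi_tau_autocorr(rng.poisson(5.0, 1 << 16))
    print(lags[:12])   # quasi-logarithmic lag grid: 1..8, then 10, 12, ...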
Spectral Characterization of Analog Samples in Anticipation of OSIRIS-REx's Arrival at Bennu
NASA Technical Reports Server (NTRS)
Donaldson Hanna, K. L.; Schrader, D. L.; Bowles, N. E.; Clark, B. E.; Cloutis, E. A.; Connolly, H. C., Jr.; Hamilton, V. E.; Keller, L. P.; Lauretta, D. S.; Lim, L. F.;
2017-01-01
NASA's Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) mission successfully launched on September 8th, 2016. During its rendezvous with near-Earth asteroid (101955) Bennu beginning in 2018, OSIRIS-REx will characterize the asteroid's physical, mineralogical, and chemical properties in an effort to globally map the properties of Bennu, a primitive carbonaceous asteroid, and choose a sampling location [e.g. 1]. In preparation for these observations, we spectrally characterized a suite of analog samples across visible, near- and thermal-infrared wavelengths and used these in initial tests of phase detection and abundance determination software algorithms. Here we present the thermal infrared laboratory measurements of the analog sample suite measured under asteroid-like conditions, which are relevant to the interpretation of spectroscopic observations by the OSIRIS-REx Thermal Emission Spectrometer (OTES) [2, 3]. This suite of laboratory measurements of asteroid analogs under asteroid-like conditions is the first of their kind.
A Survey of UML Based Regression Testing
NASA Astrophysics Data System (ADS)
Fahad, Muhammad; Nadeem, Aamer
Regression testing is the process of ensuring software quality by analyzing whether changed parts behave as intended and unchanged parts are not affected by the modifications. Since it is a costly process, many techniques have been proposed in the research literature that show testers how to build a regression test suite from an existing test suite at minimum cost. In this paper, we discuss the advantages and drawbacks of using UML diagrams for regression testing and find that UML models help identify changes for regression test selection effectively. We survey the existing UML based regression testing techniques and provide an analysis matrix to give a quick insight into prominent features of the literature work. We also discuss open research issues that remain to be addressed for UML based regression testing, such as managing and reducing the size of the regression test suite and prioritizing test cases to cope with strict schedules and limited resources.
NASA Technical Reports Server (NTRS)
Rice, J. Kevin
2013-01-01
The XTCE GOVSAT software suite contains three tools: validation, search, and reporting. The Extensible Markup Language (XML) Telemetric and Command Exchange (XTCE) GOVSAT Tool Suite is written in Java for manipulating XTCE XML files. XTCE is a Consultative Committee for Space Data Systems (CCSDS) and Object Management Group (OMG) specification for describing the format and information in telemetry and command packet streams. These descriptions are files that are used to configure real-time telemetry and command systems for mission operations. XTCE's purpose is to exchange database information between different systems. XTCE GOVSAT consists of rules for narrowing the use of XTCE for missions. The Validation Tool is used to syntax check GOVSAT XML files. The Search Tool is used to search the GOVSAT XML files (i.e., command and telemetry mnemonics) and view the results. Finally, the Reporting Tool is used to create command and telemetry reports. These reports can be displayed or printed for use by the operations team.
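As a sketch of the kind of mnemonic search the Search Tool performs (in Python rather than the suite's Java, and with an assumed file name; the Parameter element and name attribute follow the public XTCE schema):

    # Sketch: find XTCE Parameter names containing a substring.
    import xml.etree.ElementTree as ET

    def find_mnemonics(path, substring):
        """Return names of XTCE Parameter elements matching `substring`."""
        root = ET.parse(path).getroot()
        hits = []
        for elem in root.iter():
            tag = elem.tag.rsplit("}", 1)[-1]   # strip any XML namespace prefix
            if tag == "Parameter":
                name = elem.get("name", "")
                if substring.upper() in name.upper():
                    hits.append(name)
        return hits

    # e.g. find_mnemonics("govsat_tlm.xml", "BATT") might return
    # ['BATT_VOLT', 'BATT_TEMP'] for a hypothetical telemetry database.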
NASA Astrophysics Data System (ADS)
Burba, G. G.; Madsen, R.; Feese, K.
2013-12-01
The eddy covariance (EC) method is a micrometeorological technique for direct high-speed measurements of the transport of gases and energy between land or water surfaces and the atmosphere [1]. This method allows for observations of gas transport scales from 20-40 times per second to multiple years, represents gas exchange integrated over a large area, from hundreds of square meters to tens of square kilometres, and corresponds to gas exchange from the entire surface, including canopy, and soil or water layers. Gas fluxes, emission and exchange rates are characterized from single-point in situ measurements using permanent or mobile towers, or moving platforms such as automobiles, helicopters, airplanes, etc. Presently, over 600 eddy covariance stations are in operation in over 120 countries [1]. EC is now recognized as an effective method in regulatory and industrial applications, including CCUS [2-10]. Emerging projects utilize EC to continuously monitor large areas before and after the injections, to locate and quantify leakages where CO2 may escape from the subsurface, to improve storage efficiency, and for other CCUS characterizations [5-10]. Although EC is one of the most direct and defensible micrometeorological techniques measuring gas emission and transport, and complete automated stations and processing are readily available, the method is mathematically complex, and requires careful setup and execution specific to the site and project. With this in mind, step-by-step instructions were created in [1] to introduce a novice to the EC method, and to assist in further understanding of the method through more advanced references. In this presentation we provide brief highlights of the eddy covariance method, its application to geological carbon capture, utilization and storage, key requirements, instrumentation and software, and review educational resources particularly useful for carbon sequestration research. References: [1] Burba G. Eddy Covariance Method for Scientific, Industrial, Agricultural and Regulatory Applications. LI-COR Biosciences; 2013. [2] International Energy Agency. Quantification techniques for CO2 leakage. IEA-GHG; 2012. [3] US Department of Energy. Best Practices for Monitoring, Verification, and Accounting of CO2 Stored in Deep Geologic Formations. US DOE; 2012. [4] Liu G. (Ed.). Greenhouse Gases: Capturing, Utilization and Reduction. Intech; 2012. [5] Finley R. et al. An Assessment of Geological Carbon Sequestration Options in the Illinois Basin - Phase III. DOE-MGSC; DE-FC26-05NT42588; 2012. [6] LI-COR Biosciences. Surface Monitoring for Geologic Carbon Sequestration. LI-COR, 980-11916, 2011. [7] Lewicki J., Hilley G. Eddy covariance mapping and quantification of surface CO2 leakage fluxes. GRL, 2009; 36: L21802. [8] Finley R. An Assessment of Geological Carbon Sequestration in the Illinois Basin. Overview of the Decatur-Illinois Basin Site. DOE-MGSC; 2009. [9] Eggleston H., et al. (Eds). IPCC Guidelines for National Greenhouse Gas Inventories, IPCC NGGI P, WMO/UNEP; 2006-2011. [10] Burba G., Madsen R., Feese K. Eddy Covariance Method for CO2 Emission Measurements in CCUS Applications: Principles, Instrumentation and Software. Energy Procedia; Submitted: 1-8.
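At its core, the EC method computes the vertical turbulent flux as the time-averaged covariance of fluctuations in vertical wind speed (w) and gas concentration (c), F = mean(w'c'), over averaging blocks of typically 30 minutes of 10-20 Hz data. A minimal sketch with synthetic data follows (real processing adds coordinate rotation, despiking, and the density and spectral corrections described in [1]):

    # Sketch of the core EC calculation: F = mean(w'c') over one averaging block.
    import numpy as np

    def ec_flux(w, c):
        """Covariance flux from synchronized high-frequency w and c series."""
        w_p = w - w.mean()          # fluctuations about the block mean
        c_p = c - c.mean()          # (Reynolds decomposition)
        return np.mean(w_p * c_p)   # e.g. (m/s)*(mmol/m^3) -> mmol m^-2 s^-1

    rng = np.random.default_rng(2)
    w = rng.normal(0.0, 0.3, 18000)                   # 30 min of 10 Hz data
    c = 15.0 + 0.4 * w + rng.normal(0, 0.2, w.size)   # synthetic correlated tracer
    print(ec_flux(w, c))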
Test Driven Development of Scientific Models
NASA Technical Reports Server (NTRS)
Clune, Thomas L.
2014-01-01
Test-Driven Development (TDD), a software development process that promises many advantages for developer productivity and software reliability, has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By their own testimony, many developers find TDD to be addictive after only a few days of exposure, and find it unthinkable to return to previous practices. After a brief overview of the TDD process and my experience in applying the methodology for development activities at Goddard, I will delve more deeply into some of the challenges that are posed by numerical and scientific software as well as tools and implementation approaches that should address those challenges.
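As a minimal illustration of the rhythm described above (the example and tolerances are chosen here, not taken from the talk): the test is written first, fails until the function exists, and the simplest passing implementation is then kept:

    # TDD rhythm in miniature: write the test first, then the simplest
    # implementation that passes. Plain asserts; also runnable under pytest.
    import math

    def saturation_vapor_pressure(t_celsius):
        """Magnus approximation, hPa (illustrative implementation)."""
        return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

    def test_saturation_vapor_pressure_at_reference_points():
        # Written before the implementation; encodes the expected behavior.
        assert abs(saturation_vapor_pressure(0.0) - 6.112) < 1e-6
        assert abs(saturation_vapor_pressure(20.0) - 23.4) < 0.1

    test_saturation_vapor_pressure_at_reference_points()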
Ffuzz: Towards full system high coverage fuzz testing on binary executables.
Zhang, Bin; Ye, Jiaxi; Bi, Xing; Feng, Chao; Tang, Chaojing
2018-01-01
Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, like fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas in binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug-finding tool, Ffuzz, built on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck both in fuzz testing and symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently.
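The mutation-based half of such a pipeline reduces to a simple loop (a toy sketch; coverage feedback and selective symbolic execution, the paper's actual contributions, are omitted):

    # Toy mutation fuzzer: flip random bytes in a seed input and watch the
    # target for crashes. The "target" is a stand-in for the binary under test.
    import random

    def target(data: bytes):
        """Stand-in parser with a reachable bug."""
        if len(data) > 3 and data[0] == 0x7F and data[1:4] == b"ELF":
            raise RuntimeError("crash: parser bug reached")

    def mutate(seed: bytes) -> bytes:
        buf = bytearray(seed)
        for _ in range(random.randint(1, 4)):
            buf[random.randrange(len(buf))] = random.randrange(256)
        return bytes(buf)

    random.seed(0)
    seed = b"\x7fELF....header...."
    for i in range(10000):
        case = mutate(seed)
        try:
            target(case)
        except Exception as e:
            print(f"input {i} triggered: {e}")
            break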
GeMS: an advanced software package for designing synthetic genes.
Jayaraj, Sebastian; Reid, Ralph; Santi, Daniel V
2005-01-01
A user-friendly, advanced software package for gene design is described. The software comprises an integrated suite of programs, also provided as stand-alone tools, that automatically performs the following tasks in gene design: restriction site prediction, codon optimization for any expression host, restriction site inclusion and exclusion, separation of long sequences into synthesizable fragments, Tm and stem-loop determination, optimal oligonucleotide component design, and design verification/error-checking. The output is a complete design report and a list of optimized oligonucleotides to be prepared for subsequent gene synthesis. The user interface accommodates both inexperienced and experienced users: for inexperienced users, explanatory notes are provided so that detailed instructions are not necessary; for experienced users, a streamlined interface is provided without such notes. The software has been extensively tested in the design and successful synthesis of over 400 kb of genes, many of which exceeded 5 kb in length.
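Two of the listed tasks are simple enough to sketch directly; the snippet below is illustrative only (not GeMS code, whose algorithms are more sophisticated): a Wallace-rule Tm estimate and a fixed-overlap split of a long sequence into synthesizable fragments:

    def wallace_tm(oligo: str) -> float:
        """Wallace rule: Tm ~ 2*(A+T) + 4*(G+C); a rough guide for short oligos."""
        s = oligo.upper()
        return 2.0 * (s.count("A") + s.count("T")) + 4.0 * (s.count("G") + s.count("C"))

    def split_for_synthesis(seq: str, chunk: int = 500, overlap: int = 40) -> list:
        """Cut a long gene into overlapping fragments for later assembly."""
        step = chunk - overlap
        return [seq[i:i + chunk] for i in range(0, max(len(seq) - overlap, 1), step)]

    print(wallace_tm("ATGCGTATTGCC"))                 # -> 36.0
    print(len(split_for_synthesis("ACGT" * 2000)))    # 8 kb toy gene -> 18 fragments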
NASA Technical Reports Server (NTRS)
Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell
1991-01-01
The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the MATLAB technical computing environment (The MathWorks, Natick, MA), a team consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer worked with a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment "g-PRIME." The software was developed from week to week in response to curriculum demands and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting the analysis of neuronal excitability and synaptic transmission in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.
Reporting Differences Between Spacecraft Sequence Files
NASA Technical Reports Server (NTRS)
Khanampompan, Teerapat; Gladden, Roy E.; Fisher, Forest W.
2010-01-01
A suite of computer programs, called seq diff suite, reports differences between the products of other computer programs involved in the generation of sequences of commands for spacecraft. These products consist of files of several types: replacement sequence of events (RSOE), DSN keyword file (DKF, where DSN signifies Deep Space Network), spacecraft activities sequence file (SASF), spacecraft sequence file (SSF), and station allocation file (SAF). These products can include line numbers, request identifications, and other pieces of information that are not relevant to generating command sequence products, yet these fields can give the appearance of many changes to the files, particularly when inspecting file differences with the UNIX diff command. The outputs of prior software tools for reporting differences between such products include differences in these non-relevant fields. In contrast, seq diff suite removes the fields containing the irrelevant information before extracting differences, so that only relevant differences are reported. Thus, seq diff suite is especially useful for reporting changes between successive versions of the various products and, in particular, for flagging differences in fields relevant to the sequence command generation and review process.
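The underlying idea is easy to sketch (the field formats below are invented for illustration; the real sequence products have their own syntax): normalize away the irrelevant fields, then diff what remains:

    import difflib
    import re

    IRRELEVANT = [
        re.compile(r"^\s*\d+\s+"),        # leading line numbers (assumed format)
        re.compile(r"REQUEST_ID=\S+"),    # request identifiers (assumed format)
    ]

    def normalize(line: str) -> str:
        for pattern in IRRELEVANT:
            line = pattern.sub("", line)
        return line.rstrip()

    def relevant_diff(old: list, new: list) -> list:
        a, b = [normalize(l) for l in old], [normalize(l) for l in new]
        return [d for d in difflib.unified_diff(a, b, lineterm="")
                if d.startswith(("+", "-")) and not d.startswith(("+++", "---"))]

    old = ["001 CMD_ARM REQUEST_ID=r41", "002 CMD_FIRE REQUEST_ID=r42"]
    new = ["001 CMD_ARM REQUEST_ID=r77", "002 CMD_VENT REQUEST_ID=r78"]
    print(relevant_diff(old, new))        # only CMD_FIRE -> CMD_VENT is reported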
Developing automation for therapy dosimetry systems using LabVIEW software
NASA Astrophysics Data System (ADS)
Aydin, Selim; Kam, Erol
2018-06-01
Traceability, accuracy, and consistency of radiation measurements are essential in radiation dosimetry, particularly in radiotherapy, where the outcome of treatment is highly dependent on the radiation dose delivered to the patient. It is therefore very important to provide reliable, accurate, and fast calibration services for therapy dosimeters, since the radiation dose delivered to a radiotherapy patient is directly related to the accuracy and reliability of these devices. In this study, we report the performance of in-house developed, computer-controlled data acquisition and monitoring software for commercially available radiation therapy electrometers. The LabVIEW® software suite is used to provide reliable, fast, and accurate calibration services. The software also collects environmental data such as temperature, pressure, and humidity in order to use them in correction-factor calculations. With this software tool, better control over the calibration process is achieved and the need for human intervention is reduced. This is the first software that can control dosimeter systems frequently used in hospital radiotherapy, such as the Unidos Webline, Unidos E, Dose-1, and PC Electrometers.
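One such correction is standard enough to show concretely: for vented ionization chambers, readings are scaled by the temperature-pressure factor k_TP. The sketch below uses the common 20 degC / 101.325 kPa reference convention; whether this particular tool uses exactly these reference values is an assumption:

    def k_tp(temp_c: float, pressure_kpa: float,
             ref_temp_c: float = 20.0, ref_pressure_kpa: float = 101.325) -> float:
        """Air-density correction: corrected_reading = raw_reading * k_TP."""
        return ((273.15 + temp_c) / (273.15 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

    raw = 12.345                                                  # nC, illustrative reading
    print(round(raw * k_tp(temp_c=23.4, pressure_kpa=99.8), 3))   # ~12.679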
Identification of MS-Cleavable and Non-Cleavable Chemically Crosslinked Peptides with MetaMorpheus.
Lu, Lei; Millikin, Robert J; Solntsev, Stefan K; Rolfs, Zach; Scalf, Mark; Shortreed, Michael R; Smith, Lloyd M
2018-05-25
Protein chemical crosslinking combined with mass spectrometry has become an important technique for the analysis of protein structure and protein-protein interactions. A variety of crosslinkers are well developed, but reliable, rapid, and user-friendly tools for large-scale analysis of crosslinked proteins are still needed. Here we report MetaMorpheusXL, a new search module within the MetaMorpheus software suite that identifies both MS-cleavable and non-cleavable crosslinked peptides in MS data. MetaMorpheusXL identifies MS-cleavable crosslinked peptides with an ion-indexing algorithm, which enables efficient searches of large databases. The identification does not require the presence of signature fragment ions, an advantage over similar programs such as XlinkX. One complication associated with the need for signature ions from cleavable crosslinkers such as DSSO (disuccinimidyl sulfoxide) is the requirement for multiple fragmentation types and energy combinations, which is not necessary for MetaMorpheusXL. The ability to perform proteome-wide analysis is another advantage of MetaMorpheusXL over programs such as MeroX and DXMSMS. MetaMorpheusXL is also faster than other currently available MS-cleavable crosslink search programs. It is embedded in MetaMorpheus, an open-source and freely available software suite that provides a reliable, fast, and user-friendly graphical user interface readily accessible to researchers.
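A minimal sketch of the ion-indexing idea (my reading of the general approach, not MetaMorpheusXL's implementation): bin every theoretical fragment m/z into an integer key so that, for each observed peak, candidate peptides come from a dictionary lookup instead of a scan of the whole database:

    from collections import defaultdict

    BIN = 0.01  # Da per bin (assumed tolerance)

    def build_index(peptides):
        """peptides: mapping of peptide id -> list of theoretical fragment m/z."""
        index = defaultdict(set)
        for pep, fragment_mzs in peptides.items():
            for mz in fragment_mzs:
                index[round(mz / BIN)].add(pep)
        return index

    def candidates(index, observed_peaks):
        hits = set()
        for mz in observed_peaks:
            key = round(mz / BIN)
            for k in (key - 1, key, key + 1):   # neighbouring bins for tolerance
                hits |= index.get(k, set())
        return hits

    index = build_index({"PEPTIDEK": [175.119, 276.155], "SAMPLER": [175.119, 362.201]})
    print(candidates(index, [276.16]))          # -> {'PEPTIDEK'}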
NASA Astrophysics Data System (ADS)
Martin, T.; Drissen, L.; Joncas, G.
2015-09-01
SITELLE (installed in 2015 at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont-Mégantic) are the first Imaging Fourier Transform Spectrometers (IFTS) capable of obtaining a hyperspectral data cube that samples a 12-arcminute field of view into four million visible spectra. The result of each observation is made up of two interferometric data cubes which must be merged, corrected, transformed, and calibrated in order to obtain a spectral cube of the observed region ready to be analysed. ORBS is a fully automatic data reduction software package that has been designed entirely for this purpose. The data size (up to 68 GB for the larger science cases) and the computational needs have been challenging, and the highly parallelized, object-oriented architecture of ORBS reflects the solutions adopted, which made it possible to process 68 GB of raw data in less than 11 hours using 8 cores and 22.6 GB of RAM. It is based on a core framework (ORB) designed to support the whole software suite for data analysis (ORCS and OACS), data simulation (ORUS), and data acquisition (IRIS). These components all aim to provide a strong basis for the creation and development of specialized analysis modules that can benefit the scientific community working with SITELLE and SpIOMM.
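The heart of any such reduction is the transform from interferogram to spectrum; the following standalone sketch (illustrative physics with toy values, not ORBS code) recovers an emission line from a synthetic interferogram sampled in optical path difference (OPD):

    import numpy as np

    n, opd_step = 1024, 0.25e-6            # 1024 samples, 0.25 um OPD step (toy values)
    opd = np.arange(n) * opd_step
    sigma = 15000 * 100.0                  # one emission line at 15000 cm^-1, in m^-1
    interferogram = 1.0 + np.cos(2 * np.pi * sigma * opd)

    spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
    wavenumber = np.fft.rfftfreq(n, d=opd_step)        # spectral axis in m^-1
    print(f"recovered line: {wavenumber[np.argmax(spectrum)] / 100.0:.0f} cm^-1")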
de Souza, John Kennedy Schettino; Pinto, Marcos Antonio da Silva; Vieira, Pedro Gabrielle; Baron, Jerome; Tierra-Criollo, Carlos Julio
2013-12-01
The dynamic, accurate measurement of pupil size is extremely valuable for studying a large number of neuronal functions and dysfunctions. Despite tremendous and well-documented progress in image processing techniques for estimating pupil parameters, comparatively little work has been reported on the practical hardware issues involved in designing image acquisition systems for pupil analysis. Here, we describe and validate the basic features of such a system, which is based on a relatively compact, off-the-shelf, low-cost FireWire digital camera. We successfully implemented two configurable modes of video recording: a continuous mode and an event-triggered mode. The interoperability of the whole system is guaranteed by a set of modular software components hosted on a personal computer and written in LabVIEW. An offline analysis suite of image processing algorithms for automatically estimating pupillary and eyelid parameters was assessed using data obtained in human subjects. Our benchmark results show that such measurements can be made in a temporally precise way at sampling frequencies of up to 120 Hz and with an estimated maximum spatial resolution of 0.03 mm. Our software is made available free of charge to the scientific community, allowing end users to either use it as is or modify it to suit their own needs.
Network Monitor and Control of Disruption-Tolerant Networks
NASA Technical Reports Server (NTRS)
Torgerson, J. Leigh
2014-01-01
For nearly a decade, NASA and many researchers in the international community have been developing Internet-like protocols that allow for automated network operations in networks where the individual links between nodes are only sporadically connected. A family of Disruption-Tolerant Networking (DTN) protocols has been developed, and many are reaching CCSDS Blue Book status. A NASA version of DTN known as the Interplanetary Overlay Network (ION) has been flight-tested on the EPOXI spacecraft, and ION is currently being tested on the International Space Station. Experience has shown that in order for a DTN service provider to set up a large-scale multi-node network, a number of network monitor and control technologies need to be fielded along with the basic DTN protocols. The NASA DTN program is developing a standardized means of querying a DTN node to ascertain its operational status, known as the DTN Management Protocol (DTNMP), and the program has developed prototypes of DTNMP software. While DTNMP is a necessary component, it is not sufficient to accomplish network monitor and control of a DTN network. JPL is developing a suite of tools that provide network visualization, performance monitoring, and ION node control. This suite of network monitor and control tools complements the GSFC- and APL-developed DTNMP software, and the combined package can form the basis for flight operations using DTN.
A gesture-controlled projection display for CT-guided interventions.
Mewes, A; Saalfeld, P; Riabikin, O; Skalej, M; Hansen, C
2016-01-01
The interaction with interventional imaging systems within a sterile environment is a challenging task for physicians. Direct physician-machine interaction during an intervention is rather limited because of sterility and workspace restrictions. We present a gesture-controlled projection display that enables direct and natural physician-machine interaction during computed tomography (CT)-based interventions. A graphical user interface is projected onto a radiation shield located in front of the physician, and hand gestures in front of this display are captured and classified using a Leap Motion controller. We propose a gesture set to control basic functions of intervention software, including gestures for 2D image exploration, 3D object manipulation, and selection. Our methods were evaluated in a clinically oriented user study with 12 participants. The results of the user study confirm that the display and the underlying interaction concept are accepted by clinical users. Recognition of the gestures is robust, although there is potential for improvement. Gesture training times are under 10 min but vary heavily between the participants of the study. The developed gestures are connected logically to the intervention software and are intuitive to use. The proposed gesture-controlled projection display gives the radiologist complete control of the intervention software. It opens new possibilities for direct physician-machine interaction during CT-based interventions and is well suited to become an integral part of future interventional suites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malczynski, Leonard A.
This guide addresses software quality in the construction of Powersim® Studio 8 system dynamics simulation models. It is the result of almost ten years of experience with the Powersim suite of system dynamics modeling tools (Constructor and earlier Studio versions). The guide proposes a common look and feel for the construction of Powersim Studio system dynamics models.
The Kamusi Project Edit Engine: A Tool for Collaborative Lexicography.
ERIC Educational Resources Information Center
Benjamin, Martin; Biersteker, Ann
2001-01-01
Discusses the design and implementation of the Kamusi Project Edit Engine, a Web-based software system uniquely suited to the needs of Swahili collaborative lexicography. Describes the edit engine, including organization of the lexicon and the mechanics by which participants use the system, discusses philosophical issues confronted in the design,…
The Effects of Teacher Directed Writing Instruction Combined with SOLO Literacy Suite
ERIC Educational Resources Information Center
Park, Y.; Ambrose, G.; Coleman, M. B.; Moore, T. C.
2017-01-01
The purpose of this study was to examine the effectiveness of an intervention in which teacher-led instruction was combined with computerized writing software to improve paragraph writing for three middle school students with intellectual disability. A multiple probe across participants design was used to evaluate the effectiveness of the…
2003-03-01
within the Automated Cost Estimating Integrated Tools (ACEIT) software suite (version 5.x). With this capability, one can set cost targets or time...not allow the user to vary more than one decision variable. This limitation of the ACEIT approach thus hinders a holistic view when attempting to
Project-Method Fit: Exploring Factors That Influence Agile Method Use
ERIC Educational Resources Information Center
Young, Diana K.
2013-01-01
While the productivity and quality implications of agile software development methods (SDMs) have been demonstrated, research concerning the project contexts where their use is most appropriate has yielded less definitive results. Most experts agree that agile SDMs are not suited for all project contexts. Several project and team factors have been…
Modular Open-Source Software for Item Factor Analysis
ERIC Educational Resources Information Center
Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.
2015-01-01
This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-10
... software to Anschutz Entertainment Group, Inc., to divest Paciolan, Inc. to Comcast-Spectacor, L.P. or... Justice, Antitrust Division, Antitrust Documents Group, 450 Fifth Street, NW., Suite 1010, Washington, DC... Justice, Hoover Office Building- Second Floor, 1305 East Walnut Street, Des Moines, IA 50319; State of...
R-WISE: A Computerized Environment for Tutoring Critical Literacy.
ERIC Educational Resources Information Center
Carlson, P.; Crevoisier, M.
This paper describes a computerized environment for teaching the conceptual patterns of critical literacy. While the full implementation of the software covers both reading and writing, this paper covers only the writing aspects of R-WISE (Reading and Writing in a Supportive Environment). R-WISE consists of a suite of computerized…
Personal computer security: part 1. Firewalls, antivirus software, and Internet security suites.
Caruso, Ronald D
2003-01-01
Personal computer (PC) security in the era of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) involves two interrelated elements: safeguarding the basic computer system itself and protecting the information it contains and transmits, including personal files. HIPAA regulations have toughened the requirements for securing patient information, requiring every radiologist with such data to take further precautions. Security starts with physically securing the computer. Account passwords and a password-protected screen saver should also be set up. A modern antivirus program can easily be installed and configured. File scanning and updating of virus definitions are simple processes that can largely be automated and should be performed at least weekly. A software firewall is also essential for protection from outside intrusion, and an inexpensive hardware firewall can provide yet another layer of protection. An Internet security suite yields additional safety. Regular updating of the security features of installed programs is important. Obtaining a moderate degree of PC safety and security is somewhat inconvenient but is necessary and well worth the effort.
NASA Technical Reports Server (NTRS)
Hamilton, George S.; Williams, Jermaine C.
1998-01-01
This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use, and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e.g., metal bearings, rigid fiberglass torso, flexible cloth limbs, and rubber-coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken: the HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry, such as chemical or fire protective clothing. In summary, the approach provides a moderate-fidelity, usable tool which will run on current notebook computers.
MNE Scan: Software for real-time processing of electrophysiological data.
Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph
2018-06-01
Magnetoencephalography (MEG) and electroencephalography (EEG) are noninvasive techniques for studying the electrophysiological activity of the human brain, and they are therefore well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows adjustment of the stimuli to the subject's responses, optimizing the acquired information, especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis software package based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. MNE Scan development follows a strict software engineering process to enable the approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including a clinical epilepsy study, real-time source estimation, and a Brain Computer Interface (BCI) application. Compared to existing tools, we propose modular software that addresses the clinical software requirements expected by certification authorities while remaining extendable and freely accessible. We conclude that MNE Scan is a first step toward device-independent open-source software that facilitates the transition from basic neuroscience research to both applied sciences and clinical applications.
NASA Technical Reports Server (NTRS)
Mayer, Richard
1988-01-01
The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modeling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing the information to be modeled. This is a user's guide for application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer-aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Four appendices describe hardware and software requirements, installation procedures, and basic hardware usage.
End effector monitoring system: An illustrated case of operational prototyping
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Land, Sherry A.; Thronesbery, Carroll
1994-01-01
Operational prototyping is introduced to help developers apply software innovations to real-world problems, to help users articulate requirements, and to help develop more usable software. Operational prototyping has been applied to an expert system development project. The expert system supports fault detection and management during grappling operations of the Space Shuttle payload bay arm. The dynamic exchanges among operational prototyping team members are illustrated in a specific prototyping session. We discuss the requirements for operational prototyping technology, types of projects for which operational prototyping is best suited and when it should be applied to those projects.
Exploring Digisonde Ionogram Data with SAO-X and DIDBase
NASA Astrophysics Data System (ADS)
Khmyrov, Grigori M.; Galkin, Ivan A.; Kozlov, Alexander V.; Reinisch, Bodo W.; McElroy, Jonathan; Dozois, Claude
2008-02-01
A comprehensive suite of software tools for ionogram data analysis and archiving has been developed at UMLCAR to support the exploration of raw and processed data from the worldwide network of digisondes in a low-latency, user-friendly environment. Paired with the remotely accessible Digital Ionogram Data Base (DIDBase), the SAO Explorer software serves as an example of how an academic institution conscientiously manages its resident data archive while local experts continue to work on design of new and improved data products, all in the name of free public access to the full roster of acquired ionospheric sounding data.
Alvarez, Alejandro; Borgia, Francesco; Guccione, Paolo
2010-02-01
We describe an infant of 8 months who presented with left ventricular dilation due to an extensive intralobar sequestration of the right lung. The pulmonary sequestration was associated with a patent arterial duct and a right aortic arch. Percutaneous closure of the anomalous aberrant artery feeding the sequestrated lung resulted in prompt regression of the left ventricular enlargement.
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar; Blackledge, Christopher; Ferrer, Mike; Margerum, Sarah
2009-01-01
The designers of the Orion Crew Exploration Vehicle (CEV) utilize an intensive simulation program in order to predict the launch and landing characteristics of the Crew Impact Attenuation System (CIAS). The CIAS is the energy-absorbing strut concept that dampens loads to levels sustainable by the crew during landing and consists of the crew module seat pallet that accommodates four to six seated astronauts. An important parameter required for proper dynamic modeling of the CIAS is knowledge of the suited center of mass (COM) variations within the crew population. Significant center of mass variations across suited crew configurations would amplify the inertial effects of the pallet and potentially create unacceptable crew loading during launch and landing. Established suited, whole-body, posture-based mass properties were not available due to the uncertainty of the final CEV seat posture and suit hardware configurations. While unsuited segmental center of mass values can be obtained via regression equations from previous studies, building them into a model that was posture dependent, with custom anthropometry and integrated suit components, proved cumbersome and time consuming. Therefore, the objective of this study was to quantify the effects of posture, suit components, and the expected range of anthropometry on the center of mass of a seated individual. Several elements are required for the COM calculation of a suited human in a seated position: anthropometry; body segment mass; suit component mass; suit component location relative to the body; and joint angles defining the seated posture. Anthropometry and body segment masses used in this study were taken from a selection of three-dimensional human body models, called boundary manikins, which were developed in a previous project. These boundary manikins represent the critical anthropometric dimension extremes for the anticipated astronaut population. Six male manikins and six female manikins, representing a subset of the possible maximum and minimum sized crewmembers, were segmented using point-cloud software to create 17 major body segments. The general approach used to calculate the human mass properties was to utilize center-of-volume outputs from the software for each body segment and apply a homogeneous density function to determine the 3-D coordinates of each segment's mass. Suit components, based on the current consensus regarding predicted suit configuration values, were treated as point masses and were positioned using vector mathematics along the body segments based on anthropometry and COM position. A custom MATLAB script then articulates the body segment and suit positions into a selected seated configuration, using joint angles that characterize a standard seated position and a CEV-specific seated position. Additional MATLAB® scripts are finally used to calculate the composite COM positions in 3-D space for all 12 manikins in both suited and unsuited conditions for both seated configurations. The analysis focused on two aspects: (1) how much the whole-body COM varied from the smallest to the largest subject, and (2) the impact of the suit components on the overall COM in each seat configuration. Across all boundary manikins, the anterior-posterior COM varied by approximately 7 cm, the vertical COM by approximately 9-10 cm, and the mediolateral COM by approximately 1.2 cm from the midline sagittal plane for both seat configurations.
This variation was surprisingly large given the relative proportionality of the mass distribution of the human body. The suit components caused an anterior shift of the total COM of approximately 2 cm and a shift to the right along the mediolateral axis of 0.4 cm for both seat configurations. When the seat is in the standard posture, the suited vertical COM shifts inferiorly by up to 1 cm, whereas in the CEV posture the vertical COM shows no appreciable change. These differences were due to the high proportion of suit mass located in the boots and lower legs and their corresponding distance from the body COM, as well as the prevalence of suit components on the right side of the body.
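The composite calculation itself is compact; here is a hedged sketch of the mass-weighted COM of body segments plus suit point masses (the names and values are illustrative stand-ins, not the study's data):

    import numpy as np

    # (mass_kg, xyz COM position in m) per body segment; illustrative values
    segments = {
        "torso": (30.0, np.array([0.00, 0.00, 0.30])),
        "head":  ( 5.0, np.array([0.00, 0.00, 0.65])),
        "legs":  (25.0, np.array([0.05, 0.00, -0.20])),
    }
    # suit components treated as point masses placed along the body
    suit = {
        "helmet": ( 4.0, np.array([0.00, 0.00, 0.68])),
        "boots":  ( 6.0, np.array([0.10, 0.00, -0.55])),
        "pack":   (15.0, np.array([-0.15, 0.00, 0.35])),
    }

    def composite_com(*groups):
        total_mass = sum(m for g in groups for m, _ in g.values())
        weighted = sum(m * r for g in groups for m, r in g.values())
        return weighted / total_mass

    print("unsuited COM:", composite_com(segments).round(3))
    print("suited COM:  ", composite_com(segments, suit).round(3))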
Sentis, Manuel Lorenzo; Gable, Carl W.
2017-06-15
There are many applications in science and engineering modeling where an accurate representation of a complex model geometry in the form of a mesh is important. In applications of flow and transport in subsurface porous media, this is manifest in models that must capture complex geologic stratigraphy, structure (faults, folds, erosion, deposition) and infrastructure (tunnels, boreholes, excavations). Model setup, defined as the activities of geometry definition, mesh generation (creation, optimization, modification, refine, de-refine, smooth), and assignment of material properties, initial conditions, and boundary conditions, requires specialized software tools to automate and streamline the process. In addition, some model setup tools provide more utility if they are designed to interface with and meet the needs of a particular flow and transport software suite. A control volume discretization that uses a two-point flux approximation is, for example, most accurate when the underlying control volumes are 2D or 3D Voronoi tessellations. In this paper we present the coupling of LaGriT, a mesh generation and model setup software suite, and TOUGH2 to model subsurface flow problems, and we show an example of how LaGriT can be used as a model setup tool for the generation of a Voronoi mesh for the simulation program TOUGH2. To generate the MESH file for TOUGH2 from the LaGriT output, a standalone module, Lagrit2Tough2, was developed; it is presented here and will be included in a future release of LaGriT. Thanks to the modular, command-based structure of LaGriT, the presented method is well suited to generating meshes for complex models.
NASA Astrophysics Data System (ADS)
Sentís, Manuel Lorenzo; Gable, Carl W.
2017-11-01
There are many applications in science and engineering modeling where an accurate representation of a complex model geometry in the form of a mesh is important. In applications of flow and transport in subsurface porous media, this is manifest in models that must capture complex geologic stratigraphy, structure (faults, folds, erosion, deposition) and infrastructure (tunnels, boreholes, excavations). Model setup, defined as the activities of geometry definition, mesh generation (creation, optimization, modification, refine, de-refine, smooth), and assignment of material properties, initial conditions, and boundary conditions, requires specialized software tools to automate and streamline the process. In addition, some model setup tools provide more utility if they are designed to interface with and meet the needs of a particular flow and transport software suite. A control volume discretization that uses a two-point flux approximation is, for example, most accurate when the underlying control volumes are 2D or 3D Voronoi tessellations. In this paper we present the coupling of LaGriT, a mesh generation and model setup software suite, and TOUGH2 (Pruess et al., 1999) to model subsurface flow problems, and we show an example of how LaGriT can be used as a model setup tool for the generation of a Voronoi mesh for the simulation program TOUGH2. To generate the MESH file for TOUGH2 from the LaGriT output, a standalone module, Lagrit2Tough2, was developed; it is presented here and will be included in a future release of LaGriT. Thanks to the modular, command-based structure of LaGriT, this alternative method of generating a Voronoi mesh for TOUGH2 is well suited to complex models.
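To see why Voronoi control volumes matter for this coupling, here is a minimal sketch (illustrative, not LaGriT or TOUGH2 code) of the two-point flux approximation (TPFA) transmissibility between two neighbouring cells; TPFA is consistent when the line joining cell centers is orthogonal to their shared face, which a Voronoi tessellation guarantees by construction:

    import numpy as np

    def tpfa_transmissibility(center_i, center_j, face_area, k_i, k_j):
        """Harmonic average of the two half-cell transmissibilities."""
        d = np.linalg.norm(np.asarray(center_j) - np.asarray(center_i))
        t_i = k_i * face_area / (d / 2.0)
        t_j = k_j * face_area / (d / 2.0)
        return t_i * t_j / (t_i + t_j)

    # flux between cells i and j ~ T * (p_i - p_j) / viscosity
    T = tpfa_transmissibility([0, 0, 0], [1, 0, 0], face_area=1.0,
                              k_i=1e-13, k_j=2e-13)   # permeabilities in m^2
    print(f"transmissibility: {T:.3e} m^3")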
Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.
2016-01-01
The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation of the framework is the construction of a simulation-based optimization software tool from two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that enables the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
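The shape of such a coupling is easy to sketch (assumed names and a toy surrogate objective, not the authors' DAKOTA/MF-OWHM interface): the optimizer treats the simulator as a black box mapping decision variables, here crop acreages, to an objective value:

    import numpy as np
    from scipy.optimize import minimize

    def run_simulation(acreage):
        """Stand-in for a full water-management simulation run. A real
        coupling would write input files, execute the simulator, and
        parse its outputs; here a toy profit-minus-overdraft-penalty
        surrogate is used instead."""
        profit_per_acre = np.array([420.0, 310.0, 150.0])
        water_per_acre = np.array([3.2, 1.9, 0.6])     # acre-feet per acre
        water_available = 2000.0
        profit = acreage @ profit_per_acre
        overdraft = max(acreage @ water_per_acre - water_available, 0.0)
        return -(profit - 500.0 * overdraft)           # minimize the negative

    x0 = np.array([300.0, 300.0, 300.0])
    result = minimize(run_simulation, x0, bounds=[(0, 1000)] * 3, method="L-BFGS-B")
    print(result.x.round(1), -result.fun)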
Geophysical monitoring technology for CO2 sequestration
NASA Astrophysics Data System (ADS)
Ma, Jin-Feng; Li, Lin; Wang, Hao-Fan; Tan, Ming-You; Cui, Shi-Ling; Zhang, Yun-Yin; Qu, Zhi-Peng; Jia, Ling-Yun; Zhang, Shu-Hai
2016-06-01
Geophysical techniques play key roles in measuring, monitoring, and verifying the safety of CO2 sequestration and in quantifying the efficiency of CO2-enhanced oil recovery. Although geophysical monitoring techniques for CO2 sequestration have grown out of conventional oil and gas exploration techniques, the monitoring must be sustained over long periods and faces many barriers and challenges. In this paper, starting from the objectives of CO2 sequestration, we examine the geophysical tasks associated with evaluating geological storage sites and monitoring CO2 sequestration. Based on our review of the scope of geophysical monitoring techniques and our experience in domestic and international carbon capture and sequestration projects, we analyze the inherent difficulties of geophysical monitoring and summarize the lessons learned, especially with respect to 4D seismic acquisition, processing, and interpretation.
Study on the methodology of road carbon sink forest
NASA Astrophysics Data System (ADS)
Wan, Lijuan; Zhang, Yi; Cheng, Dongxiang; Huang, Yanan
2017-01-01
Advanced concepts of forest carbon sinks and forestry carbon sequestration are introduced into a road carbon-sink forest project, and methods for the measurement and carbon monitoring of road carbon-sink forests are explored. Experience and technology have been accumulated, forming a set of technology systems for carbon-sequestration afforestation and for carbon measurement and monitoring on both sides of roads. Updating the green concept and improving afforestation quality along roads, thereby enhancing sequestration and ecological efficiency, is important for realizing low-carbon traffic, energy saving, and emission reduction. Using scientific planting and monitoring methods, soil properties, the carbon sequestration of the soil organic carbon pool, and the carbon sequestration capacity of different tree species were studied and monitored. The selection of high-carbon-sequestration species, silvicultural management, and the measurement and monitoring of carbon sinks are explored.
Si, Yong; Wang, Lihong; Huang, Xiaohua
2018-01-01
REEs in the environment can be absorbed by plants and sequestered by plant phytoliths. Acid rain can directly or indirectly affect plant physiological functions. Currently, the effects of REEs and acid rain on phytolith-REEs complex in plants are not yet fully understood. In this study, a high-silicon accumulation crop, rice (Oryza sativa L.), was selected as a representative of plants, and orthogonal experiments were conducted under various levels of lanthanum [La(III)] and pH. The results showed that various La(III) concentrations could significantly improve the efficiency and sequestration of phytolith La(III) in germinated rice seeds. A pH of 4.5 promoted phytolith La(III) sequestration, while a pH of 3.5 inhibited sequestration. Compared with the single treatment with La(III), the combination of La(III) and acid rain inhibited the efficiency and sequestration of phytolith La(III). Correlation analysis showed that the efficiency of phytolith La(III) sequestration had no correlation with the production of phytolith but was closely correlated with the sequestration of phytolith La(III) and the physiological changes of germinated rice seeds. Phytolith morphology was an important factor affecting phytolith La(III) sequestration in germinated rice seeds, and the effect of tubes on sequestration was more significant than that of dumbbells. This study demonstrated that the formation of the phytolith and La(III) complex could be affected by exogenous La(III) and acid rain in germinated rice seeds. PMID:29763463
Three-dimensional modeling of the cochlea by use of an arc fitting approach.
Schurzig, Daniel; Lexow, G Jakob; Majdani, Omid; Lenarz, Thomas; Rau, Thomas S
2016-12-01
A cochlea modeling approach is presented that allows a user-defined degree of geometry simplification which automatically adjusts to the patient-specific anatomy. Model generation can be performed in a straightforward manner because errors are estimated prior to the actual generation, minimizing modeling time. The presented technique is therefore well suited for a wide range of applications, including finite element analyses, where geometrical simplifications are often inevitable. The method is demonstrated for n = 5 cochleae, which were segmented using custom software for increased accuracy. The linear basilar membrane cross sections are expanded to areas, while the scalae contours are reconstructed by a predefined number of arc segments. Prior to model generation, geometrical errors are evaluated locally for each cross section as well as globally for the resulting models and their basal turn profiles. The final combination of all reconditioned features into a 3D volume is performed in Autodesk Inventor using the loft feature. Because the volume generation is based on cubic splines, low errors could be achieved even for low numbers of arc segments and provided cross sections, both of which correspond to a strong degree of model simplification. Model generation could be performed in a time-efficient manner. The proposed simplification method proved well suited to the helical cochlea geometry, and the generated output data can be imported into commercial software tools for various analyses, representing a time-efficient way to create cochlea models optimally suited for the desired task.
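For flavor, here is a generic arc-fitting building block of the kind such a pipeline rests on (an assumed textbook method, the algebraic Kasa least-squares circle fit, not the authors' algorithm): fit a circle to contour points, from which an arc segment between the first and last point can be taken:

    import numpy as np

    def fit_circle(points):
        """Solve x^2 + y^2 + a*x + b*y + c = 0 in the least-squares sense."""
        x, y = points[:, 0], points[:, 1]
        A = np.column_stack([x, y, np.ones_like(x)])
        rhs = -(x**2 + y**2)
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        center = np.array([-a / 2.0, -b / 2.0])
        radius = np.sqrt(center @ center - c)
        return center, radius

    # noisy samples of an arc with center (1, 2) and radius 3
    theta = np.linspace(0.2, 1.8, 25)
    pts = np.column_stack([1 + 3 * np.cos(theta), 2 + 3 * np.sin(theta)])
    pts += np.random.default_rng(0).normal(scale=0.01, size=pts.shape)
    center, radius = fit_circle(pts)
    print(center.round(3), round(radius, 3))   # ~[1, 2], ~3.0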
Wettability shifts caused by CO2 aging on mineral surfaces
NASA Astrophysics Data System (ADS)
Liang, B.; Clarens, A. F.
2015-12-01
Interfacial forces at the CO2/brine/mineral ternary interface have a well-established impact on multiphase flow properties in porous media. In the context of geologic carbon sequestration, this wettability affects capillary pressure, residual trapping, and a variety of other key parameters of interest. While the wettability of CO2 on pure minerals and real rock samples has been studied a great deal over the past few years, very little is known about how the wettability of these rocks could change over long time horizons as CO2 interacts with species in the brine and on the mineral surface. In this work we explored the role that dilute inorganic and organic species likely to exist in connate brines might play in the wetting of a suite of mineral surfaces. High-pressure contact angle experiments were carried out on polished mineral surfaces, using both static captive-bubble and advancing/receding contact angle measurements. The ionic strength, and in particular the valence of the dominant ions in the brine, is found to have an important impact on wettability that cannot be explained solely by shifts in the interfacial tension between the CO2 and the brine. More significant are three organic species, formate, acetate, and oxalate, which are representative of species commonly encountered in the saline aquifers considered as target repositories for carbon sequestration. All three organic species affect wettability, with the organics generally increasing the CO2 wetting of the mineral surface. Not all pure minerals respond the same way to the presence of organics, with micas showing a more pronounced influence than quartz. Sandstone and limestone samples aged with different kinds of hydrocarbons, a surrogate for oil-bearing rocks, are generally more CO2-wet, with larger contact angles in the CO2/brine system. Over multiple days, the contact angle decreases, which could be attributed to partitioning of oil films off the surface and into the CO2 phase, driving the wettability back toward the original water-wet state. This effect could be particularly important for organic-rich repositories such as depleted oil and gas fields or fractured shale formations, where organic species could be present both on mineral surfaces and in the aqueous phase.
A Policy Option To Provide Sufficient Funding For Massive-Scale Sequestration of CO2
NASA Astrophysics Data System (ADS)
Kithil, P. W.
2007-12-01
Global emissions of CO2 are now nearly 30 billion tons per year and are growing rapidly due to strong economic growth. Atmospheric levels of CO2 have reached 380 ppm, and recent reports suggest the rate of increase has gone from 1% per year in the 1990s to 3% per year now, with the potential to cross 550 ppm in the 2020 decade. Without stabilization of atmospheric CO2 below 550 ppm, climate models predict unacceptably higher average temperatures, with significant risk of runaway global warming this century. While there is much talk about reducing CO2 emissions by switching to non-fossil energy sources, imposing energy efficiency, and a host of other changes, there are no new large-scale energy sources on the horizon. The options are to impose draconian cuts in fossil energy consumption that will keep us below 550 ppm, devastating the global economy, or to adopt massive-scale sequestration of CO2. Three approaches are feasible: biological ocean sequestration, geologic sequestration, and biological terrestrial sequestration. Biological sequestration is applicable to all CO2 sources, whereas geologic sequestration is limited to fossil-fuel power plants and some large point-source emitters such as cement plants and large industrial facilities. Sequestration provides a direct mechanism for reducing atmospheric levels of CO2, whereas offsetting technologies such as wind power or improved efficiency reduce the need for more fossil fuels but do not physically remove CO2 from the environment. The primary geologic technique, carbon capture and sequestration (CCS), prevents CO2 from entering the atmosphere but likewise does not reduce existing atmospheric levels. Biological sequestration (ocean or terrestrial) physically removes CO2 from the atmosphere. Since we cannot shut down the global economy, urgent action is needed to counteract CO2 emissions and avoid catastrophic climate change. Given the long lead time and/or small impact of offsetting energy sources, sequestration is the only way to achieve near- and medium-term reductions in atmospheric CO2 levels. To finance massive-scale sequestration of CO2, we propose that the World Trade Organization (WTO) become an active player in the sequestration market. Given the WTO's role as overseer of international trade agreements annually representing $30 trillion in imports and exports of goods and services, it is by far the largest global economic force and therefore offers the broadest economic base. Absent a real solution to CO2 emissions, the global economy, and world trade with it, will shrink dramatically. The WTO can jumpstart the market for CO2 sequestration by issuing long-term contracts to purchase bona fide sequestration-derived CO2 credits. Under this proposal, an initial price of $100 per ton, stepping down by 5% per year, could bring forth the sequestration investment needed to achieve upwards of 10 billion tons of sequestered CO2 per year by 2025 (seven billion tons from biological ocean sequestration and at least three billion tons from geologic and terrestrial sequestration). Assuming a contract term of 40 years, and assuming a parallel commodity market continues to develop for CO2 credits, at some point the WTO's contractual price will be less than the commodity market price, and the WTO begins to recover its investment.
Under one set of assumptions, the net WTO annual subsidy would peak at $86 billion by 2022, equal to an across-the-board WTO tariff on imports and exports of about 1.01%, and the WTO's net position would turn positive a few years later as the market price climbs above the contracted price. Under this proposal, the WTO effectively subsidizes CO2 sequestration in the near to medium term and then recoups its investment and reaps large profits over the long term.
V-SUIT Model Validation Using PLSS 1.0 Test Results
NASA Technical Reports Server (NTRS)
Olthoff, Claas
2015-01-01
The dynamic portable life support system (PLSS) simulation software Virtual Space Suit (V-SUIT) has been under development at the Technische Universität München since 2011 as a spin-off from the Virtual Habitat (V-HAB) project. The MATLAB®-based V-SUIT simulates space suit portable life support systems and their interaction with a detailed, dynamic human model, as well as the dynamic external environment of a space suit moving on a planetary surface. To demonstrate the feasibility of a large, system-level simulation like V-SUIT, a model of NASA's PLSS 1.0 prototype was created. This prototype was run through an extensive series of tests in 2011, and since the test setup was heavily instrumented, it produced a wealth of data, making it ideal for model validation. The implemented model includes all components of the PLSS in both the ventilation and thermal loops. The major components are modeled in greater detail, while smaller and ancillary components are low-fidelity black-box models. The major components include the Rapid Cycle Amine (RCA) CO2 removal system, the Primary and Secondary Oxygen Assembly (POS/SOA), the Pressure Garment System Volume Simulator (PGSVS), the Human Metabolic Simulator (HMS), the heat exchanger between the ventilation and thermal loops, the Space Suit Water Membrane Evaporator (SWME), and the Liquid Cooling Garment Simulator (LCGS). Using the created model, dynamic simulations were performed using the same test points used during PLSS 1.0 testing. The results of the simulation were then compared to the test data, with special focus on absolute values during the steady-state phases and on dynamic behavior during the transitions between test points. Quantified simulation results are presented that show which areas of the V-SUIT model need further refinement and which are sufficiently close to the test results. Finally, lessons learned from the modeling and validation process are given, together with implications for the future development of other PLSS models in V-SUIT.
An Overview of Geologic Carbon Sequestration Potential in California
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cameron Downey; John Clinkenbeard
2005-10-01
As part of the West Coast Regional Carbon Sequestration Partnership (WESTCARB), the California Geological Survey (CGS) conducted an assessment of geologic carbon sequestration potential in California. An inventory of sedimentary basins was screened for preliminary suitability for carbon sequestration. Criteria included porous and permeable strata, seals, and depth sufficient for critical-state carbon dioxide (CO2) injection. Of 104 basins inventoried, 27 met the criteria for further assessment. Petrophysical and fluid data from oil and gas reservoirs were used to characterize both saline aquifers and hydrocarbon reservoirs. Where available, well log or geophysical information was used to prepare basin-wide maps showing depth-to-basement and gross sand distribution. California's Cenozoic marine basins were determined to possess the most potential for geologic sequestration. These basins contain thick sedimentary sections, multiple saline aquifers and oil and gas reservoirs, widespread shale seals, and significant petrophysical data from oil and gas operations. Potential sequestration areas include the San Joaquin, Sacramento, Ventura, Los Angeles, and Eel River basins, followed by the smaller Salinas, La Honda, Cuyama, Livermore, Orinda, and Sonoma marine basins. California's terrestrial basins are generally too shallow for carbon sequestration; however, the Salton Trough and several smaller basins may offer opportunities for localized carbon sequestration.
Pommé, S
2012-09-01
A software package is presented to calculate the total counting efficiency for the decay of radionuclides in a well-type γ-ray detector. It is specifically applied to primary standardisation of activity by means of 4πγ-counting with a NaI(Tl) well-type scintillation detector. As an alternative to Monte Carlo simulations, the software combines good accuracy with superior speed and ease-of-use. It is also well suited to investigate uncertainties associated with the 4πγ-counting method for a variety of radionuclides and detector dimensions. In this paper, the underlying analytical models for the radioactive decay and subsequent counting efficiency of the emitted radiation in the detector are summarised.
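For a flavor of the quantities involved (a generic textbook relation, not the paper's analytical models): for a radionuclide emitting a cascade of photons counted in a detector with near-4π solid angle, the total counting efficiency, neglecting angular correlations, is the probability that at least one photon is detected:

    def total_efficiency(photon_efficiencies):
        """eps_total = 1 - product over photons of (1 - eps_i)."""
        miss = 1.0
        for eps in photon_efficiencies:
            miss *= (1.0 - eps)
        return 1.0 - miss

    # e.g. two cascade photons individually detected with 95% and 90% efficiency
    print(total_efficiency([0.95, 0.90]))   # -> 0.995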
40 CFR 98.448 - Geologic sequestration monitoring, reporting, and verification (MRV) plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Sequestration of Carbon Dioxide § 98.448 Geologic sequestration monitoring, reporting, and verification (MRV... use to calculate site-specific variables for the mass balance equation. This includes, but is not...
40 CFR 98.448 - Geologic sequestration monitoring, reporting, and verification (MRV) plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Sequestration of Carbon Dioxide § 98.448 Geologic sequestration monitoring, reporting, and verification (MRV... use to calculate site-specific variables for the mass balance equation. This includes, but is not...
The Cloud-Based Integrated Data Viewer (IDV)
NASA Astrophysics Data System (ADS)
Fisher, Ward
2015-04-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there is a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software that is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given its inherent dependency on particular graphics hardware and software APIs. The advent of cloud computing has provided a solution, application streaming, that was not previously practical on a large scale. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little to no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work examines the challenges of adapting the IDV to an application streaming platform and includes a brief discussion of the underlying technologies involved. We also discuss the differences between local software and software-as-a-service.
PD5: a general purpose library for primer design software.
Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda
2013-01-01
Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third-party applications, and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software, and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source code and allow redistribution and modification. The PD5 software library is downloadable from Google Code, and the accompanying wiki includes instructions and examples: http://code.google.com/p/primer-design.
NASA Astrophysics Data System (ADS)
Knapp, J. H.; Knapp, C. C.; Brantley, D.; Lakshmi, V.; Howard, S.
2016-12-01
The Southeast Offshore Storage Resource Assessment (SOSRA) project is part of a major new program, funded by the U.S. Department of Energy for the next two and a half years, to evaluate the Atlantic and Gulf of Mexico offshore margins of the United States for geologic storage capacity of CO2. Collaborating organizations include the Southern States Energy Board, Virginia Polytechnic Institute, University of South Carolina, Oklahoma State University, Virginia Department of Mines, Minerals, and Energy, South Carolina Geological Survey, and Geological Survey of Alabama. Team members from South Carolina are focused on the Atlantic offshore, from North Carolina to Florida. Geologic sequestration of CO2 is a major research focus globally, and requires robust knowledge of the porosity and permeability distribution in upper crustal sediments. Using legacy seismic reflection, refraction, and well data from a previous phase of offshore petroleum exploration on the Atlantic margin, we are analyzing the rock physics characteristics of the offshore Mesozoic and Cenozoic stratigraphy on a regional scale from North Carolina to Florida. Major features of the margin include the Carolina Trough, the Southeast Georgia Embayment, the Blake Plateau basin, and the Blake Outer Ridge. Previous studies indicate sediment accumulations on this margin may be as thick as 12-15 km. The study will apply a diverse suite of data analysis techniques designed to meet the goal of predicting storage capacity to within ±30%. Synthetic seismograms and checkshot surveys will be used to tie well and seismic data. Seismic interpretation and geophysical log analysis will employ leading-edge software technology and state-of-the art techniques for stratigraphic and structural interpretation and the definition of storage units and their physical and chemical properties. This approach will result in a robust characterization of offshore CO2 storage opportunities, as well as a volumetric analysis that is consistent with established procedures.
Nelson, Erik; Polasky, Stephen; Lewis, David J.; Plantinga, Andrew J.; Lonsdorf, Eric; White, Denis; Bael, David; Lawler, Joshua J.
2008-01-01
We develop an integrated model to predict private land-use decisions in response to policy incentives designed to increase the provision of carbon sequestration and species conservation across heterogeneous landscapes. Using data from the Willamette Basin, Oregon, we compare the provision of carbon sequestration and species conservation under five simple policies that offer payments for conservation. We evaluate policy performance compared with the maximum feasible combinations of carbon sequestration and species conservation on the landscape for various conservation budgets. None of the conservation payment policies produce increases in carbon sequestration and species conservation that approach the maximum potential gains on the landscape. Our results show that policies aimed at increasing the provision of carbon sequestration do not necessarily increase species conservation and that highly targeted policies do not necessarily do as well as more general policies. PMID:18621703
CellAnimation: an open source MATLAB framework for microscopy assays.
Georgescu, Walter; Wikswo, John P; Quaranta, Vito
2012-01-01
Advances in microscopy technology have led to the creation of high-throughput microscopes that are capable of generating several hundred gigabytes of images in a few days. Analyzing such a wealth of data manually is nearly impossible and requires an automated approach. There are at present a number of open-source and commercial software packages that allow the user to apply algorithms of different degrees of sophistication to the images and extract desired metrics. However, the types of metrics that can be extracted are severely limited by the specific image processing algorithms that the application implements, and by the expertise of the user. In most commercial software, code unavailability prevents implementation by the end user of newly developed algorithms better suited for a particular type of imaging assay. While it is possible to implement new algorithms in open-source software, rewiring an image processing application requires a high degree of expertise. To obviate these limitations, we have developed an open-source high-throughput application that allows implementation of different biological assays, such as cell tracking or ancestry recording, through the use of small, relatively simple image processing modules connected into sophisticated imaging pipelines. By connecting modules, non-expert users can apply the particular combination of well-established and novel algorithms developed by us and others that is best suited for each individual assay type. In addition, our data exploration and visualization modules make it easy to discover or select specific cell phenotypes from a heterogeneous population. CellAnimation is distributed under the Creative Commons Attribution-NonCommercial 3.0 Unported license (http://creativecommons.org/licenses/by-nc/3.0/). CellAnimation source code and documentation may be downloaded from www.vanderbilt.edu/viibre/software/documents/CellAnimation.zip. Sample data are available at www.vanderbilt.edu/viibre/software/documents/movies.zip. walter.georgescu@vanderbilt.edu. Supplementary data are available at Bioinformatics online.
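CellAnimation is implemented in MATLAB; to illustrate the module-pipeline idea in general terms, the Python sketch below composes small single-purpose processing modules into an assay-specific pipeline. The module names and toy data are hypothetical, not CellAnimation's actual modules.

```python
# Sketch of the module-pipeline idea: small single-purpose steps
# composed into an assay-specific pipeline. Module names are
# hypothetical; CellAnimation itself is implemented in MATLAB.

from functools import reduce

def threshold(image, level=0.5):
    """Toy segmentation module: binarize an image at a level."""
    return [[1 if px > level else 0 for px in row] for row in image]

def count_objects(mask):
    """Toy metric-extraction module: count foreground pixels."""
    return sum(sum(row) for row in mask)

def make_pipeline(*steps):
    """Compose processing modules left-to-right."""
    return lambda data: reduce(lambda d, step: step(d), steps, data)

cell_counting_assay = make_pipeline(threshold, count_objects)
frame = [[0.1, 0.9], [0.8, 0.2]]
print(cell_counting_assay(frame))  # -> 2
```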
Evolution of the Scope and Capabilities of Uplink Support Software for Mars Surface Operations
NASA Technical Reports Server (NTRS)
Pack, Marc; Laubach, Sharon
2014-01-01
In January of 2004 both of the Mars Exploration Rover spacecraft landed safely, initiating daily surface operations at the Jet Propulsion Laboratory for what was anticipated to be approximately three months of mobile exploration. The longevity of this mission, still ongoing after ten years, has provided not only a tremendous return of scientific data but also the opportunity to refine and improve the methodology by which robotic Mars surface missions are commanded. Since the landing of the Mars Science Laboratory spacecraft in August of 2012, this methodology has been successfully applied to operate a Martian rover which is both similar to, and quite different from, its predecessors. For MER and MSL, daily uplink operations can be most broadly viewed as converting the combined interests of both the science and engineering teams into a spacecraft-safe set of transmittable command files. In order to accomplish these ends a discrete set of mission-critical software tools were developed which not only allowed for conformation to established JPL standards and practices but also enabled innovative technologies specific to each mission. Although these primary programs provided the requisite capabilities for meeting the high-level goals of each distinct phase of the uplink process, there was little in the way of secondary software to support the smooth flow of data from one phase to the next. In order to address this shortcoming a suite of small software tools was developed to aid in phase transitions, as well as to automate some of the more laborious and error-prone aspects of uplink operations. This paper describes the evolution of this software suite, from its initial attempts to merely shorten the duration of the operator's shift, to its current role as an indispensable tool enforcing workflow of the uplink operations process and agilely responding to the new and unexpected challenges of missions which can, and have, lasted many years longer than originally anticipated.
NASA Technical Reports Server (NTRS)
2012-01-01
Topics covered include: Instrument Suite for Vertical Characterization of the Ionosphere-Thermosphere System; Terahertz Radiation Heterodyne Detector Using Two-Dimensional Electron Gas in a GaN Heterostructure; Pattern Recognition Algorithm for High-Sensitivity Odorant Detection in Unknown Environments; Determining Performance Acceptability of Electrochemical Oxygen Sensors; Versatile Controller for Infrared Lamp and Heater Arrays; High-Speed Scanning Interferometer Using CMOS Image Sensor and FPGA Based on Multifrequency Phase-Tracking Detection; Ultra-Low-Power MEMS Selective Gas Sensors; Compact Receiver Front Ends for Submillimeter-Wave Applications; Dynamically Reconfigurable Systolic Array Accelerator; Blocking Losses With a Photon Counter; Motion-Capture-Enabled Software for Gestural Control of 3D Mod; Orbit Software Suite; CoNNeCT Baseband Processor Module Boot Code SoftWare (BCSW); Trajectory Software With Upper Atmosphere Model; ALSSAT Version 6.0; Employing a Grinding Technology to Assess the Microbial Density for Encapsulated Organisms; Demonstration of Minimally Machined Honeycomb Silicon Carbide Mirrors; Polyimide Aerogel Thin Films; Nanoengineered Thermal Materials Based on Carbon Nanotube Array Composites; Composite Laminate With Coefficient of Thermal Expansion Matching D263 Glass; Robust Tensioned Kevlar Suspension Design; Focal Plane Alignment Utilizing Optical CMM; Purifying, Separating, and Concentrating Cells From a Sample Low in Biomass; Virtual Ultrasound Guidance for Inexperienced Operators; Beat-to-Beat Blood Pressure Monitor; Non-Contact Conductivity Measurement for Automated Sample Processing Systems; An MSK Radar Waveform; Telescope Alignment From Sparsely Sampled Wavefront Measurements Over Pupil Subapertures; Method to Remove Particulate Matter from Dusty Gases at Low Pressures; Terahertz Quantum Cascade Laser With Efficient Coupling and Beam Profile; Measurement Via Optical Near-Nulling and Subaperture Stitching; 885-nm Pumped Ceramic Nd:YAG Master Oscillator Power Amplifier Laser System; Airborne Hyperspectral Imaging System; Heat Shield Employing Cured Thermal Protection Material Blocks Bonded in a Large-Cell Honeycomb Matrix; and Asymmetric Supercapacitor for Long-Duration Power Storage.
NASA Technical Reports Server (NTRS)
Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil
2007-01-01
The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include data on goals, deliverables, milestones, business processes, personnel, task plans, monthly reports, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources, based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Applications (VBA) libraries. The PMT has an extensible architecture that enables integration of the PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity issues among the various NASA systems that impact schedules and planning.
Scripps Genome ADVISER: Annotation and Distributed Variant Interpretation SERver
Pham, Phillip H.; Shipman, William J.; Erikson, Galina A.; Schork, Nicholas J.; Torkamani, Ali
2015-01-01
Interpretation of human genomes is a major challenge. We present the Scripps Genome ADVISER (SG-ADVISER) suite, which aims to fill the gap between data generation and genome interpretation by performing holistic, in-depth, annotations and functional predictions on all variant types and effects. The SG-ADVISER suite includes a de-identification tool, a variant annotation web-server, and a user interface for inheritance and annotation-based filtration. SG-ADVISER allows users with no bioinformatics expertise to manipulate large volumes of variant data with ease – without the need to download large reference databases, install software, or use a command line interface. SG-ADVISER is freely available at genomics.scripps.edu/ADVISER. PMID:25706643
Automated Sensitivity Analysis of Interplanetary Trajectories
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software and to simplify the process of performing sensitivity analysis, and it was ultimately found to outperform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
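PEATSA's interfaces are not documented in this abstract; the sketch below shows, with hypothetical names, the general pattern such a tool automates: perturbing mission parameters, rerunning an external trajectory optimizer for each case, and tabulating the objective for sensitivity analysis. run_optimizer() is a stand-in, not the real EMTG/PEATSA interface.

```python
# Hedged sketch of automated trade-study sweeps in the PEATSA spirit:
# perturb one mission parameter at a time, rerun the optimizer, and
# tabulate the objective. run_optimizer() is a hypothetical stand-in
# for invoking a trajectory optimizer such as EMTG.

import itertools

def run_optimizer(launch_year: int, c3: float) -> float:
    # Placeholder objective: a pretend delta-v cost surface.
    return abs(launch_year - 2026) * 0.3 + abs(c3 - 12.0) * 0.1

launch_years = [2024, 2025, 2026, 2027]
c3_values = [10.0, 12.0, 14.0]

results = {
    (yr, c3): run_optimizer(yr, c3)
    for yr, c3 in itertools.product(launch_years, c3_values)
}
best = min(results, key=results.get)
print("Best case:", best, "objective:", results[best])
```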
ERIC Educational Resources Information Center
Hegedus, Stephen J.; Dalton, Sara; Tapper, John R.
2015-01-01
We report on two large studies conducted in advanced algebra classrooms in the US, which evaluated the effect of replacing traditional algebra 2 curriculum with an integrated suite of dynamic interactive software, wireless networks and technology-enhanced curriculum on student learning. The first study was a cluster randomized trial and the second…
Utilizing Software Application Tools to Enhance Online Student Engagement and Achievement
ERIC Educational Resources Information Center
Andersson, David; Reimers, Karl
2010-01-01
The field of education is experiencing a rapid shift as internet-enabled distance learning becomes more widespread. Often, traditional classroom teaching pedagogical techniques can be ill-suited to the online environment. While a traditional entry-level class might see a student attrition rate of 5-10%, the same teaching pedagogy in an online…
Student Perceptions and Experiences Using Jing and Skype in an Accounting Information Systems Class
ERIC Educational Resources Information Center
Charron, Kimberly; Raschke, Robyn
2014-01-01
The authors examine the use of technology to support students in their learning of practical accounting software applications while taking a traditional on-campus class. Specifically, they look at how Jing and Skype are used to facilitate successful completion of a series of simulations using Netsuite (NetSuite, Inc., San Mateo, CA) accounting…
Evaluation of an Anthropometric Human Body Model for Simulated EVA Task Assessment
NASA Technical Reports Server (NTRS)
Etter, Brad
1996-01-01
One of the more mission-critical tasks performed in space is extravehicular activity (EVA), which requires the astronaut to be external to the station or spacecraft and consequently at risk from the many threats posed by space. These threats include, but are not limited to: no significant atmosphere, harmful electromagnetic radiation, micrometeoroids, and space debris. To protect the astronaut from this environment, a special EVA suit is worn which is designed to maintain a sustainable atmosphere (at 1/3 atmosphere) and provide protection against the hazards of space. While the EVA suit serves these functions well, it does impose limitations on the astronaut as a consequence of the safety it provides. Since the astronaut is in a virtual vacuum, any atmospheric pressure inside the suit serves to pressurize the suit and restricts the mobility of flexible joints (such as fabric). Although some of the EVA suit joints are fixed, rotary-style joints, most of the mobility is achieved by the simple flexibility of the fabric. There are multiple layers of fabric, each of which serves a special purpose in the safety of the astronaut. These multiple layers add to the restriction of motion the astronaut experiences in the space environment. Ground-based testing is implemented to evaluate the capability of EVA-suited astronauts to perform the various tasks in space. In addition to the restriction of motion imposed by the EVA suit, most EVA activity is performed in a microgravity (weightless) environment. To simulate weightlessness, EVA-suited testing is performed in a neutral buoyancy simulator (NBS). The NBS is composed of a large container of water (pool) in which a weightless environment can be simulated. A subject is normally buoyant in the pressurized suit; however, he/she can be made neutrally buoyant with the addition of weights. In addition, most objects the astronaut must interface with in the NBS sink in water, and flotation must be added to render them "weightless". The implementation of NBS testing has proven to be invaluable in the assessment of EVA activities performed with the Orbiter and is considered to be a key step in the construction of the International Space Station (ISS). While NBS testing is extremely valuable, it does require considerable overhead to maintain and operate. It has been estimated that the cost of utilizing the facility is approximately $10,000 per day. Therefore, it is important to maximize the utility of NBS testing for optimal results. One important aspect to consider in any human/worksite interface is the considerable wealth of anthropometric and ergonomic data available. A subset of this information specific to EVA activity is available in NASA Standard 3000. The difficulty in implementing these data is that most of the anthropometric information is represented in a two-dimensional format. This poses limitations on the complete evaluation of the astronaut's capabilities in a three-dimensional environment. Advances in computer hardware and software have provided for three-dimensional design and evaluation of hardware through computer-aided design (CAD) software. There are a number of CAD products available, and most companies and agencies have adopted CAD as a fundamental aspect of the design process. Another factor which supports the use of CAD is the implementation of computer-aided manufacturing (CAM) software and hardware, which provides for rapid prototyping and decreases the time to product in the design process.
It is probable that most hardware to be accessed by astronauts in EVA or IVA (intravehicular activity) has been designed by a CAD system, and is therefore represented in three-dimensional space for evaluation. Because of the implementation of CAD systems and the movement towards early prototyping, a need has arisen in industry and government for tools that facilitate the evaluation of ergonomic considerations in a three-dimensional environment where the hardware has been designed with CAD tools. One such product is Jack, which was developed by the University of Pennsylvania with funding from several government agencies, including NASA. While the primary purpose of Jack is to model human figures in a ground-based (gravity) environment, it can be utilized to evaluate EVA-suited activities as well, provided the effects of simulated gravity are disabled by turning off "behaviors". Although Jack provides human figures for manipulation, the primary instrument to be evaluated for EVA mobility is the work envelope provided by the EVA suit. An EVA Jack suit model has been developed by NASA-JSC and was utilized in this study. This suit model provided a more restrictive motion environment, as expected for an EVA-suited subject. As part of this study, the anthropometric dimensions for a 50th percentile male were compared with basic anthropometric data and were found to be representative of the population group expected in the NASA flight program. The joints for the suit were created in a manner that provided performance consistent with the EVA reach envelopes published in NASA Standard 3000.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leetaru, Hannes; Brown, Alan; Lee, Donald
2012-05-01
The Cambro-Ordovician strata of the Illinois and Michigan Basins underlie most of the states of Illinois, Indiana, Kentucky, and Michigan. This interval also extends through much of the Midwest of the United States and, for some areas, may be the only available target for geological sequestration of CO2. We evaluated the Cambro-Ordovician strata above the basal Mt. Simon Sandstone reservoir for sequestration potential. The two targets were the Cambrian carbonate intervals in the Knox and the Ordovician St. Peter Sandstone. The evaluation of these two formations was accomplished using wireline data, core data, pressure data, and seismic data from the USDOE-funded Illinois Basin Decatur Project being conducted by the Midwest Geological Sequestration Consortium in Macon County, Illinois. Interpretations were completed using log analysis software, a reservoir flow simulator, and a finite element solver that determines rock stress and strain changes resulting from the pressure increase associated with CO2 injection. Results of this research suggest that both the St. Peter Sandstone and the Potosi Dolomite (a formation of the Knox) reservoirs may be capable of storing up to 2 million tonnes of CO2 per year for a 20-year period. Reservoir simulation results for the St. Peter indicate good injectivity and a relatively small CO2 plume. While a single St. Peter well is not likely to achieve the targeted injection rate of 2 million tonnes/year, results of this study indicate that development with three or four appropriately spaced wells may be sufficient. Reservoir simulation of the Potosi suggests that much of the CO2 flows into and through relatively thin, high-permeability intervals, resulting in a large plume diameter compared with the St. Peter.
DOT National Transportation Integrated Search
2012-05-01
Carbon footprints, carbon credits and associated carbon sequestration techniques are rapidly becoming part of how environmental mitigation business is conducted, not only in Texas but globally. Terrestrial carbon sequestration is the general term...
Carbon dioxide (CO2) sequestration in deep saline aquifers and formations: Chapter 3
Rosenbauer, Robert J.; Thomas, Burt
2010-01-01
Carbon dioxide (CO2) capture and sequestration in geologic media is one among many emerging strategies to reduce atmospheric emissions of anthropogenic CO2. This chapter looks at the potential of deep saline aquifers – based on their capacity and close proximity to large point sources of CO2 – as repositories for the geologic sequestration of CO2. The petrophysical characteristics that bear on the suitability of saline aquifers for CO2 sequestration, and the role of coupled geochemical transport models and numerical tools in evaluating site feasibility, are also examined. The full-scale commercial CO2 sequestration project at Sleipner is described together with ongoing pilot and demonstration projects.
Assurance of Fault Management: Risk-Significant Adverse Condition Awareness
NASA Technical Reports Server (NTRS)
Fitz, Rhonda
2016-01-01
Fault Management (FM) systems are ranked high in risk-based assessment of criticality within flight software, emphasizing the importance of establishing highly competent domain expertise to provide assurance for NASA projects, especially as spaceflight systems continue to increase in complexity. Insight into specific characteristics of FM architectures seen embedded within safety- and mission-critical software systems analyzed by the NASA Independent Verification and Validation (IV&V) Program has been enhanced with an FM Technical Reference (TR) suite. Benefits are aimed beyond the IV&V community to those that seek ways to efficiently and effectively provide software assurance to reduce the FM risk posture of NASA and other space missions. The identification of particular FM architectures, visibility, and associated IV&V techniques provides a TR suite that enables greater assurance that critical software systems will adequately protect against faults and respond to adverse conditions. The role FM has with regard to overall asset protection of flight software systems is being addressed with the development of an adverse condition (AC) database encompassing flight software vulnerabilities. Identification of potential off-nominal conditions and analysis to determine how a system responds to these conditions are important aspects of hazard analysis and fault management. Understanding what ACs the mission may face, and ensuring they are prevented or addressed, is the responsibility of the assurance team, which necessarily should have insight into ACs beyond those defined by the project itself. Research efforts sponsored by NASA's Office of Safety and Mission Assurance defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain component, causal fault, and other key characteristics. The repository has a firm structure, an initial collection of data, and an interface established for informational queries, with plans for integration within the Enterprise Architecture at NASA IV&V, enabling support and accessibility across the Agency. The development of an improved workflow process for adaptive, risk-informed FM assurance is currently underway.
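The repository's schema is not published in this abstract; purely to make the query capability concrete (lookups by mission type, domain component, causal fault, and so on), here is a minimal sketch over an assumed table layout.

```python
# Minimal sketch of an adverse-condition (AC) repository query.
# The schema and field names are assumptions for illustration,
# not the actual NASA IV&V database design.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE adverse_conditions (
    id INTEGER PRIMARY KEY,
    mission_type TEXT,
    domain_component TEXT,
    causal_fault TEXT,
    description TEXT)""")
conn.executemany(
    "INSERT INTO adverse_conditions VALUES (?,?,?,?,?)",
    [(1, "orbiter", "attitude control", "sensor dropout",
      "Star tracker outage during slew"),
     (2, "lander", "power", "battery undervoltage",
      "Brown-out during descent")])

rows = conn.execute(
    "SELECT description FROM adverse_conditions "
    "WHERE mission_type = ? AND domain_component = ?",
    ("orbiter", "attitude control")).fetchall()
print(rows)
```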
Making carbon sequestration a paying proposition
NASA Astrophysics Data System (ADS)
Han, Fengxiang X.; Lindner, Jeff S.; Wang, Chuji
2007-03-01
Atmospheric carbon dioxide (CO2) has increased from a preindustrial concentration of about 280 ppm to about 367 ppm at present. The increase has closely followed the increase in CO2 emissions from the use of fossil fuels. Global warming caused by increasing amounts of greenhouse gases in the atmosphere is the major environmental challenge for the 21st century. Reducing worldwide emissions of CO2 requires multiple mitigation pathways, including reductions in energy consumption, more efficient use of available energy, the application of renewable energy sources, and sequestration. Sequestration is a major tool for managing carbon emissions. In a majority of cases CO2 is viewed as waste to be disposed of; however, with advanced technology, carbon sequestration can become a value-added proposition. There are a number of potential opportunities that render sequestration economically viable. In this study, we review the most economically promising opportunities and pathways of carbon sequestration, including reforestation, best agricultural production, housing and furniture, enhanced oil recovery, coalbed methane (CBM), and CO2 hydrates. Many of these terrestrial and geological sequestration opportunities are expected to provide a direct economic benefit over that obtained by merely reducing the atmospheric CO2 loading. Sequestration opportunities in 11 states of the Southeast and South Central United States are discussed. Among the most promising methods for the region are reforestation and CBM. The annual forest carbon sink in this region is estimated to be 76 Tg C/year, which would amount to an expenditure of $11.1-13.9 billion/year. Best management practices could enhance carbon sequestration by 53.9 Tg C/year, accounting for 9.3% of current total annual regional greenhouse gas emissions over the next 20 years. Annual carbon storage in housing, furniture, and other wood products in 1998 was estimated to be 13.9 Tg C in the region. Other sequestration options, including the direct injection of CO2 into deep saline aquifers, mineralization, and biomineralization, are not expected to lead to direct economic gain. More detailed studies are needed for assessing the ultimate changes to the environment and the associated indirect cost savings for carbon sequestration.
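To put the regional figures on a CO2 basis, carbon mass converts to CO2 mass by the molar-mass ratio 44/12 (about 3.67); a quick check applies this to the 76 Tg C/year forest sink quoted above.

```python
# Convert a carbon sink from Tg C to Tg CO2 using the molar-mass
# ratio 44/12 (CO2 to C). Applied to the 76 Tg C/yr regional sink.

CO2_PER_C = 44.0 / 12.0   # ~3.67

sink_tg_c = 76.0
print(f"{sink_tg_c} Tg C/yr = {sink_tg_c * CO2_PER_C:.0f} Tg CO2/yr")
# -> roughly 279 Tg CO2/yr
```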
This paper provides EPA's analysis of the data to determine carbon sequestration rates at three diverse sites that differ in geography/location, weather, soil properties, type of contamination, and age.
Charlie Byrer
2017-12-09
Terrestrial sequestration is the enhancement of CO2 uptake by plants that grow on land and in freshwater and, importantly, the enhancement of carbon storage in soils where it may remain more permanently stored. Terrestrial sequestration provides an opportunity for low-cost CO2 emissions offsets.
MATTS - A Step Towards Model-Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious demands on application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve the problems addressed above, the software engineering process has to be improved in at least two respects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g., state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected-output pairs). The generated concrete test cases were applied to an on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
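As a minimal illustration of the approach described above (a toy model, not the MATTS toolchain), the sketch below enumerates transition paths through a small state machine and emits them as concrete test cases, i.e., input sequences paired with expected final states.

```python
# Minimal sketch of model-based test generation: enumerate paths
# through a state machine and emit (input sequence, expected final
# state) pairs. A toy model, not the MATTS toolchain.

TRANSITIONS = {
    ("OFF", "power_on"): "STANDBY",
    ("STANDBY", "arm"): "ACTIVE",
    ("ACTIVE", "fault"): "SAFE",
    ("STANDBY", "power_off"): "OFF",
}

def generate_tests(start, depth):
    """All input sequences up to `depth` with their expected state."""
    tests, frontier = [], [(start, [])]
    for _ in range(depth):
        next_frontier = []
        for state, inputs in frontier:
            for (src, event), dst in TRANSITIONS.items():
                if src == state:
                    seq = inputs + [event]
                    tests.append((seq, dst))
                    next_frontier.append((dst, seq))
        frontier = next_frontier
    return tests

for inputs, expected in generate_tests("OFF", 3):
    print(inputs, "->", expected)
```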
NASA Astrophysics Data System (ADS)
Dalton, T. A.; Daniels, J. J.
2009-12-01
The development of geological carbon sequestration within the Ohio River Valley is of major interest to the national electricity and coal industries because the Valley is home to a heavy concentration of coal-burning electricity generation plants and the infrastructure is impossible to eliminate in the short term. Ohio's politicians and citizenry have determined that the continued use of coal in this region will be necessary over the next few years, until alternative energy supplies become available. Geologic sequestration is the only possible means of keeping the CO2 out of the atmosphere in the region. The cost of the sequestration effort is greatly decreased by sequestering CO2 directly on the site of these plants, or by minimizing the distance between fossil-fueled generation and sequestration (i.e., by eliminating the cost of transporting supercritical CO2 from plant to sequestration site). Thus, the practicality of CO2 geologic sequestration within the Ohio River Valley is central to the development of such a commercial effort. Though extensive work has been done by the Regional Partnerships of the DOE/NETL in the characterization of general areas for carbon sequestration throughout the nation, few projects have narrowed their focus to a single geologic region in order to evaluate the sites of greatest commercial potential. As an undergraduate in the Earth Sciences at Ohio State, I have engaged in thorough research to obtain a detailed understanding of the geology of the Ohio River Valley and its potential for commercial-scale carbon sequestration. Through this research, I have been able to offer an estimate of the areas of greatest interest for CO2 geologic sequestration. This research has involved petrological, mineralogical, geochemical, and geophysical analyses of four major reservoir formations within Ohio—the Rose Run, the Copper Ridge, the Clinton, and the Oriskany—along with an evaluation of the possible effects of injection into these saline reservoirs.
Software Validation via Model Animation
NASA Technical Reports Server (NTRS)
Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.
2015-01-01
This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
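The core comparison step can be pictured as follows (a sketch using invented stand-ins, not the actual PVSio harness): evaluate both the formal model and the implementation on the same inputs and flag any pair whose outputs differ by more than the tolerance.

```python
# Sketch of the comparison step: model output vs. implementation
# output, equal up to a tolerance. The two functions stand in for
# the PVS model (via PVSio) and the fielded code, respectively.

def model_trajectory(t: float) -> float:        # formal-model stand-in
    return 100.0 + 5.0 * t

def implementation_trajectory(t: float) -> float:  # fielded-code stand-in
    return 100.0 + 5.0 * t + 1e-9               # tiny float error

TOL = 1e-6
failures = [
    t for t in (0.0, 1.5, 10.0, 100.0)
    if abs(model_trajectory(t) - implementation_trajectory(t)) > TOL
]
if failures:
    print("FAIL at:", failures)
else:
    print("all points within tolerance")
```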
Software Quality Control at Belle II
NASA Astrophysics Data System (ADS)
Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.;
2017-10-01
Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of sufficiently high quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering-file-level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
Hardware and software improvements to a low-cost horizontal parallax holographic video monitor.
Henrie, Andrew; Codling, Jesse R; Gneiting, Scott; Christensen, Justin B; Awerkamp, Parker; Burdette, Mark J; Smalley, Daniel E
2018-01-01
Displays capable of true holographic video have been prohibitively expensive and difficult to build. With this paper, we present a suite of modularized hardware components and software tools needed to build a HoloMonitor with basic "hacker-space" equipment, highlighting improvements that have enabled the total materials cost to fall to $820, well below that of other holographic displays. It is our hope that the current level of simplicity, development, design flexibility, and documentation will enable the lay engineer, programmer, and scientist to relatively easily replicate, modify, and build upon our designs, bringing true holographic video to the masses.
Benchmark Dose Software (BMDS) Development and ...
This report is intended to provide an overview of beta version 1.0 of the implementation of a model of repeated-measures data referred to as the Toxicodiffusion model. The implementation described here represents the first step towards integration of the Toxicodiffusion model into the EPA Benchmark Dose Software (BMDS). This version runs from within BMDS 2.0 using an option screen for model selection, as is done for the other models in the BMDS 2.0 suite.
NASA Technical Reports Server (NTRS)
Stehura, Aaron; Rozek, Matthew
2013-01-01
The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.
NASA Technical Reports Server (NTRS)
2000-01-01
Automated Analysis Corporation's COMET is a suite of acoustic analysis software for advanced noise prediction. It analyzes the origin, radiation, and scattering of noise, and supplies information on how to achieve noise reduction and improve sound characteristics. COMET's Structural Acoustic Foam Engineering (SAFE) module extends the sound field analysis capability of foam and other materials. SAFE shows how noise travels while airborne, how it travels within a structure, and how these media interact to affect other aspects of the transmission of noise. The COMET software reduces design time and expense while optimizing a final product's acoustical performance. COMET was developed through SBIR funding and Langley Research Center for Automated Analysis Corporation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cramer, Christopher J.
Charge transfer and charge transport in photoactivated systems are fundamental processes that underlie solar energy capture, solar energy conversion, and photoactivated catalysis, both organometallic and enzymatic. We developed methods, algorithms, and software tools needed for reliable treatment of the underlying physics for charge transfer and charge transport, an undertaking with broad applicability to the goals of the fundamental-interaction component of the Department of Energy Office of Basic Energy Sciences and the exascale initiative of the Office of Advanced Scientific Computing Research.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; O'Donnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
A Data Model Framework for the Characterization of a Satellite Data Handling Software
NASA Astrophysics Data System (ADS)
Camatto, Gianluigi; Tipaldi, Massimo; Bothmer, Wolfgang; Ferraguto, Massimo; Bruenjes, Bernhard
2014-08-01
This paper describes an approach for modelling the characterization and configuration data produced when developing Satellite Data Handling Software (DHSW). The model can then be used as an input for the preparation of the logical and physical representation of the Satellite Reference Database (SRDB) contents and the related SW suite, an essential product that allows not only transferring information between the different system stakeholders, but also producing part of the DHSW documentation and artefacts. Special attention is given to the shaping of the general Parameter concept, which is shared by a number of different entities within a Space System.
Upgrade of DRAMA-ESA's Space Debris Mitigation Analysis Tool Suite
NASA Astrophysics Data System (ADS)
Gelhaus, Johannes; Sanchez-Ortiz, Noelia; Braun, Vitali; Kebschull, Christopher; de Oliveira, Joaquim Correia; Dominguez-Gonzalez, Raul; Wiedemann, Carsten; Krag, Holger; Vorsmann, Peter
2013-08-01
One decade ago ESA started the development of the first version of the software tool called DRAMA (Debris Risk Assessment and Mitigation Analysis) to enable ESA space programs to assess their compliance with the recommendations in the European Code of Conduct for Space Debris Mitigation. This tool has been maintained, upgraded, and extended in recent years and is now a combination of five individual tools, each addressing a different aspect of debris mitigation. This paper gives an overview of the new DRAMA software in general. The main tools ARES, OSCAR, MIDAS, CROC, and SARA will be discussed, and the environment used by DRAMA will be explained briefly.
A guide to the visual analysis and communication of biomolecular structural data.
Johnson, Graham T; Hertig, Samuel
2014-10-01
Biologists regularly face an increasingly difficult task: to effectively communicate bigger and more complex structural data using an ever-expanding suite of visualization tools. Whether presenting results to peers or educating an outreach audience, a scientist can achieve maximal impact with minimal production time by systematically identifying an audience's needs, planning solutions from a variety of visual communication techniques, and then applying the most appropriate software tools. A guide to available resources that range from software tools to professional illustrators can help researchers to generate better figures and presentations tailored to any audience's needs, and enable artistically inclined scientists to create captivating outreach imagery.
Artificial Intelligence Software for Assessing Postural Stability
NASA Technical Reports Server (NTRS)
Lieberman, Erez; Forth, Katharine; Paloski, William
2013-01-01
A software package reads and analyzes pressure distributions from sensors mounted under a person's feet. Pressure data from sensors mounted in shoes, or in a platform, can be used to provide a description of postural stability (ranging from competence to deficiency) and enables the determination of the person's present activity (running, walking, squatting, falling). This package has three parts: a preprocessing algorithm for reading input from pressure sensors; a Hidden Markov Model (HMM), which is used to determine the person's present activity and level of sensory-motor competence; and a suite of graphical algorithms, which allows visual representation of the person's activity and vestibular function over time.
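The package's states and parameters are not given in this record; to make the HMM stage concrete, here is a minimal Viterbi decoder over invented activity states and a toy pressure-level observation alphabet.

```python
# Minimal Viterbi decoding sketch for the activity-classification
# stage. States, observations, and probabilities are invented for
# illustration; they are not the actual model in the package.

states = ["standing", "walking"]
start_p = {"standing": 0.6, "walking": 0.4}
trans_p = {"standing": {"standing": 0.7, "walking": 0.3},
           "walking":  {"standing": 0.4, "walking": 0.6}}
emit_p = {"standing": {"low": 0.1, "high": 0.9},   # steady pressure
          "walking":  {"low": 0.6, "high": 0.4}}   # alternating load

def viterbi(obs):
    """Most likely activity sequence for an observation sequence."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            prob, prev = max(
                (V[-2][p] * trans_p[p][s] * emit_p[s][o], p) for p in states)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

print(viterbi(["high", "low", "low", "high"]))
```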
Yang, Deshan; Brame, Scott; El Naqa, Issam; Aditya, Apte; Wu, Yu; Goddu, S Murty; Mutic, Sasa; Deasy, Joseph O; Low, Daniel A
2011-01-01
Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse-consistency algorithms that provide a consistent displacement vector field in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers many options for DIR results visualization, evaluation, and validation. By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research.
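Inverse consistency, one of the DIR algorithm classes mentioned above, has a simple operational check: composing the forward displacement vector field (DVF) with the backward one should map each point approximately back onto itself. The NumPy sketch below demonstrates this check on a 1-D toy field, independent of any DIRART API.

```python
# Sketch of an inverse-consistency check for deformable registration:
# composing the forward DVF with the backward DVF should map each
# point (approximately) back onto itself. 1-D toy fields, not DIRART.

import numpy as np

x = np.linspace(0.0, 1.0, 101)           # voxel coordinates
u_fwd = 0.05 * np.sin(2 * np.pi * x)     # forward displacement u(x)
warped = x + u_fwd                       # forward-mapped positions

# Build an (approximate) backward field v on the grid by inverting
# the sampled mapping: v(x + u(x)) = -u(x).
v_bwd = np.interp(x, warped, -u_fwd)

# Inverse-consistency error: u(x) + v(x + u(x)) should be ~0.
ice = u_fwd + np.interp(warped, x, v_bwd)
print(f"max |ICE| = {np.abs(ice).max():.2e}")
```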
Physical and Biological Regulation of Carbon Sequestration in Tidal Marshes
NASA Astrophysics Data System (ADS)
Morris, J. T.; Callaway, J.
2017-12-01
The rate of carbon sequestration in tidal marshes is regulated by complex feedbacks among biological and physical factors, including the rate of sea-level rise (SLR), biomass production, tidal amplitude, and the concentration of suspended sediment. We used the Marsh Equilibrium Model (MEM) to explore the effects on C-sequestration across a wide range of permutations of these variables. C-sequestration increased with the rate of SLR to a maximum, then declined to a vanishing point at higher SLR, when marshes convert to mudflats. An acceleration in SLR will increase C-sequestration in marshes that can keep pace, but at high rates of SLR this is only possible with high biomass and suspended sediment concentrations. We found that there were no feasible solutions at SLR > 13 mm/yr for permutations of variables that characterize the great majority of tidal marshes, i.e., the equilibrium elevation exists below the lower vertical limit for survival of marsh vegetation. The rate of SLR resulting in maximum C-sequestration varies with biomass production. C-sequestration rates at SLR = 1 mm/yr averaged only 36 g C m⁻² yr⁻¹, but at the highest maximum biomass tested (5000 g/m²) the mean C-sequestration reached 399 g C m⁻² yr⁻¹ at SLR = 14 mm/yr. The empirical estimate of C-sequestration in a core dated to 50 years overestimates the theoretical long-term rate by 34% for realistic values of decomposition rate and belowground production. The overestimate of the empirical method arises from the live and decaying biomass contained within the carbon inventory above the marker horizon; overestimates were even greater for shorter surface cores.
Trade-based carbon sequestration accounting.
King, Dennis M
2004-04-01
This article describes and illustrates an accounting method to assess and compare "early" carbon sequestration investments and trades on the basis of the number of standardized CO2 emission offset credits they will provide. The "gold standard" for such credits is assumed to be a relatively riskless credit based on a CO2 emission reduction that provides offsets against CO2 emissions on a one-for-one basis. The number of credits associated with carbon sequestration needs to account for time, risk, durability, permanence, additionality, and other factors that future trade regulators will most certainly use to assign "official" credits to sequestration projects. The method that is presented here uses established principles of natural resource accounting and conventional rules of asset valuation to "score" projects. A review of 20 "early" voluntary United States based CO2 offset trades that involve carbon sequestration reveals that the assumptions that buyers, sellers, brokers, and traders are using to characterize the economic potential of their investments and trades vary enormously. The article develops a "universal carbon sequestration credit scoring equation" and uses two of these trades to illustrate the sensitivity of trade outcomes to various assumptions about how future trade auditors are likely to "score" carbon sequestration projects in terms of their "equivalency" with CO2 emission reductions. The article emphasizes the importance of using a standard credit scoring method that accounts for time and risk to assess and compare even unofficial prototype carbon sequestration trades. The scoring method illustrated in this article is a tool that can protect the integrity of carbon sequestration credit trading and can assist buyers and sellers in evaluating the real economic potential of prospective trades.
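The article's "universal carbon sequestration credit scoring equation" is not reproduced in this abstract; the sketch below implements only the generic principle it rests on, i.e., discounting sequestered tonnes for time, leakage, and reversal risk to express them as emission-reduction-equivalent credits. All parameter values are placeholders, and this is not King's published equation.

```python
# Hedged sketch of time- and risk-adjusted credit scoring in the
# spirit described above; NOT the published scoring equation, just
# the generic present-value principle it builds on.

def offset_credits(annual_tonnes, years, discount_rate,
                   leakage_risk, reversal_risk):
    """Risk-discounted present value of sequestered tonnes, expressed
    as CO2-emission-reduction-equivalent credits."""
    survival = 1.0
    credits = 0.0
    for t in range(1, years + 1):
        survival *= (1.0 - reversal_risk)     # chance carbon stays stored
        effective = annual_tonnes * (1.0 - leakage_risk) * survival
        credits += effective / (1.0 + discount_rate) ** t
    return credits

# Placeholder inputs: 1,000 t/yr for 40 years, 5% discount rate,
# 10% leakage risk, 1% annual reversal risk.
print(f"{offset_credits(1000, 40, 0.05, 0.10, 0.01):,.0f} credit-tonnes")
```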
Dynamics and climate change mitigation potential of soil organic carbon sequestration.
Sommer, Rolf; Bossio, Deborah
2014-11-01
When assessing soil organic carbon (SOC) sequestration and its climate change (CC) mitigation potential at global scale, the dynamic nature of soil carbon storage and interventions to foster it should be taken into account. Firstly, adoption of SOC-sequestration measures will take time, and reasonably such schemes could only be implemented gradually at large-scale. Secondly, if soils are managed as carbon sinks, then SOC will increase only over a limited time, up to the point when a new SOC equilibrium is reached. This paper combines these two processes and predicts potential SOC sequestration dynamics in agricultural land at global scale and the corresponding CC mitigation potential. Assuming that global governments would agree on a worldwide effort to gradually change land use practices towards turning agricultural soils into carbon sinks starting 2014, the projected 87-year (2014-2100) global SOC sequestration potential of agricultural land ranged between 31 and 64 Gt. This is equal to 1.9-3.9% of the SRES-A2 projected 87-year anthropogenic emissions. SOC sequestration would peak 2032-33, at that time reaching 4.3-8.9% of the projected annual SRES-A2 emission. About 30 years later the sequestration rate would have reduced by half. Thus, SOC sequestration is not a C wedge that could contribute increasingly to mitigating CC. Rather, the mitigation potential is limited, contributing very little to solving the climate problem of the coming decades. However, we deliberately did not elaborate on the importance of maintaining or increasing SOC for sustaining soil health, agro-ecosystem functioning and productivity; an issue of global significance that deserves proper consideration irrespectively of any potential additional sequestration of SOC. Copyright © 2014 Elsevier Ltd. All rights reserved.
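A toy version of the paper's two combined dynamics (gradual adoption of sequestration practices, and exponential approach of SOC to a new equilibrium on adopted land) can be sketched as below; all parameter values are invented for illustration and are not the paper's calibration.

```python
# Toy version of the two combined dynamics: (i) gradual adoption of
# sequestration practices and (ii) exponential approach of SOC to a
# new equilibrium on adopted land. All parameters are invented.

import math

YEARS = 87                 # 2014-2100 horizon used in the paper
RAMP = 20                  # years to full adoption (assumption)
DELTA_SOC = 40.0           # Gt extra SOC at new equilibrium (assumption)
K = math.log(2) / 30.0     # rate giving a ~30-yr half-approach

def cumulative_soc(t):
    """Gt sequestered by year t: sum over adoption cohorts."""
    total = 0.0
    for start in range(min(t, RAMP)):
        frac = 1.0 / RAMP              # cohort adopting in year `start`
        total += frac * DELTA_SOC * (1 - math.exp(-K * (t - start)))
    return total

series = [cumulative_soc(t) for t in range(YEARS + 1)]
rates = [series[t] - series[t - 1] for t in range(1, YEARS + 1)]
peak = max(range(len(rates)), key=rates.__getitem__) + 1
print(f"total by 2100: {series[-1]:.1f} Gt; peak rate in year {peak}")
```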
Near-term deployment of carbon capture and sequestration from biorefineries in the United States.
Sanchez, Daniel L; Johnson, Nils; McCoy, Sean T; Turner, Peter A; Mach, Katharine J
2018-05-08
Capture and permanent geologic sequestration of biogenic CO2 emissions may provide critical flexibility in ambitious climate change mitigation. However, most bioenergy with carbon capture and sequestration (BECCS) technologies are technically immature or commercially unavailable. Here, we evaluate low-cost, commercially ready CO2 capture opportunities for existing ethanol biorefineries in the United States. The analysis combines process engineering, spatial optimization, and lifecycle assessment to consider the technical, economic, and institutional feasibility of near-term carbon capture and sequestration (CCS). Our modeling framework evaluates least-cost source-sink relationships and aggregation opportunities for pipeline transport, which can cost-effectively transport small CO2 volumes to suitable sequestration sites; 216 existing US biorefineries emit 45 Mt CO2 annually from fermentation, of which 60% could be captured and compressed for pipeline transport for under $25/tCO2. A sequestration credit, analogous to existing CCS tax credits, of $60/tCO2 could incent 30 Mt of sequestration and 6,900 km of pipeline infrastructure across the United States. Similarly, a carbon abatement credit, analogous to existing tradeable CO2 credits, of $90/tCO2 can incent 38 Mt of abatement. Aggregation of CO2 sources enables cost-effective long-distance pipeline transport to distant sequestration sites. Financial incentives under the low-carbon fuel standard in California and recent revisions to existing federal tax credits suggest a substantial near-term opportunity to permanently sequester biogenic CO2. This financial opportunity could catalyze the growth of carbon capture, transport, and sequestration; improve the lifecycle impacts of conventional biofuels; support development of carbon-negative fuels; and help fulfill the mandates of low-carbon fuel policies across the United States. Copyright © 2018 the Author(s). Published by PNAS.
Software Framework for Controlling Unsupervised Scientific Instruments.
Schmid, Benjamin; Jahr, Wiebke; Weber, Michael; Huisken, Jan
2016-01-01
Science outreach and communication are gaining more and more importance for conveying the meaning of today's research to the general public. Public exhibitions of scientific instruments can provide hands-on experience with technical advances and their applications in the life sciences. The software of such devices, however, is oftentimes not appropriate for this purpose. In this study, we describe a software framework and the necessary computer configuration that is well suited for exposing a complex self-built and software-controlled instrument such as a microscope to laymen under limited supervision, e.g. in museums or schools. We identify several aspects that must be met by such software, and we describe a design that can simultaneously be used to control either (i) a fully functional instrument in a robust and fail-safe manner, (ii) an instrument that has low-cost or only partially working hardware attached for illustration purposes or (iii) a completely virtual instrument without hardware attached. We describe how to assess the educational success of such a device, how to monitor its operation and how to facilitate its maintenance. The introduced concepts are illustrated using our software to control eduSPIM, a fluorescent light sheet microscope that we are currently exhibiting in a technical museum.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCorkle, D.; Yang, C.; Jordan, T.
2007-06-01
Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot-scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not incorporate some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed in which process simulation information can be accessed. At the Department of Energy's (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus, wrapped by the CASI library developed by Reaction Engineering International, to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.
Functional Mobility Testing: A Novel Method to Create Suit Design Requirements
NASA Technical Reports Server (NTRS)
England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar L.
2008-01-01
This study was performed to aid in the creation of design requirements for the next generation of space suits that more accurately describe the level of mobility necessary for a suited crewmember through the use of an innovative methodology utilizing functional mobility. A novel method was utilized involving the collection of kinematic data while 20 subjects (10 male, 10 female) performed pertinent functional tasks that will be required of a suited crewmember during various phases of a lunar mission. These tasks were selected based on relevance and criticality from a larger list of tasks that may be carried out by the crew. Kinematic data was processed through Vicon BodyBuilder software to calculate joint angles for the ankle, knee, hip, torso, shoulder, elbow, and wrist. Maximum functional mobility was consistently lower than maximum isolated mobility. This study suggests that conventional methods for establishing design requirements for human-systems interfaces based on maximal isolated joint capabilities may overestimate the required mobility. Additionally, this method provides a valuable means of evaluating systems created from these requirements by comparing the mobility available in a new spacesuit, or the mobility required to use a new piece of hardware, to this newly established database of functional mobility.
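The joint-angle computation underlying such kinematic processing reduces to vector geometry on marker positions. The sketch below is illustrative only; the marker coordinates are made up and this is not the Vicon BodyBuilder model.

```python
# Hedged sketch of the kind of joint-angle computation the study describes
# (marker names and data are hypothetical).
import numpy as np

def joint_angle(proximal, joint, distal):
    """Included angle at `joint` (degrees) from three 3-D marker positions."""
    u = np.asarray(proximal) - np.asarray(joint)
    v = np.asarray(distal) - np.asarray(joint)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Example: shoulder, elbow, and wrist markers give the elbow flexion angle.
shoulder, elbow, wrist = [0.0, 0.0, 1.4], [0.0, 0.3, 1.1], [0.0, 0.6, 1.3]
print(f"elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} deg")
```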
36 CFR 230.40 - Eligible practices for cost-share assistance.
Code of Federal Regulations, 2011 CFR
2011-07-01
... regeneration or to ensure forest establishment and carbon sequestration. (3) Forest Stand Improvement—Practices to enhance growth and quality of wood fiber, special forest products, and carbon sequestration. (4... carbon sequestration in conjunction with agriculture, forest, and other land uses. (5) Water Quality...
Carbon sequestration and its role in the global carbon cycle
McPherson, Brian J.; Sundquist, Eric T.
2009-01-01
For carbon sequestration the issues of monitoring, risk assessment, and verification of carbon content and storage efficacy are perhaps the most uncertain. Yet these issues are also the most critical challenges facing the broader context of carbon sequestration as a means for addressing climate change. In response to these challenges, Carbon Sequestration and Its Role in the Global Carbon Cycle presents current perspectives and research that combine five major areas: • The global carbon cycle and verification and assessment of global carbon sources and sinks • Potential capacity and temporal/spatial scales of terrestrial, oceanic, and geologic carbon storage • Assessing risks and benefits associated with terrestrial, oceanic, and geologic carbon storage • Predicting, monitoring, and verifying effectiveness of different forms of carbon storage • Suggested new CO2 sequestration research and management paradigms for the future. The volume is based on a Chapman Conference and will appeal to the rapidly growing group of scientists and engineers examining methods for deliberate carbon sequestration through storage in plants, soils, the oceans, and geological repositories.
Faster Aerodynamic Simulation With Cart3D
NASA Technical Reports Server (NTRS)
2003-01-01
A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.
NASA Astrophysics Data System (ADS)
Wasser, Avi; Lincoln, Maya
In recent years, both practitioners and applied researchers have become increasingly interested in methods for integrating business process models and enterprise software systems through the deployment of enabling middleware. Integrative BPM research has mainly focused on the conversion of workflow notations into enacted application procedures, and less effort has been invested in enhancing the connectivity between design-level, non-workflow business process models and related enactment systems such as ERP, SCM, and CRM. This type of integration is useful at several stages of an IT system lifecycle, from design and implementation through change management, upgrades, and rollout. The paper presents an integration method that utilizes SOA for connecting business process models with corresponding enterprise software systems. The method is then demonstrated through an Oracle E-Business Suite procurement process and its ERP transactions.
Ffuzz: Towards full system high coverage fuzz testing on binary executables
2018-01-01
Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, like fuzz testing, symbolic execution, and manual analysis, each have advantages and disadvantages when exercising deeper code areas in binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug finding tool—Ffuzz—on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and avoid getting stuck in either fuzz testing or symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently. PMID:29791469
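The fuzzing half of such a hybrid can be illustrated with a toy coverage-guided mutation loop; the instrumented target and its "branches" below are hypothetical stand-ins, not Ffuzz itself.

```python
# Toy coverage-guided mutation loop illustrating the fuzzing component of a
# hybrid approach like Ffuzz's (target and coverage proxy are hypothetical).
import random

def target(data: bytes) -> frozenset:
    """Stand-in for an instrumented binary: returns 'covered branches'."""
    cov = set()
    if data[:1] == b"F":                 cov.add("hdr")
    if b"UZZ" in data:                   cov.add("magic")
    if len(data) > 8 and data[8] == 0:   cov.add("deep")  # hard-to-reach
    return frozenset(cov)

def mutate(seed: bytes) -> bytes:
    b = bytearray(seed or b"\x00")
    for _ in range(random.randint(1, 3)):
        i = random.randrange(len(b))
        b[i] = random.randrange(256)     # flip a random byte
    if random.random() < 0.3:
        b += bytes([random.randrange(256)])   # occasionally grow the input
    return bytes(b)

corpus, seen = [b"FUZZ"], set()
for _ in range(20000):
    cand = mutate(random.choice(corpus))
    cov = target(cand)
    if cov - seen:                       # keep inputs reaching new branches
        seen |= cov
        corpus.append(cand)
print("branches covered:", sorted(seen))
```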
Production roll out plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, D.E.
The Hanford Data Integration 2000 (HANDI 2000) Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract (PHMC). It is based on the Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, of Passport (PP) and PeopleSoft (PS) software, supports finance, supply, human resources, and payroll activities under the current PHMC direction. The PP software is an integrated application for Accounts Payable, Contract Management, Inventory Management, Purchasing and Material Safety Data Sheets (MSDS). The PS software is an integrated application for Projects, General Ledger, Human Resources Training, Payroll, and Base Benefits. This set of software constitutes the Business Management System (BMS) and MSDS, a subset of the HANDI 2000 suite of systems. The primary objective of the Production Roll Out Plan is to communicate the methods and schedules for implementation and roll out to end users of BMS.
Software and languages for microprocessors
NASA Astrophysics Data System (ADS)
Williams, David O.
1986-08-01
This paper forms the basis for lectures given at the 6th Summer School on Computing Techniques in Physics, organised by the Computational Physics group of the European Physics Society, and held at the Hotel Ski, Nové Město na Moravě, Czechoslovakia, on 17-26 September 1985. Various types of microprocessor applications are discussed and the main emphasis of the paper is devoted to 'embedded' systems, where the software development is not carried out on the target microprocessor. Some information is provided on the general characteristics of microprocessor hardware. Various types of microprocessor operating system are compared and contrasted. The selection of appropriate languages and software environments for use with microprocessors is discussed. Mechanisms for interworking between different languages, including reasonable error handling, are treated. The CERN developed cross-software suite for the Motorola 68000 family is described. Some remarks are made concerning program tools applicable to microprocessors. PILS, a Portable Interactive Language System, which can be interpreted or compiled for a range of microprocessors, is described in some detail, and the implementation techniques are discussed.
Andreadis, Konstantinos M; Das, Narendra; Stampoulis, Dimitrios; Ines, Amor; Fisher, Joshua B; Granger, Stephanie; Kawata, Jessie; Han, Eunjin; Behrangi, Ali
2017-01-01
The Regional Hydrologic Extremes Assessment System (RHEAS) is a prototype software framework for hydrologic modeling and data assimilation that automates the deployment of water resources nowcasting and forecasting applications. A spatially-enabled database is a key component of the software that can ingest a suite of satellite and model datasets while facilitating the interfacing with Geographic Information System (GIS) applications. The datasets ingested are obtained from numerous space-borne sensors and represent multiple components of the water cycle. The object-oriented design of the software allows for modularity and extensibility, showcased here with the coupling of the core hydrologic model with a crop growth model. RHEAS can exploit multi-threading to scale with increasing number of processors, while the database allows delivery of data products and associated uncertainty through a variety of GIS platforms. A set of three example implementations of RHEAS in the United States and Kenya are described to demonstrate the different features of the system in real-world applications. PMID:28545077
Development of an e-VLBI Data Transport Software Suite with VDIF
NASA Technical Reports Server (NTRS)
Sekido, Mamoru; Takefuji, Kazuhiro; Kimura, Moritaka; Hobiger, Thomas; Kokado, Kensuke; Nozawa, Kentarou; Kurihara, Shinobu; Shinno, Takuya; Takahashi, Fujinobu
2010-01-01
We have developed a software library (KVTP-lib) for VLBI data transmission over the network with the VDIF (VLBI Data Interchange Format), which is the newly proposed standard VLBI data format designed for electronic data transfer over the network. The software package keeps the application layer (VDIF frame) and the transmission layer separate, so that each layer can be developed efficiently. The real-time VLBI data transmission tool sudp-send is an application tool based on the KVTP-lib library. sudp-send captures the VLBI data stream from the VSI-H interface with the K5/VSI PC-board and writes the data to file in standard Linux file format or transmits it to the network using the simple-UDP (SUDP) protocol. Another tool, sudp-recv, receives the data stream from the network and writes the data to file in a specific VLBI format (K5/VSSP, VDIF, or Mark 5B). This software system has been implemented on the Wettzell-Tsukuba baseline; evaluation before operational employment is under way.
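The sender/receiver split maps naturally onto a pair of small UDP routines. The sketch below is a simplification for illustration: its 8-byte header (frame counter plus payload length) is not the actual 32-byte VDIF header, and the function names are invented.

```python
# Minimal sketch of streaming framed VLBI-style data over UDP (simplified
# header, not the real VDIF layout; names hypothetical).
import socket
import struct

HDR = struct.Struct("!II")          # frame number, payload length in bytes

def send_frames(payloads, addr=("127.0.0.1", 50000)):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for n, payload in enumerate(payloads):
        sock.sendto(HDR.pack(n, len(payload)) + payload, addr)
    sock.close()

def recv_frames(count, addr=("127.0.0.1", 50000)):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(addr)
    frames = {}
    for _ in range(count):
        pkt, _src = sock.recvfrom(65536)
        n, length = HDR.unpack(pkt[:HDR.size])
        frames[n] = pkt[HDR.size:HDR.size + length]   # reorder by frame number
    sock.close()
    return [frames[k] for k in sorted(frames)]
```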
Validation of software for calculating the likelihood ratio for parentage and kinship.
Drábek, J
2009-03-01
Although the likelihood ratio is a well-known statistical technique, commercial off-the-shelf (COTS) software products for its calculation are not sufficiently validated to suit general requirements for the competence of testing and calibration laboratories (EN/ISO/IEC 17025:2005 norm) per se. The software in question can be considered critical, as it directly weighs the forensic evidence allowing judges to decide on guilt or innocence or to identify a person or kin (i.e., in mass fatalities). For these reasons, accredited laboratories shall validate likelihood ratio software in accordance with the above norm. To validate software for calculating the likelihood ratio in parentage/kinship scenarios, I assessed available vendors, chose two programs (Paternity Index and Familias) for testing, and finally validated them using tests derived from elaboration of the available guidelines for the fields of forensics, biomedicine, and software engineering. MS Excel calculations using known likelihood ratio formulas or peer-reviewed results of difficult paternity cases were used as a reference. Using seven testing cases, it was found that both programs satisfied the requirements for basic paternity cases. However, only a combination of the two software programs fulfills the criteria needed for our purpose across the whole spectrum of functions under validation, with the exception of providing algebraic formulas in cases of mutation and/or silent alleles.
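For a single locus with an unambiguous obligate paternal allele, the paternity index reduces to a one-line formula, the kind of reference calculation the validation reproduced in MS Excel. A worked sketch with hypothetical allele frequencies:

```python
# Worked single-locus example of the likelihood ratio (paternity index) such
# software computes; the formula is the standard trio case, the frequencies
# are hypothetical.
def paternity_index(af_genotype, obligate_allele, allele_freq):
    """PI = P(paternal allele | alleged father) / P(paternal allele | random man).

    af_genotype: alleged father's two alleles, e.g. ("A", "B")
    obligate_allele: paternal allele the child must have received
    allele_freq: population frequency of that allele
    """
    x = af_genotype.count(obligate_allele) / 2.0   # 1.0 homozygous, 0.5 het
    return x / allele_freq

# Alleged father A/B, obligate paternal allele A with frequency 0.1:
print(paternity_index(("A", "B"), "A", 0.1))   # 0.5 / 0.1 = 5.0
# The combined LR across independent loci is the product of per-locus PIs.
```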
De Biase, Pablo M; Markosyan, Suren; Noskov, Sergei
2015-02-05
The transport of ions and solutes by biological pores is central to cellular processes and has a variety of applications in modern biotechnology. The time scale involved in polymer transport across a nanopore is beyond the reach of conventional MD simulations. Moreover, experimental studies lack sufficient resolution to provide details on the molecular underpinnings of the transport mechanisms. BROMOC, the code presented herein, performs Brownian dynamics simulations, both serial and parallel, up to several milliseconds long. BROMOC can be used to model large biological systems. The IMC-MACRO software allows for the development of effective potentials for solute-ion interactions based on radial distribution functions from all-atom MD. The BROMOC Suite also provides a versatile set of tools for a wide variety of preprocessing and post-simulation analyses. We illustrate a potential application with ion and ssDNA transport in the MspA nanopore. © 2014 Wiley Periodicals, Inc.
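The core update that Brownian dynamics codes of this kind iterate is the overdamped Langevin (Euler-Maruyama) step. A one-particle sketch with illustrative parameters, not BROMOC's actual algorithm:

```python
# One-particle sketch of a Brownian dynamics update (overdamped Langevin,
# Euler-Maruyama); parameters are illustrative, not BROMOC's.
import numpy as np

kT = 2.494       # kJ/mol at 300 K
D = 0.1          # nm^2/ns, ion diffusion coefficient
dt = 1e-3        # ns
k_spring = 50.0  # kJ/mol/nm^2, toy confining potential U = 0.5*k*x^2

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 0.0])    # nm, starting position
for _ in range(100_000):
    force = -k_spring * x        # F = -dU/dx
    # drift from the force plus Gaussian diffusion noise
    x = x + (D / kT) * force * dt + np.sqrt(2 * D * dt) * rng.normal(size=3)
print("final position (nm):", np.round(x, 3))
```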
Algorithm and Architecture Independent Benchmarking with SEAK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tallent, Nathan R.; Manzano Franco, Joseph B.; Gawande, Nitin A.
2016-05-23
Many applications of high performance embedded computing are limited by performance or power bottlenecks. We have designed the Suite for Embedded Applications & Kernels (SEAK), a new benchmark suite, (a) to capture these bottlenecks in a way that encourages creative solutions; and (b) to facilitate rigorous, objective, end-user evaluation for their solutions. To avoid biasing solutions toward existing algorithms, SEAK benchmarks use a mission-centric (abstracted from a particular algorithm) and goal-oriented (functional) specification. To encourage solutions that are any combination of software or hardware, we use an end-user black-box evaluation that can capture tradeoffs between performance, power, accuracy, size, and weight. The tradeoffs are especially informative for procurement decisions. We call our benchmarks future proof because each mission-centric interface and evaluation remains useful despite shifting algorithmic preferences. It is challenging to create both concise and precise goal-oriented specifications for mission-centric problems. This paper describes the SEAK benchmark suite and presents an evaluation of sample solutions that highlights power and performance tradeoffs.
NASA Astrophysics Data System (ADS)
Erhart, Eva; Schmid, Harald; Hülsbergen, Kurt-Jürgen; Hartl, Wilfried
2015-04-01
Humus and energy balances and greenhouse gas emissions with compost fertilization in organic farming compared with mineral fertilization. The positive effects of compost fertilization on soil humus, with their associated benefits for soil quality, are well established. The aim of the present study was to assess the effect of compost fertilization on humus and energy balances and greenhouse gas emissions, and to compare the results of the humus balances with the changes in soil organic carbon contents measured in the soil of the experimental field. In order to assess the effects of compost use in organic farming as compared to conventional farming practice using mineral fertilizers, the field experiment with compost fertilization 'STIKO' was set up in 1992 near Vienna, Austria, on a Molli-gleyic Fluvisol. It included three treatments with compost fertilization (C1, C2 and C3 with 8, 14 and 20 t ha-1 y-1 f. m. on average over 14 years), three treatments with mineral nitrogen fertilization (N1, N2 and N3 with 29, 46 and 63 kg N ha-1 y-1 on average) and an unfertilized control (0) in six replications in a Latin rectangle design. In the field trial, biowaste compost from the composting plant of the City of Vienna was used. Data from the field experiment (from 14 experimental years) were fed into the model software REPRO to calculate humus and energy balances and greenhouse gas emissions. The model software REPRO (REPROduction of soil fertility) couples the balancing of C, N and energy fluxes. For the determination of the net greenhouse effect, REPRO performs calculations of C sequestration in the soil, CO2 emissions from the use of fossil energy, and N2O emissions from the soil. Humus balances showed that compost fertilization at a rate of 8 t ha-1 y-1 (C1) resulted in a positive humus balance of +115 kg C ha-1 y-1. With 14 and 20 t ha-1 y-1 compost (C2 and C3), respectively, humus accumulated at rates of 558 and 1021 kg C ha-1 y-1. With mineral fertilization at rates of 29-63 kg N ha-1 y-1 (N1-N3), balances were moderately negative (-169 to -227 kg C ha-1 y-1), while a clear humus deficit of 457 kg C ha-1 y-1 was observed in the unfertilized control. Compared with measured soil organic carbon data, REPRO predicted soil organic carbon contents fairly well, with the exception of the treatments with high compost rates, for which REPRO clearly overestimated soil organic carbon contents at this site. Energy efficiency, as described by the output/input ratio, was highest in the control, followed by C1. Mineral fertilization treatment N3 was the most energy-intensive. The greenhouse gas balance indicated net carbon sequestration already at medium compost rates (C2), and net carbon sequestration of 1700 kg CO2-eq ha-1 y-1 in C3. Mineral fertilization yielded net greenhouse gas emissions of around 2000 kg CO2-eq ha-1 y-1. The unfertilized control had the highest greenhouse gas emissions, due to the degradation of soil organic matter and the lowest organic matter input. These findings underline that compost fertilization holds a high potential for carbon sequestration and for the reduction of greenhouse gas emissions.
Commanding and Controlling Satellite Clusters (IEEE Intelligent Systems, November/December 2000)
2000-01-01
...a real-time operating system, a message-passing OS well suited for distributed processing. Ground and flight processors run ObjectAgent and the Space Command Language (SCL) on a real-time operating system alongside the TS-21 relational database management system (RDMS). ...an engineer with Princeton Satellite Systems. She is working with others to develop ObjectAgent software to run on the OSE Real-Time Operating System.
ERIC Educational Resources Information Center
Coleman, Mari Beth; Cherry, Rebecca A.; Moore, Tara C.; Park, Yujeong; Cihak, David F.
2015-01-01
The purpose of this study was to compare the effects of teacher-directed simultaneous prompting to computer-assisted simultaneous prompting for teaching sight words to 3 elementary school students with intellectual disability. Activities in the computer-assisted condition were designed with Intellitools Classroom Suite software whereas traditional…
Review of Winograd and Flores’ Understanding Computers and Cognition: A Favorable Interpretation.
1986-07-01
...very little basis for determining the practical limits of formalization, particularly for applications of artificial intelligence to...
Teaching the Teacher: Tutoring SimStudent Leads to More Effective Cognitive Tutor Authoring
ERIC Educational Resources Information Center
Matsuda, Noboru; Cohen, William W.; Koedinger, Kenneth R.
2015-01-01
SimStudent is a machine-learning agent initially developed to help novice authors to create cognitive tutors without heavy programming. Integrated into an existing suite of software tools called Cognitive Tutor Authoring Tools (CTAT), SimStudent helps authors to create an expert model for a cognitive tutor by tutoring SimStudent on how to solve…
Framework for Flexible Security in Group Communications
NASA Technical Reports Server (NTRS)
McDaniel, Patrick; Prakash, Atul
2006-01-01
The Antigone software system defines a framework for the flexible definition and implementation of security policies in group communication systems. Antigone does not dictate the available security policies, but provides high-level mechanisms for implementing them. A central element of the Antigone architecture is a suite of such mechanisms comprising micro-protocols that provide the basic services needed by secure groups.
An Investigation of Alerting and Prioritization Criteria for Sense and Avoid (SAA)
2013-10-01
Technical Report RDMR-TM-13-01.
Automated Sensitivity Analysis of Interplanetary Trajectories for Optimal Mission Design
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software and simplify the process of performing sensitivity analysis, and it was ultimately found to outperform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
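The pattern PEATSA automates, sweeping perturbed cases through an optimizer in parallel and harvesting the results, can be sketched as follows; the objective function is a dummy stand-in for EMTG, and all names are hypothetical.

```python
# Sketch of an automated trade-study loop: run an optimizer over a grid of
# perturbed cases in parallel and collect results (stand-in, not EMTG/PEATSA).
import itertools
from concurrent.futures import ProcessPoolExecutor

def run_case(case):
    launch_shift_days, c3_margin = case
    # ... a real tool would write an input deck and invoke the optimizer ...
    cost = abs(launch_shift_days) * 0.01 + c3_margin * 0.3   # dummy objective
    return case, cost

if __name__ == "__main__":
    cases = list(itertools.product(range(-10, 11, 5), [0.0, 0.5, 1.0]))
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_case, cases))
    best_case, best_cost = min(results, key=lambda r: r[1])
    print("best case:", best_case, "cost:", best_cost)
```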
75 FR 33613 - Notice of the Carbon Sequestration-Geothermal Energy-Science Joint Workshop
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-14
... Sequestration--Geothermal Energy--Science Joint Workshop AGENCY: Office of Energy Efficiency and Renewable Energy, DOE. ACTION: Notice of the Carbon Sequestration--Geothermal Energy--Science Joint Workshop... Carbon Storage and Geothermal Energy, June 15-16, 2010. Experts from industry, academia, national labs...
NASA Astrophysics Data System (ADS)
Xue, L.; Firl, G.; Zhang, M.; Jimenez, P. A.; Gill, D.; Carson, L.; Bernardet, L.; Brown, T.; Dudhia, J.; Nance, L. B.; Stark, D. R.
2017-12-01
The Global Model Test Bed (GMTB) has been established to support the evolution of atmospheric physical parameterizations in NCEP global modeling applications. To accelerate the transition to the Next Generation Global Prediction System (NGGPS), a collaborative model development framework known as the Common Community Physics Package (CCPP) has been created within the GMTB to facilitate engagement from the broad community on physics experimentation and development. A key component of this Research to Operations (R2O) software framework is the Interoperable Physics Driver (IPD), which connects the physics parameterizations on one end to the dynamical cores on the other with minimum implementation effort. To initiate the CCPP, scientists and engineers from the GMTB separated and refactored the GFS physics. This exercise demonstrated the process of creating IPD-compliant code and can serve as an example for other physics schemes to do the same and be considered for inclusion in the CCPP. Further benefits of this process include run-time physics suite configuration and considerably reduced effort for testing modifications to physics suites through GMTB's physics test harness. The implementation will be described and preliminary results will be presented at the conference.
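The driver idea, that any scheme conforming to one call signature can be composed into a run-time-configured suite, can be sketched in a few lines. Names and signatures below are illustrative, not the actual IPD/CCPP API.

```python
# Conceptual sketch of an interoperable-physics-driver pattern: schemes that
# share one call signature are registered and composed at run time.
state = {"T": 287.0, "qv": 0.01}    # minimal model state (K, kg/kg)

REGISTRY = {}

def physics_scheme(name):
    """Decorator that registers a scheme under a run-time-selectable name."""
    def register(fn):
        REGISTRY[name] = fn
        return fn
    return register

@physics_scheme("boundary_layer")
def simple_pbl(state, dt):
    state["T"] += 0.1 * dt           # placeholder heating tendency

@physics_scheme("microphysics")
def simple_micro(state, dt):
    state["qv"] = max(0.0, state["qv"] - 1e-5 * dt)   # placeholder sink

def run_suite(suite, state, dt):
    """The 'driver': applies a configured list of schemes in order."""
    for name in suite:
        REGISTRY[name](state, dt)

run_suite(["boundary_layer", "microphysics"], state, dt=1.0)
print(state)
```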
CORE (Common Operating Response Environment) Software Technology Suite
Gelston, Gariann; Rohlfing, Kerrie
2018-05-30
Agencies that oversee complex, multi-stakeholder programs need efficient, secure ways to link people and knowledge within and across organizations. The Common Operating Response Environment (CORE), a software suite developed by PNNL researchers, does just that. The CORE tool, which is customizable for a multitude of uses, facilitates situational awareness by integrating diverse data streams without the need to reformat them, summarizing that information, and providing users with the information they need to rapidly understand and appropriately respond to situations. It is mobile device-ready, has a straightforward interface for ease of use across organizations and skill sets, and is highly configurable to the needs of each specific user, whether they require data summaries for high-level decision makers or tactical maps, operational data, or weather information for responders in the field. Information can be input into CORE and queried in a variety of ways, using customized forms, reports, visuals, or other organizational templates, according to the needs of each user's organization, teams, and business processes. CORE data forms, for instance, could be accessed and used in real time to capture information about vessels being inspected for nuclear material.
InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.
Schenkelberg, Christian D; Bystroff, Christopher
2015-12-15
Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Božičević, Alen; Dobrzyński, Maciej; De Bie, Hans; Gafner, Frank; Garo, Eliane; Hamburger, Matthias
2017-12-05
The technological development of LC-MS instrumentation has led to significant improvements in performance and sensitivity, enabling high-throughput analysis of complex samples, such as plant extracts. Most software suites allow preprocessing of LC-MS chromatograms to obtain comprehensive information on single constituents. However, more advanced processing needs, such as the systematic and unbiased comparative metabolite profiling of large numbers of complex LC-MS chromatograms, remain a challenge. Currently, users have to rely on different tools to perform such data analyses. We developed a two-step protocol comprising a comparative metabolite profiling tool integrated in the ACD/MS Workbook Suite, and a web platform developed in the R language designed for clustering and visualization of chromatographic data. Initially, all relevant chromatographic and spectroscopic data (retention time, molecular ions with the respective ion abundance, and sample names) are automatically extracted and assembled in an Excel spreadsheet. The file is then loaded into an online web application that includes various statistical algorithms and provides the user with tools to compare and visualize the results in intuitive 2D heatmaps. We applied this workflow to LC-ESIMS profiles obtained from 69 honey samples. Within a few hours of calculation with a standard PC, honey samples were preprocessed and organized in clusters based on their metabolite profile similarities, thereby highlighting the common metabolite patterns and distributions among samples. Implementation in the ACD/Laboratories software package enables subsequent integration of other analytical data and in silico prediction tools for modern drug discovery.
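The clustering-and-heatmap step can be sketched on a synthetic feature table of the same shape (69 samples by aligned features); this is an illustration with SciPy and matplotlib, not the R web application itself.

```python
# Sketch of hierarchical clustering plus heatmap on a samples-by-features
# table like the one the workflow exports; the data here are synthetic.
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, leaves_list

rng = np.random.default_rng(1)
# Three synthetic groups of 23 samples each (69 total), 40 aligned features.
profiles = np.vstack([rng.normal(m, 0.3, (23, 40)) for m in (0.0, 1.0, 2.0)])

order = leaves_list(linkage(profiles, method="ward"))   # cluster the samples
plt.imshow(profiles[order], aspect="auto", cmap="viridis")
plt.xlabel("aligned m/z-RT feature")
plt.ylabel("honey sample (clustered order)")
plt.colorbar(label="ion abundance (a.u.)")
plt.savefig("heatmap.png", dpi=150)
```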
Greenwald, William W; Li, He; Smith, Erin N; Benaglio, Paola; Nariai, Naoki; Frazer, Kelly A
2017-04-07
Genomic interaction studies use next-generation sequencing (NGS) to examine the interactions between two loci on the genome, with subsequent bioinformatics analyses typically including annotation, intersection, and merging of data from multiple experiments. While many file types and analysis tools exist for storing and manipulating single-locus NGS data, there is currently no file standard or analysis tool suite for manipulating and storing paired-genomic-loci: the data type resulting from "genomic interaction" studies. As genomic interaction sequencing data are becoming prevalent, a standard file format and tools for working with these data conveniently and efficiently are needed. This article details a file standard and novel software tool suite for working with paired-genomic-loci data. We present the paired-genomic-loci (PGL) file standard for genomic-interactions data, and the accompanying analysis tool suite "pgltools": a cross-platform, PyPy-compatible python package available both as an easy-to-use UNIX package and as a python module, for integration into pipelines of paired-genomic-loci analyses. Pgltools is a freely available, open source tool suite for manipulating paired-genomic-loci data. Source code, an in-depth manual, and a tutorial are available publicly at www.github.com/billgreenwald/pgltools, and a python module of the operations can be installed from PyPI via the PyGLtools module.
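The defining operation on paired-genomic-loci data, intersection requiring both anchors to overlap, can be illustrated in plain Python; this sketch shows the concept, not the pgltools API.

```python
# Illustration of the core paired-genomic-loci operation (concept only, not
# the pgltools API): two contacts match when *both* anchors overlap.
from collections import namedtuple

Locus = namedtuple("Locus", "chrom start end")
PGL = namedtuple("PGL", "a b")     # a pair of loci, anchors A and B

def overlaps(x: Locus, y: Locus) -> bool:
    return x.chrom == y.chrom and x.start < y.end and y.start < x.end

def intersect(set1, set2):
    """Pairs in set1 whose both anchors overlap some pair in set2."""
    return [p for p in set1
            if any(overlaps(p.a, q.a) and overlaps(p.b, q.b) for q in set2)]

hic = [PGL(Locus("chr1", 1000, 2000), Locus("chr1", 50000, 51000))]
chia = [PGL(Locus("chr1", 1500, 2500), Locus("chr1", 50500, 52000))]
print(intersect(hic, chia))        # both anchors overlap -> pair is kept
```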
The ESA's Space Trajectory Analysis software suite
NASA Astrophysics Data System (ADS)
Ortega, Guillermo
The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of STA's aims is to promote the exchange of technical ideas and raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at the university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include, among others, ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, and rendezvous trajectories. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at the university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board: together with ESA, each university has a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft using the included orbital propagators. STA is able to compute communication links between objects of a scenario (coverage, line of sight) and to represent the trajectory computations and relationships between objects in 2D and 3D formats. Further, the article explains that the STA development is open source and is based on state-of-the-art astrodynamics routines that are grouped into modules. The modules are programmed in the C++ language. The different STA modules are designed, developed, tested, and verified by the different universities. Software integration and overall validation are performed by ESA. Students are chosen to work on STA modules as part of their Master or PhD thesis programs. As part of their growing experience, the students learn how to write documentation for a space project using European Cooperation for Space Standardization (ECSS) standards, how to test and verify the software modules they write, and how to interact with ESA and each other in this process. Finally, the article concludes with the benefits of the STA initiative. The STA project creates a strong link among the applied mathematics, space engineering, and informatics disciplines by reinforcing the academic community with requirements coming from the real needs and missions of space agencies and industry.
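At the core of any such suite is an orbit propagator. A tiny two-body RK4 sketch, a simple placeholder for STA's actual propagators, with standard Earth parameters:

```python
# Tiny example of the kind of orbit propagation at a trajectory suite's core:
# two-body dynamics integrated with RK4 (placeholder, not STA's code).
import numpy as np

MU = 398_600.4418  # km^3/s^2, Earth gravitational parameter

def deriv(state):
    r, v = state[:3], state[3:]
    return np.concatenate([v, -MU * r / np.linalg.norm(r) ** 3])

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Circular LEO: r = 7000 km, v = sqrt(mu/r); period is about 97 minutes.
state = np.array([7000.0, 0.0, 0.0, 0.0, np.sqrt(MU / 7000.0), 0.0])
for _ in range(6000):              # ~100 minutes at 1 s steps
    state = rk4_step(state, 1.0)
print("position after 100 min (km):", np.round(state[:3], 1))
```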
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
... Charles Carbon Capture and Sequestration Project, Lake Charles, LA AGENCY: Department of Energy. ACTION... competitive process under the Industrial Carbon Capture and Sequestration (ICCS) Program. The Lake Charles Carbon Capture and Sequestration Project (Lake Charles CCS Project) would demonstrate: (1) advanced...
None
2017-12-09
NETL's Carbon Sequestration Program is helping to develop technologies to capture, purify, and store carbon dioxide (CO2) in order to reduce greenhouse gas emissions without adversely influencing energy use or hindering economic growth. Carbon sequestration technologies capture and store CO2 that would otherwise reside in the atmosphere for long periods of time.
Carbon dynamics and sequestration in urban turfgrass ecosystems
USDA-ARS?s Scientific Manuscript database
Urbanization is a global trend. Turfgrass covers 1.9% of land in the continental US. Here we review existing literature associated with carbon (C) pools, sequestration, and nitrous oxide emission of urban turfgrass ecosystems. Turfgrasses exhibit significant carbon sequestration (0.34–1.4 Mg ha-1 ye...
From sink to source: Regional variation in U.S. forest carbon futures
Wear, David N.; Coulston, John W.
2015-01-01
The sequestration of atmospheric carbon (C) in forests has partially offset C emissions in the United States (US) and might reduce overall costs of achieving emission targets, especially while transportation and energy sectors are transitioning to lower-carbon technologies. Using detailed forest inventory data for the conterminous US, we estimate forests’ current net sequestration of atmospheric C to be 173 Tg yr−1, offsetting 9.7% of C emissions from transportation and energy sources. Accounting for multiple driving variables, we project a gradual decline in the forest C emission sink over the next 25 years (to 112 Tg yr−1) with regional differences. Sequestration in eastern regions declines gradually while sequestration in the Rocky Mountain region declines rapidly and could become a source of atmospheric C due to disturbances such as fire and insect epidemics. C sequestration in the Pacific Coast region stabilizes as forests harvested in previous decades regrow. Scenarios simulating climate-induced productivity enhancement and afforestation policies increase sequestration rates, but would not fully offset declines from aging and forest disturbances. Separating C transfers associated with land use changes from sequestration clarifies forests’ role in reducing net emissions and demonstrates that retention of forest land is crucial for protecting or enhancing sink strength. PMID:26558439
NASA Astrophysics Data System (ADS)
David, Peter H.; Hommel, Marcel; Miller, Louis H.; Udeinya, Iroka J.; Oligino, Lynette D.
1983-08-01
Sequestration, the adherence of infected erythrocytes containing late developmental stages of the parasite (trophozoites and schizonts) to the endothelium of capillaries and venules, is characteristic of Plasmodium falciparum infections. We have studied two host factors, the spleen and antibody, that influence sequestration of P. falciparum in the squirrel monkey. Sequestration of trophozoite/schizont-infected erythrocytes that occurs in intact animals is reduced in splenectomized animals; in vitro, when infected blood is incubated with monolayers of human melanoma cells, trophozoite/schizont-infected erythrocytes from intact animals but not from splenectomized animals bind to the melanoma cells. The switch in cytoadherence characteristics of the infected erythrocytes from nonbinding to binding occurs with a cloned parasite. Immune serum can inhibit and reverse in vitro binding to melanoma cells of infected erythrocytes from intact animals. Similarly, antibody can reverse in vivo sequestration as shown by the appearance of trophozoite/schizont-infected erythrocytes in the peripheral blood of an intact animal after inoculation with immune serum. These results indicate that the spleen modulates the expression of parasite alterations of the infected erythrocyte membrane responsible for sequestration and suggest that the prevention and reversal of sequestration could be one of the effector mechanisms involved in antibody-mediated protection against P. falciparum malaria.
Software Measurement Guidebook
NASA Technical Reports Server (NTRS)
1995-01-01
This Software Measurement Guidebook is based on the extensive experience of several organizations that have each developed and applied significant measurement programs over a period of at least 10 years. The lessons derived from those experiences reflect not only successes but also failures. By applying those lessons, an organization can minimize, or at least reduce, the time, effort, and frustration of introducing a software measurement program. The Software Measurement Guidebook is aimed at helping organizations to begin or improve a measurement program. It does not provide guidance for the extensive application of specific measures (such as how to estimate software cost or analyze software complexity) other than by providing examples to clarify points. It does contain advice for establishing and using an effective software measurement program and for understanding some of the key lessons that other organizations have learned. Some of that advice will appear counterintuitive, but it is all based on actual experience. Although all of the information presented in this guidebook is derived from specific experiences of mature measurement programs, the reader must keep in mind that the characteristics of every organization are unique. Some degree of measurement is critical for all software development and maintenance organizations, and most of the key rules captured in this report will be generally applicable. Nevertheless, each organization must strive to understand its own environment so that the measurement program can be tailored to suit its characteristics and needs.
CCP4i2: the new graphical user interface to the CCP4 program suite.
Potterton, Liz; Agirre, Jon; Ballard, Charles; Cowtan, Kevin; Dodson, Eleanor; Evans, Phil R; Jenkins, Huw T; Keegan, Ronan; Krissinel, Eugene; Stevenson, Kyle; Lebedev, Andrey; McNicholas, Stuart J; Nicholls, Robert A; Noble, Martin; Pannu, Navraj S; Roth, Christian; Sheldrick, George; Skubak, Pavol; Turkenburg, Johan; Uski, Ville; von Delft, Frank; Waterman, David; Wilson, Keith; Winn, Martyn; Wojdyr, Marcin
2018-02-01
The CCP4 (Collaborative Computational Project, Number 4) software suite for macromolecular structure determination by X-ray crystallography brings together many programs and libraries that, by means of well established conventions, interoperate effectively without adhering to strict design guidelines. Because of this inherent flexibility, users are often presented with diverse, even divergent, choices for solving every type of problem. Recently, CCP4 introduced CCP4i2, a modern graphical interface designed to help structural biologists to navigate the process of structure determination, with an emphasis on pipelining and the streamlined presentation of results. In addition, CCP4i2 provides a framework for writing structure-solution scripts that can be built up incrementally to create increasingly automatic procedures.
Online Monitoring of Induction Motors
DOE Office of Scientific and Technical Information (OSTI.GOV)
McJunkin, Timothy R.; Agarwal, Vivek; Lybeck, Nancy Jean
2016-01-01
The online monitoring of active components project, under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program, researched diagnostic and prognostic models for alternating current induction motors (IMs). Idaho National Laboratory (INL) worked with the Electric Power Research Institute (EPRI) to augment and revise the fault signatures previously implemented in the Asset Fault Signature Database of EPRI's Fleet Wide Prognostic and Health Management (FW PHM) Suite software. Induction motor diagnostic models were researched using the experimental data collected by Idaho State University. Prognostic models were explored in the literature and through a limited experiment with a 40 hp motor to seed the Remaining Useful Life Database of the FW PHM Suite.
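One classic induction-motor fault signature of the kind such databases encode is the pair of sidebands that broken rotor bars produce at (1 ± 2s)·f_supply in the stator-current spectrum. A synthetic-signal sketch, with all parameters illustrative:

```python
# Motor current signature analysis sketch: find broken-rotor-bar sidebands
# at (1 +/- 2s)*f_supply in a synthetic stator-current spectrum.
import numpy as np

fs, f0, slip = 10_000.0, 60.0, 0.03          # sample rate, supply Hz, slip
t = np.arange(0, 10, 1 / fs)
current = (np.sin(2 * np.pi * f0 * t)
           + 0.02 * np.sin(2 * np.pi * (1 - 2 * slip) * f0 * t)   # fault
           + 0.02 * np.sin(2 * np.pi * (1 + 2 * slip) * f0 * t)   # sidebands
           + 0.005 * np.random.default_rng(2).normal(size=t.size))

spec = np.abs(np.fft.rfft(current * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 50) & (freqs < 70)
for f, a in zip(freqs[band], spec[band]):
    if a > 0.01 * spec.max():                # report significant peaks only
        print(f"{f:6.2f} Hz  amplitude {a:.1f}")
```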
NASA Astrophysics Data System (ADS)
Minasny, Budiman; van Wesemael, Bas
2017-04-01
The '4 per mille Soils for Food Security and Climate' initiative was launched at COP21, aiming to increase global soil organic matter stocks by 4 per mille (or 0.4%) per year as a compensation for the global emissions of greenhouse gases by anthropogenic sources. This paper surveyed the soil organic carbon (SOC) stock estimates and sequestration potentials from 20 regions in the world (New Zealand, Chile, South Africa, Australia, Tanzania, Indonesia, Kenya, Nigeria, India, China Taiwan, South Korea, China Mainland, United States of America, France, Canada, Belgium, England & Wales, Ireland, Scotland, and Russia) and asked whether the 4 per mille initiative is feasible. This study highlights region-specific efforts and scopes for soil carbon sequestration. Reported soil C sequestration rates generally show that, under best management practices, 4 per mille or even higher sequestration rates can be accomplished. High C sequestration rates (up to 10 per mille) can be achieved for soils with a low initial SOC stock (topsoil less than 30 t C ha-1) and in the first twenty years after implementation of best management practices. In addition, areas that have reached equilibrium but are not at their saturation level will not be able to further increase their sequestration. We found that most studies on SOC sequestration globally only consider topsoil (up to 0.3 m depth), as it is considered to be most affected by management techniques. The 4 per mille initiative was based on a blanket calculation of the whole global soil profile C stock; however, the potential to increase SOC is mostly on managed agricultural lands. If we consider 4 per mille on the global topsoil of agricultural land, SOC sequestration is about 3.6 Gt C per year, which would effectively offset 40% of global anthropogenic greenhouse gas emissions. As a strategy for climate change mitigation, soil carbon sequestration buys time over the next ten to twenty years while other effective sequestration and low-carbon technologies become viable. The challenge for cropping farmers is to find disruptive technologies that will further improve soil condition and deliver increased soil carbon. Progress in 4 per mille requires collaboration and communication between scientists, farmers, policy makers, and marketeers.
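The 3.6 Gt C figure follows arithmetically from the stated rate, implying an agricultural topsoil stock of roughly 900 Gt C; the quick check below is a sketch, and the ~33 Gt CO2/yr emission figure is an assumption of this sketch (consistent with fossil CO2 emissions), not a number from the abstract.

```latex
% Implied stock and the 40% offset, as a back-of-envelope check:
\[
  0.004 \times 900\ \mathrm{Gt\,C} \;=\; 3.6\ \mathrm{Gt\,C\,yr^{-1}},
\]
\[
  3.6\ \mathrm{Gt\,C} \times \tfrac{44}{12} \;\approx\; 13.2\ \mathrm{Gt\,CO_2}
  \;\approx\; 0.4 \times 33\ \mathrm{Gt\,CO_2\,yr^{-1}}.
\]
```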
Biodegradation of organic chemicals in soil/water microcosms system: Model development
Liu, L.; Tindall, J.A.; Friedel, M.J.; Zhang, W.
2007-01-01
The chemical interactions of hydrophobic organic contaminants with soils and sediments may result in strong binding and slow subsequent release rates that significantly affect remediation rates and endpoints. In order to account for the recalcitrance of chemicals to degradation at such sites, a sorption mechanism of intraparticle sequestration was postulated to operate at chemical remediation sites. Pseudo-first-order sequestration kinetics is used in the study, with the hypothesis that sequestration is an irreversible, surface-mediated process. A mathematical model based on mass balance equations was developed to describe the fate of chemical degradation in soil/water microcosm systems. In the model, diffusion was represented by Fick's second law, local sorption-desorption by a linear isotherm, irreversible sequestration by pseudo-first-order kinetics, and biodegradation by Monod kinetics. Solutions were obtained to provide estimates of chemical concentrations. The mathematical model was applied to a benzene biodegradation batch test, and simulated model responses correlated well with measurements of biodegradation of benzene in the batch soil/water microcosm system. A sensitivity analysis was performed to assess the effects of several parameters on model behavior. The overall chemical removal rate decreased and sequestration increased quickly with an increase in the sorption partition coefficient. When the soil particle radius, a, was greater than 1 mm, an increase in radius produced a significant decrease in the overall chemical removal rate as well as an increase in sequestration. However, when the soil particle radius was less than 0.1 mm, an increase in radius resulted in small changes in the removal rate and sequestration. As the pseudo-first-order sequestration rate increased, both the chemical removal rate and sequestration increased slightly. Model simulation results showed that desorption resistance played an important role in the bioavailability of organic chemicals in porous media. Complete biostabilization of chemicals on remediation sites can be achieved when the concentration of the reversibly sorbed chemical reduces to zero (i.e., undetectable), with a certain amount of irreversibly sequestered chemical left inside the soil particle solid phase. © 2006 Springer Science + Business Media B.V.
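A simplified, well-mixed version of that batch mass balance can be integrated directly: local-equilibrium sorption folds into a retardation factor, degradation follows Monod kinetics, and sequestration is pseudo-first order. The intraparticle diffusion term is omitted here and all parameters are illustrative, so this is a sketch of the model structure, not the paper's solution.

```python
# Simplified well-mixed batch version of the described mass balance: Monod
# biodegradation in solution plus pseudo-first-order irreversible
# sequestration of the sorbed fraction (diffusion omitted for brevity).
import numpy as np
from scipy.integrate import solve_ivp

Kd    = 2.0    # L/kg, linear sorption partition coefficient
rho   = 0.25   # kg solids per L water (soil/water ratio)
k_seq = 0.01   # 1/d, pseudo-first-order sequestration of sorbed chemical
mu    = 0.5    # 1/d, Monod maximum specific growth rate
Ks    = 5.0    # mg/L, half-saturation constant
X     = 10.0   # mg/L, biomass (held constant in this sketch)
Y     = 0.4    # yield, mg biomass per mg substrate

def rhs(t, y):
    C, Sq = y                           # aqueous conc., sequestered mass/L
    S = Kd * C                          # sorbed conc. at local equilibrium
    biodeg = mu * X / Y * C / (Ks + C)  # Monod substrate utilization
    seq = k_seq * rho * S               # irreversible intraparticle trapping
    R = 1.0 + rho * Kd                  # retardation from equilibrium sorption
    return [(-biodeg - seq) / R, seq]

sol = solve_ivp(rhs, (0, 60), [20.0, 0.0])
C_end, Sq_end = sol.y[:, -1]
print(f"after 60 d: {C_end:.2f} mg/L aqueous, {Sq_end:.2f} mg/L sequestered")
```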
Vertically-integrated Approaches for Carbon Sequestration Modeling
NASA Astrophysics Data System (ADS)
Bandilla, K.; Celia, M. A.; Guo, B.
2015-12-01
Carbon capture and sequestration (CCS) is being considered as an approach to mitigate anthropogenic CO2 emissions from large stationary sources such as coal fired power plants and natural gas processing plants. Computer modeling is an essential tool for site design and operational planning as it allows prediction of the pressure response as well as the migration of both CO2 and brine in the subsurface. Many processes, such as buoyancy, hysteresis, geomechanics and geochemistry, can have important impacts on the system. While all of the processes can be taken into account simultaneously, the resulting models are computationally very expensive and require large numbers of parameters which are often uncertain or unknown. In many cases of practical interest, the computational and data requirements can be reduced by choosing a smaller domain and/or by neglecting or simplifying certain processes. This leads to a series of models with different complexity, ranging from coupled multi-physics, multi-phase three-dimensional models to semi-analytical single-phase models. Under certain conditions the three-dimensional equations can be integrated in the vertical direction, leading to a suite of two-dimensional multi-phase models, termed vertically-integrated models. These models are either solved numerically or simplified further (e.g., assumption of vertical equilibrium) to allow analytical or semi-analytical solutions. This presentation focuses on how different vertically-integrated models have been applied to the simulation of CO2 and brine migration during CCS projects. Several example sites, such as the Illinois Basin and the Wabamun Lake region of the Alberta Basin, are discussed to show how vertically-integrated models can be used to gain understanding of CCS operations.
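The vertical integration step itself can be stated compactly. The following is a sketch of the standard form used in this class of models (after the vertically-integrated models of Nordbotten and Celia, with vertically averaged coefficients left implicit), not the specific equations of the cited studies.

```latex
% A 3-D quantity f is replaced by its integral over the formation thickness,
\[
  \tilde{f}(x,y,t) \;=\; \int_{\zeta_B}^{\zeta_T} f(x,y,z,t)\,\mathrm{d}z ,
\]
% giving a 2-D mass balance for the CO2 phase (saturation S_c, porosity phi,
% density rho_c, vertically integrated flux F_c, well source Q_c):
\[
  \frac{\partial}{\partial t}\!\left(\phi\,\rho_c\,S_c\,H\right)
  \;+\; \nabla_{\parallel}\cdot\mathbf{F}_c \;=\; Q_c ,
\]
% where H = zeta_T - zeta_B is the formation thickness and nabla_parallel acts
% only in the horizontal plane; a vertical-equilibrium assumption closes F_c.
```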
Wilkins, Michael J.; Hoyt, David W.; Marshall, Matthew J.; Alderson, Paul A.; Plymale, Andrew E.; Markillie, L. Meng; Tucker, Abby E.; Walter, Eric D.; Linggi, Bryan E.; Dohnalkova, Alice C.; Taylor, Ron C.
2014-01-01
Geologic carbon dioxide (CO2) sequestration drives physical and geochemical changes in deep subsurface environments that impact indigenous microbial activities. The combined effects of pressurized CO2 on a model sulfate-reducing microorganism, Desulfovibrio vulgaris, have been assessed using a suite of genomic and kinetic measurements. Novel high-pressure NMR time-series measurements using 13C-lactate were used to track D. vulgaris metabolism. We identified cessation of respiration at CO2 pressures of 10 bar, 25 bar, 50 bar, and 80 bar. Concurrent experiments using N2 as the pressurizing phase had no negative effect on microbial respiration, as inferred from reduction of sulfate to sulfide. Complementary pressurized batch incubations and fluorescence microscopy measurements supported NMR observations, and indicated that non-respiring cells were mostly viable at 50 bar CO2 for at least 4 h, and at 80 bar CO2 for 2 h. The fraction of dead cells increased rapidly after 4 h at 80 bar CO2. Transcriptomic (RNA-Seq) measurements on mRNA transcripts from CO2-incubated biomass indicated that cells up-regulated the production of certain amino acids (leucine, isoleucine) following CO2 exposure at elevated pressures, likely as part of a general stress response. Evidence for other poorly understood stress responses were also identified within RNA-Seq data, suggesting that while pressurized CO2 severely limits the growth and respiration of D. vulgaris cells, biomass retains intact cell membranes at pressures up to 80 bar CO2. Together, these data show that geologic sequestration of CO2 may have significant impacts on rates of sulfate reduction in many deep subsurface environments where this metabolism is a key respiratory process. PMID:25309528
Global carbon export from the terrestrial biosphere controlled by erosion.
Galy, Valier; Peucker-Ehrenbrink, Bernhard; Eglinton, Timothy
2015-05-14
Riverine export of particulate organic carbon (POC) to the ocean affects the atmospheric carbon inventory over a broad range of timescales. On geological timescales, the balance between sequestration of POC from the terrestrial biosphere and oxidation of rock-derived (petrogenic) organic carbon sets the magnitude of the atmospheric carbon and oxygen reservoirs. Over shorter timescales, variations in the rate of exchange between carbon reservoirs, such as soils and marine sediments, also modulate atmospheric carbon dioxide levels. The respective fluxes of biospheric and petrogenic organic carbon are poorly constrained, however, and mechanisms controlling POC export have remained elusive, limiting our ability to predict quantitatively how POC fluxes respond to climatic or tectonic changes. Here we estimate biospheric and petrogenic POC fluxes for a suite of river systems representative of the natural variability in catchment properties. We show that export yields of both biospheric and petrogenic POC are positively related to the yield of suspended sediment, revealing that POC export is mostly controlled by physical erosion. Using a global compilation of gauged suspended sediment flux, we derive separate estimates of global biospheric and petrogenic POC fluxes of 157 (+74/-50) and 43 (+61/-25) megatonnes of carbon per year, respectively. We find that biospheric POC export is primarily controlled by the capacity of rivers to mobilize and transport POC, and is largely insensitive to the magnitude of terrestrial primary production. Globally, physical erosion rates affect the rate of biospheric POC burial in marine sediments more strongly than carbon sequestration through silicate weathering. We conclude that burial of biospheric POC in marine sediments becomes the dominant long-term atmospheric carbon dioxide sink under enhanced physical erosion.
Vegetation carbon sequestration in Chinese forests from 2010 to 2050.
He, Nianpeng; Wen, Ding; Zhu, Jianxing; Tang, Xuli; Xu, Li; Zhang, Li; Hu, Huifeng; Huang, Mei; Yu, Guirui
2017-04-01
Forests store a large part of the terrestrial vegetation carbon (C) and have high C sequestration potential. Here, we developed a new forest C sequestration (FCS) model based on secondary succession theory to estimate the vegetation C sequestration capacity of China's forests. The model used the field measurement data of 3161 forest plots and three future climate scenarios. The results showed that logistic equations provided a good fit for vegetation biomass with forest age in natural and planted forests. The FCS model has been verified with forest biomass data, and model uncertainty is discussed. The increment of vegetation C storage in China's forests from 2010 to 2050 was estimated as 13.92 Pg C, while the average vegetation C sequestration rate was 0.34 Pg C yr^-1 with a 95% confidence interval of 0.28-0.42 Pg C yr^-1, which differed significantly between forest types. The largest contributor to the increment was deciduous broadleaf forest (37.8%), while the smallest was deciduous needleleaf forest (2.7%). The vegetation C sequestration rate might reach its maximum around 2020, although vegetation C storage increases continually. It is estimated that vegetation C sequestration might offset 6-8% of China's future emissions. Furthermore, there was a significant negative relationship between vegetation C sequestration rate and C emission rate in different provinces of China, suggesting that developed provinces might need to compensate for undeveloped provinces through C trade. Our findings will provide valuable guidelines to policymakers for designing afforestation strategies and forest C trade in China. © 2016 John Wiley & Sons Ltd.
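As a sketch of the curve-fitting step the abstract describes (logistic biomass-age relations), the following is a generic SciPy fit on synthetic data; it is not the authors' FCS code, and all numbers are placeholders:

```python
# Illustrative logistic fit of biomass against forest age (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def logistic(age, K, r, t0):
    """Logistic growth: K = asymptotic biomass, r = rate, t0 = inflection age."""
    return K / (1.0 + np.exp(-r * (age - t0)))

age = np.array([5, 10, 20, 40, 60, 80, 100], dtype=float)        # years
biomass = np.array([8, 20, 55, 120, 160, 178, 184], dtype=float)  # Mg/ha (synthetic)

params, _ = curve_fit(logistic, age, biomass, p0=[200.0, 0.1, 30.0])
K, r, t0 = params
print(f"K={K:.0f} Mg/ha, r={r:.3f}/yr, inflection at {t0:.0f} yr")
```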
The role of composition, invasives, and maintenance emissions on urban forest carbon stocks.
Horn, Josh; Escobedo, Francisco J; Hinkle, Ross; Hostetler, Mark; Timilsina, Nilesh
2015-02-01
There are few field-based, empirical studies quantifying the effect of invasive trees and palms and maintenance-related carbon emissions on changes in urban forest carbon stocks. We estimated carbon (C) stock changes and tree maintenance-related C emissions in a subtropical urban forest by re-measuring a subsample of residential permanent plots during 2009 and 2011, using regional allometric biomass equations, and surveying residential homeowners near Orlando, FL, USA. The effect of native, non-native, invasive tree species and palms on C stocks and sequestration was also quantified. Findings show 17.8 tC/ha in stocks and 1.2 tC/ha/year of net sequestration. The most important species, both by frequency and by C stocks and sequestration, were Quercus laurifolia Michx. and Quercus virginiana Mill., accounting for 20% of all the trees measured, 60% of carbon stocks, and over 75% of net C sequestration. Palms contributed to less than 1% of the total C stocks. Natives comprised two-thirds of the tree population and sequestered 90% of all C, while invasive trees and palms accounted for 5% of net C sequestration. Overall, invasive and exotic trees had a limited contribution to total C stocks and sequestration. Annual tree-related maintenance C emissions were 0.1% of total gross C sequestration. Plot-level tree, palm, and litter cover were correlated to C stocks and net sequestration. Findings can be used to complement existing urban forest C offset accounting and monitoring protocols and to better understand the role of invasive woody plants on urban ecosystem service provision.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-14
... Lake Charles Carbon Capture and Sequestration Project (DOE/EIS-0464D) AGENCY: U.S. Department of Energy...) announces the availability of the Lake Charles Carbon Capture and Sequestration Project Draft... potential environmental impacts associated with the Lake Charles Carbon Capture and Sequestration Project...
Geologic carbon sequestration has the potential to cause long-term reductions in global emissions of carbon dioxide to the atmosphere. Safe and effective application of carbon sequestration technology requires an understanding of the potential risks to the quality of underground...
Hurricane impacts on US forest carbon sequestration
Steven G. McNulty
2002-01-01
Recent focus has been given to US forests as a sink for increases in atmospheric carbon dioxide. Current estimates of US forest carbon sequestration average approximately 20 Tg (i.e., 10^12 g) per year. However, predictions of forest carbon sequestration often do not include the influence of hurricanes on forest carbon storage. Intense hurricanes...
Yield and soil carbon sequestration in grazed pastures sown with two or five forage species
USDA-ARS?s Scientific Manuscript database
Increasing plant species richness is often associated with an increase in productivity and associated ecosystem services such as soil C sequestration. In this paper we report on a nine-year experiment to evaluate the relative forage production and C sequestration potential of grazed pastures sown to...
Integrated Mid-Continent Carbon Capture, Sequestration & Enhanced Oil Recovery Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian McPherson
2010-08-31
A consortium of research partners led by the Southwest Regional Partnership on Carbon Sequestration and industry partners, including CAP CO2 LLC, Blue Source LLC, Coffeyville Resources, Nitrogen Fertilizers LLC, Ash Grove Cement Company, Kansas Ethanol LLC, Headwaters Clean Carbon Services, Black & Veatch, and Schlumberger Carbon Services, conducted a feasibility study of a large-scale CCS commercialization project that included large-scale CO2 sources. The overall objective of this project, entitled the 'Integrated Mid-Continent Carbon Capture, Sequestration and Enhanced Oil Recovery Project', was to design an integrated system of US mid-continent industrial CO2 sources with CO2 capture, and geologic sequestration in deep saline formations and in oil field reservoirs with concomitant EOR. Findings of this project suggest that deep saline sequestration in the mid-continent region is not feasible without major financial incentives, such as tax credits or otherwise, that do not exist at this time. However, results of the analysis suggest that enhanced oil recovery with carbon sequestration is indeed feasible and practical for specific types of geologic settings in the Midwestern U.S.
Bringing the Unidata IDV to the Cloud
NASA Astrophysics Data System (ADS)
Fisher, W. I.; Oxelson Ganter, J.
2015-12-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who depend upon it, and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of cloud computing: Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little to no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness Application Streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.
Ocean sequestration of crop residue carbon: recycling fossil fuel carbon back to deep sediments.
Strand, Stuart E; Benford, Gregory
2009-02-15
For significant impact, any method to remove CO2 from the atmosphere must process large amounts of carbon efficiently, be repeatable, sequester carbon for thousands of years, be practical and economical, and be implementable soon. The only method that meets these criteria is removal of crop residues and burial in the deep ocean. We show here that this method is 92% efficient in sequestration of crop residue carbon, while cellulosic ethanol production is only 32% and soil sequestration is about 14% efficient. Deep ocean sequestration can potentially capture 15% of the current global CO2 annual increase, returning that carbon back to deep sediments, confining the carbon for millennia, while using existing capital infrastructure and technology. Because of these clear advantages, we recommend enhanced research into permanent sequestration of crop residues in the deep ocean.
Recovery Act: Web-based CO{sub 2} Subsurface Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paolini, Christopher; Castillo, Jose
2012-11-30
The Web-based CO2 Subsurface Modeling project focused primarily on extending an existing text-only, command-line driven, isothermal and isobaric, geochemical reaction-transport simulation code, developed and donated by Sienna Geodynamics, into an easier-to-use Web-based application for simulating long-term storage of CO2 in geologic reservoirs. The Web-based interface developed through this project, publicly accessible via URL http://symc.sdsu.edu/, enables rapid prototyping of CO2 injection scenarios and allows students without advanced knowledge of geochemistry to set up a typical sequestration scenario, invoke a simulation, analyze results, and then vary one or more problem parameters and quickly re-run a simulation to answer what-if questions. symc.sdsu.edu has two 12-core AMD Opteron™ 6174 2.20 GHz processors and 16 GB RAM. The Web-based application was used to develop a new computational science course at San Diego State University, COMP 670: Numerical Simulation of CO2 Sequestration, which was taught during the fall semester of 2012. The purpose of the class was to introduce graduate students to Carbon Capture, Use and Storage (CCUS) through numerical modeling and simulation, and to teach students how to interpret simulation results to make predictions about long-term CO2 storage capacity in deep brine reservoirs. In addition to the training and education component of the project, significant software development efforts took place. Two computational science doctoral students and one geological science master's student, under the direction of the PIs, extended the original code developed by Sienna Geodynamics, named Sym.8. New capabilities were added to Sym.8 to simulate non-isothermal and non-isobaric flows of charged aqueous solutes in porous media, in addition to incorporating HPC support into the code for execution on many-core XSEDE clusters. A successful outcome of this project was the funding and training of three new computational science students and one geological science student in technologies relevant to carbon sequestration and problems involving flow in subsurface media. The three computational science students are currently finishing their doctoral studies on different aspects of modeling CO2 sequestration, while the geological science student completed his master's thesis on modeling the thermal response of CO2 injection in brine and, as a direct result of participation in this project, is now employed at ExxonMobil as a full-time staff geologist.
Nanobiotechnology for enzymatic remediation and soil carbon sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jungbae; Amonette, James E.; Russell, Colleen K.
2005-03-13
We studied the ability of tyrosinase to catalyze the oxidation of various phenolic compounds. As a revolutionary approach to enzyme stabilization, we developed specially-designed nanoporous silica for enzyme immobilization. Our tests show that the active lifetime of the enzymes stabilized in this material can extend to periods as long as several months, which is about a 100-fold increase in stability. The implications of this new approach to enzyme-based bioremediation will be discussed. In soils, the humification process involves phenol oxidation, mediated by tyrosinase, followed by nonenzymatic polymerization of the resulting quinones with amino acids to form humic polymers. We tested the effects of fly ash amendments on a model humification reaction involving tyrosinase and a suite of organic monomers. The combination of fly ashes with tyrosinase increased the amount of polymer formed by several fold. The strong synergistic effect of these ashes when enzyme is present apparently arises from the combined effects of alkaline pH and physical stabilization of the enzyme in porous silica cenospheres.
Hua, Keke; Wang, Daozhong; Guo, Xisheng; Guo, Zibin
2014-01-01
Soil organic carbon (SOC) sequestration is important for improving soil fertility of cropland and for the mitigation of greenhouse gas emissions to the atmosphere. The efficiency of SOC sequestration depends on the quantity and quality of the organic matter, soil type, and climate. Little is known about the SOC sequestration efficiency of organic amendments in Vertisols. Thus, we conducted this research based on a 29-year (1982-2011) long-term fertilization experiment with a no-fertilizer control and five fertilization regimes: CK (control, no fertilizer), NPK (mineral NPK fertilizers alone), NPK+1/2W (mineral NPK fertilizers combined with half the amount of wheat straw), NPK+W (mineral NPK fertilizers combined with the full amount of wheat straw), NPK+PM (mineral NPK fertilizers combined with pig manure) and NPK+CM (mineral NPK fertilizers combined with cattle manure). Total mean annual C inputs were 0.45, 1.55, 2.66, 3.71, 4.68 and 6.56 ton/ha/yr for CK, NPK, NPK+1/2W, NPK+W, NPK+PM and NPK+CM, respectively. Mean SOC sequestration rate was 0.20 ton/ha/yr in the NPK treatment, and 0.39, 0.50, 0.51 and 0.97 ton/ha/yr in the NPK+1/2W, NPK+W, NPK+PM, and NPK+CM treatments, respectively. A linear relationship was observed between annual C input and SOC sequestration rate (SOC sequestration rate = 0.16 × C input − 0.10; R = 0.95, P < 0.01), suggesting a C sequestration efficiency of 16%. The Vertisol required an annual C input of 0.63 ton/ha/yr to maintain the initial SOC level. Moreover, the C sequestration efficiencies of wheat straw, pig manure and cattle manure were 17%, 11% and 17%, respectively. The results indicate that the Vertisol has a large potential to sequester SOC with high efficiency, and applying cattle manure or wheat straw is a recommendable SOC sequestration practice in Vertisols.
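The reported regression can be checked directly. The short sketch below uses only numbers quoted in the abstract (the fitted slope and intercept and the treatments' mean annual C inputs) and reproduces the stated ~0.63 ton/ha/yr break-even input:

```python
# Worked check of the abstract's linear relationship:
# SOC sequestration rate = 0.16 * C_input - 0.10   (ton C/ha/yr)
a, b = 0.16, -0.10

def soc_rate(c_input):
    """SOC sequestration rate (ton C/ha/yr) for a given annual C input."""
    return a * c_input + b

# Break-even input needed just to maintain the initial SOC level (rate = 0):
print(f"break-even C input: {-b / a:.2f} ton/ha/yr")  # ~0.63, as reported

# Predicted rates at the six treatments' mean annual C inputs:
for name, c in [("CK", 0.45), ("NPK", 1.55), ("NPK+1/2W", 2.66),
                ("NPK+W", 3.71), ("NPK+PM", 4.68), ("NPK+CM", 6.56)]:
    print(f"{name:9s} {soc_rate(c):5.2f} ton C/ha/yr")
```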
Characterization of Most Promising Sequestration Formations in the Rocky Mountain Region (RMCCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McPherson, Brian; Matthews, Vince
2013-09-30
The primary objective of the “Characterization of Most Promising Carbon Capture and Sequestration Formations in the Central Rocky Mountain Region” project, or RMCCS project, is to characterize the storage potential of the most promising geologic sequestration formations within the southwestern U.S. and the Central Rocky Mountain region in particular. The approach included an analysis of geologic sequestration formations under the Craig Power Station in northwestern Colorado, and application or extrapolation of those local-scale results to the broader region. A ten-step protocol for geologic carbon storage site characterization was a primary outcome of this project.
ERIC Educational Resources Information Center
Ezekiel, Aaron B.
At the University of New Mexico, stakeholders from the Computer and Information Resources and Technology (CIRT) department, Financial Systems, the Health Sciences Center, and the General Libraries, were involved in deciding on the goals of a project to replace Telnet with a suite of network middleware and productivity software on campus computer…
2009-01-01
program requirements, and administering local and federal funding. Emergency services—organizations that provide for public safety by the... [Fragment of a flattened table follows: hazard scenarios (nerve agent, chlorine tank explosion, major earthquake, major hurricane, radiological dispersal device, improvised explosive device, food contamination) and software used by responding jurisdictions (locally developed software in several cities, counties, and states; Lotus Notes Suite in one NGO; MABAS.ORG in one county).]
Jet and Tropopause Products for Analysis and Characterization (JETPAC)
NASA Technical Reports Server (NTRS)
Manney, Gloria L.; Daffer, William H.
2012-01-01
This suite of IDL programs provides identification and comprehensive characterization of the dynamical features of the jet streams in the upper troposphere, the lower stratospheric polar night jet, and the tropopause. The output of this software not only provides comprehensive information on the jets and tropopause, but also gives this information in a form that facilitates studies of observations in relation to the jets and tropopauses.
Artificial Intelligence: The Bumpy Path Through Defense Acquisition
2017-12-01
products through Amazon's suite of services, or can be trained using the Alexa application to interact with and control other smart products in your house...software, and capitalizing on the opportunities for customization and consultation. NVIDIA's approach to AI hardware offers opportunities for garage...have teamed up to provide licensing, training, and development services for a product called Unreal Engine 4, aimed at government and military
ERIC Educational Resources Information Center
Weber, Jonathan
2006-01-01
Creating a digital library might seem like a task best left to a large research collection with a vast staff and generous budget. However, tools for successfully creating digital libraries are getting easier to use all the time. The explosion of people creating content for the web has led to the availability of many high-quality applications and…
Distributed Power Systems for Sustainable Energy
2012-10-01
capital investment in state-of-the-art cogeneration technologies, renewable sources, energy storage, and interconnection hardware and software. It is... capacity may not be well suited to support building- or campus-scale microgrids. This is because new thermal and electrical energy storage devices...constraints, as well as the site location, weather, and consumption patterns. These factors change over the life of the energy microgrid. • Tradeoffs...
An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua
2011-07-09
Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grids software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.
CosmoQuest Transient Tracker: Opensource Photometry & Astrometry software
NASA Astrophysics Data System (ADS)
Myers, Joseph L.; Lehan, Cory; Gay, Pamela; Richardson, Matthew; CosmoQuest Team
2018-01-01
CosmoQuest is moving from online citizen science to observational astronomy with the creation of Transient Tracker. This open-source software is designed to identify asteroids and other transient/variable objects in image sets. Transient Tracker's features in final form will include: astrometric and photometric solutions, identification of moving/transient objects, identification of variable objects, and lightcurve analysis. In this poster we present our initial v0.1 release and seek community input. This software builds on the existing NIH-funded ImageJ libraries. Creation of this suite of open-source image manipulation routines is led by Wayne Rasband and is released primarily under the MIT license. In this release, we are building on these libraries to add source identification for point / point-like sources, and to do astrometry. Our materials are released under the Apache 2.0 license on github (http://github.com/CosmoQuestTeam) and documentation can be found at http://cosmoquest.org/TransientTracker.
A Robust and Scalable Software Library for Parallel Adaptive Refinement on Unstructured Meshes
NASA Technical Reports Server (NTRS)
Lou, John Z.; Norton, Charles D.; Cwik, Thomas A.
1999-01-01
The design and implementation of Pyramid, a software library for performing parallel adaptive mesh refinement (PAMR) on unstructured meshes, is described. This software library can be easily used in a variety of unstructured parallel computational applications, including parallel finite element, parallel finite volume, and parallel visualization applications using triangular or tetrahedral meshes. The library contains a suite of well-designed and efficiently implemented modules that perform operations in a typical PAMR process. Among these are mesh quality control during successive parallel adaptive refinement (typically guided by a local-error estimator), parallel load-balancing, and parallel mesh partitioning using the ParMeTiS partitioner. The Pyramid library is implemented in Fortran 90 with an interface to the Message-Passing Interface (MPI) library, supporting code efficiency, modularity, and portability. An EM waveguide filter application, adaptively refined using the Pyramid library, is illustrated.
Hazan, Lynn; Zugaro, Michaël; Buzsáki, György
2006-09-15
Recent technological advances now allow for simultaneous recording of large populations of anatomically distributed neurons in behaving animals. The free software package described here was designed to help neurophysiologists process and view recorded data in an efficient and user-friendly manner. This package consists of several well-integrated applications, including NeuroScope (http://neuroscope.sourceforge.net), an advanced viewer for electrophysiological and behavioral data with limited editing capabilities; Klusters (http://klusters.sourceforge.net), a graphical cluster cutting application for manual and semi-automatic spike sorting; and NDManager, an experimental parameter and data processing manager. All of these programs are distributed under the GNU General Public License (GPL, see http://www.gnu.org/licenses/gpl.html), which gives its users legal permission to copy, distribute and/or modify the software. Also included are extensive user manuals and sample data, as well as source code and documentation.
Customizing graphical user interface technology for spacecraft control centers
NASA Technical Reports Server (NTRS)
Beach, Edward; Giancola, Peter; Gibson, Steven; Mahmot, Ronald
1993-01-01
The Transportable Payload Operations Control Center (TPOCC) project is applying the latest in graphical user interface technology to the spacecraft control center environment. This project of the Mission Operations Division's (MOD) Control Center Systems Branch (CCSB) at NASA Goddard Space Flight Center (GSFC) has developed an architecture for control centers which makes use of a distributed processing approach and the latest in Unix workstation technology. The TPOCC project is committed to following industry standards and using commercial off-the-shelf (COTS) hardware and software components wherever possible to reduce development costs and to improve operational support. TPOCC's most successful use of commercial software products and standards has been in the development of its graphical user interface. This paper describes TPOCC's successful use and customization of four separate layers of commercial software products to create a flexible and powerful user interface that is uniquely suited to spacecraft monitoring and control.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Omar R.; Kuo, Li-Jung; Zimmerman, Andrew R.
2012-01-10
The ability of engineered black carbons (or biochars) to resist abiotic and/or biotic degradation (herein referred to as recalcitrance) is crucial to their successful deployment as a soil carbon sequestration strategy. A new recalcitrance index, the R50, for assessing biochar quality for carbon sequestration is proposed. The R50 is based on the thermal stability of a given biochar relative to that of graphite, and was developed and evaluated with a variety of biochars (n = 59) and soot-like black carbons. Comparison of R50 with biochar physicochemical properties and biochar-C mineralization revealed the existence of a quantifiable relationship between R50 and biochar recalcitrance. As presented here, the R50 is immediately applicable to pre-land-application screening of biochars into Class A (R50 ≥ 0.70), Class B (0.50 ≤ R50 < 0.70) or Class C (R50 < 0.50) recalcitrance/carbon sequestration classes. Class A and Class C biochars would have carbon sequestration potential comparable to soot/graphite and uncharred plant biomass, respectively, while Class B biochars would have intermediate carbon sequestration potential. We believe that the coupling of the R50 to an index-based degradation model and an economic model could provide a suitable framework in which to comprehensively assess soil carbon sequestration in biochars.
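The screening rule itself reduces to simple thresholding. A hedged sketch of the proposed classes follows, using only the cutoffs quoted above; computing R50 itself requires thermogravimetric data not reproduced here:

```python
# Map an R50 index to the abstract's Class A/B/C screening classes.
# Thresholds are taken from the abstract; the R50 computation (thermal
# stability relative to graphite) is not reproduced here.
def recalcitrance_class(r50: float) -> str:
    if r50 >= 0.70:
        return "Class A (soot/graphite-like recalcitrance)"
    elif r50 >= 0.50:
        return "Class B (intermediate C sequestration potential)"
    else:
        return "Class C (comparable to uncharred biomass)"

for r in (0.82, 0.61, 0.43):
    print(r, "->", recalcitrance_class(r))
```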
Implication of soil C sequestration on sustainable agriculture and environment.
Mondini, C; Sequi, P
2008-01-01
Soil organic matter (SOM) is the largest C stock of the continental biosphere, with 1550 Pg. The size of the C reservoir in the soil and environmental concerns about climate change have recently attracted the attention of scientists and politicians to C sequestration as an effective strategy to tackle greenhouse gas (GHG) emissions. It has been estimated that the potential for C storage in world cropland is relevant (about 0.6-1.2 Pg C yr^-1). However, there are several constraints on C sequestration that raise concern about its effectiveness as a strategy to offset climate change. C sequestration is finite in quantity and time, reversible, and can be further decreased by socio-economic restrictions. Given these limitations, C sequestration can play only a minor role in the reduction of emissions (2-5% of total GHG emissions under the highest emission scenarios). Yet, C sequestration is still attractive for two main reasons: it is likely to be particularly effective in reducing atmospheric CO2 levels in the first 20-30 yr of its implementation, and it presents ancillary benefits for environment and sustainability that make it a real win-win strategy. These beneficial implications are discussed in this paper, with emphasis on the need for C sequestration not only to offset climatic changes, but also for the equilibria of the environment and for the sustainability of agriculture and of human society as a whole.
Okyay, Tugba Onal; Rodrigues, Debora F
2015-03-01
In this study, CO2 sequestration was investigated through the microbially-induced calcium carbonate precipitation (MICP) process with isolates obtained from a cave called 'Cave Without A Name' (Boerne, TX, USA) and the Pamukkale travertines (Denizli, Turkey). The majority of the bacterial isolates obtained from these habitats belonged to the genera Sporosarcina, Brevundimonas, Sphingobacterium and Acinetobacter. The isolates were investigated for their capability to precipitate calcium carbonate and sequester CO2. Biotic and abiotic effects on CO2 sequestration during MICP were also investigated. Among biotic effects, we observed that the rate and concentration of CO2 sequestered depended on the species or strain. The main abiotic factors affecting CO2 sequestration during MICP were the pH and the medium components. An increase in pH led to enhanced CO2 sequestration by the growth medium. The growth medium components, on the other hand, were shown to affect both urease activity and CO2 sequestration. Through a Plackett-Burman experimental design, the most important growth medium component involved in CO2 sequestration was determined to be urea. The medium composition optimized by the Plackett-Burman design for each isolate led to a statistically significant increase, of up to 148.9%, in CO2 uptake through calcification mechanisms. © FEMS 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
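As an illustration of the Plackett-Burman screening approach mentioned above, the following generates the textbook 8-run, 7-factor two-level design and estimates main effects; this is the standard cyclic construction with placeholder response data, not the authors' analysis, and the factor count is only an example:

```python
# Textbook 8-run Plackett-Burman design for up to 7 two-level factors,
# built by cyclically shifting the standard N=8 generator row and
# appending an all-minus run. Response data below are placeholders.
import numpy as np

gen = np.array([1, 1, 1, -1, 1, -1, -1])           # standard N=8 generator
rows = [np.roll(gen, k) for k in range(7)]          # cyclic shifts
design = np.vstack(rows + [-np.ones(7, int)])       # final all-minus run
print(design)                                       # 8 runs x 7 factors

# Main effect of factor j = mean response at +1 minus mean response at -1;
# each column has four runs at each level, hence the division by 4.
response = np.random.default_rng(1).normal(size=8)  # placeholder CO2-uptake data
effects = design.T @ response / 4
print(effects)
```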
Southern Salish Sea Habitat Map Series: Admiralty Inlet
Cochrane, Guy R.; Dethier, Megan N.; Hodson, Timothy O.; Kull, Kristine K.; Golden, Nadine E.; Ritchie, Andrew C.; Moegling, Crescent; Pacunski, Robert E.; Cochrane, Guy R.
2015-01-01
This publication includes four map sheets, explanatory text, and a descriptive pamphlet. Each map sheet is published as a portable document format (PDF) file. ESRI ArcGIS-compatible GeoTIFFs (for example, bathymetry) and shapefiles (for example, video observation points) will be available for download in the data catalog associated with this publication (Cochrane, 2015). An ArcGIS Project File with the symbology used to generate the map sheets is also provided. For those who do not own the full suite of ESRI GIS and mapping software, the data can be read using ESRI ArcReader, a free viewer that is available at http://www.esri.com/software/arcgis/arcreader/index.html.
Cross-platform validation and analysis environment for particle physics
NASA Astrophysics Data System (ADS)
Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.
2017-11-01
A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
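As a sketch of one building block such analysis suites provide, the following defines a minimal Lorentz four-vector with an invariant-mass method; the class and its API are illustrative only, not the framework's actual interface:

```python
# Minimal Lorentz four-vector with addition and invariant mass
# (natural units, c = 1). Illustrative, not the framework's classes.
import math

class LorentzVector:
    def __init__(self, px, py, pz, e):
        self.px, self.py, self.pz, self.e = px, py, pz, e

    def __add__(self, o):
        return LorentzVector(self.px + o.px, self.py + o.py,
                             self.pz + o.pz, self.e + o.e)

    def mass(self):
        """Invariant mass: m^2 = E^2 - |p|^2."""
        m2 = self.e**2 - (self.px**2 + self.py**2 + self.pz**2)
        return math.sqrt(max(m2, 0.0))

# Dimuon example: two back-to-back 45.6 GeV muons reconstruct ~91 GeV (a Z).
mu1 = LorentzVector(45.6, 0, 0, 45.6)
mu2 = LorentzVector(-45.6, 0, 0, 45.6)
print((mu1 + mu2).mass())   # 91.2
```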
Big Science, Small-Budget Space Experiment Package Aka MISSE-5: A Hardware And Software Perspective
NASA Technical Reports Server (NTRS)
Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan
2007-01-01
Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.
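The abstract leaves the specific mitigation techniques unstated. One classic software-side mitigation for single-event upsets in COTS hardware, offered here purely as a generic illustration and not as the MISSE-5 design, is triple modular redundancy (TMR) with majority voting over redundant copies of critical state:

```python
# Generic TMR sketch: keep three copies of critical state and majority-vote
# on every read; a single upset copy is outvoted and repaired (scrubbed).
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote: each output bit is set iff >= 2 inputs agree."""
    return (a & b) | (a & c) | (b & c)

class TmrRegister:
    def __init__(self, value: int = 0):
        self._copies = [value, value, value]

    def write(self, value: int) -> None:
        self._copies = [value, value, value]

    def read(self) -> int:
        v = tmr_vote(*self._copies)
        self._copies = [v, v, v]   # scrub: repair any upset copy on read
        return v

reg = TmrRegister(0b1010)
reg._copies[1] ^= 0b0100           # simulate a radiation-induced bit flip
assert reg.read() == 0b1010        # the vote masks the upset
```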
HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis
David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion
2014-01-01
The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the ability to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. In addition, our programming framework gives developers the ability to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057
Minerva: User-Centered Science Operations Software Capability for Future Human Exploration
NASA Technical Reports Server (NTRS)
Deans, Matthew; Marquez, Jessica J.; Cohen, Tamar; Miller, Matthew J.; Deliz, Ivonne; Hillenius, Steven; Hoffman, Jeffrey; Lee, Yeon Jin; Lees, David; Norheim, Johannes;
2017-01-01
In June of 2016, the Biologic Analog Science Associated with Lava Terrains (BASALT) research project conducted its first field deployment, which we call BASALT-1. BASALT-1 consisted of a science-driven field campaign in a volcanic field in Idaho as a simulated human mission to Mars. Scientists and mission operators were provided a suite of ground software tools that we refer to collectively as Minerva to carry out their work. Minerva provides capabilities for traverse planning and route optimization, timeline generation and display, procedure management, execution monitoring, data archiving, visualization, and search. This paper describes the Minerva architecture, constituent components, use cases, and some preliminary findings from the BASALT-1 campaign.
NASA Technical Reports Server (NTRS)
Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan
2005-01-01
Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.
HDTS 2017.1 Testing and Verification Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiteside, T.
2017-12-01
This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents, including Foley and Powell (2010), Dixon (2012), and Whiteside (2017b). In this report we have created a suite of automated test cases and a system to analyze the results of those tests, and documented the methodology used to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect.
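For illustration, the generic shape of such an automated test case is sketched below in pytest style; the function under test and its reference values are hypothetical stand-ins, since HDTS internals are not given in this abstract:

```python
# Generic regression-test shape: pin a computation against reference values
# within a tolerance. The dose model here is a hypothetical stand-in.
import math
import pytest

def dose_at_distance(source_rate, distance_m):
    """Hypothetical inverse-square dose model standing in for an HDTS function."""
    return source_rate / (4 * math.pi * distance_m ** 2)

@pytest.mark.parametrize("distance,expected", [(1.0, 0.0795775), (2.0, 0.0198944)])
def test_dose_matches_reference(distance, expected):
    assert dose_at_distance(1.0, distance) == pytest.approx(expected, rel=1e-4)
```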
Implementation, reliability, and feasibility test of an Open-Source PACS.
Valeri, Gianluca; Zuccaccia, Matteo; Badaloni, Andrea; Ciriaci, Damiano; La Riccia, Luigi; Mazzoni, Giovanni; Maggi, Stefania; Giovagnoni, Andrea
2015-12-01
To implement a hardware and software system able to perform the major functions of an Open-Source PACS, and to analyze it in a simulated real-world environment. A small home network was implemented, and the Open-Source operating system Ubuntu 11.10 was installed on a laptop containing the Dcm4chee suite with the needed software devices. The Open-Source PACS implemented is compatible with Linux OS, Microsoft OS, and Mac OS X; furthermore, it was used with Android and iOS, the operating systems that support operation on portable devices (smartphones, tablets). An OSS PACS is useful for running tutorials and workshops on post-processing techniques for educational and training purposes.
Using benchmarks for radiation testing of microprocessors and FPGAs
Quinn, Heather; Robinson, William H.; Rech, Paolo; ...
2015-12-17
Performance benchmarks have been used over the years to compare different systems. These benchmarks can be useful for researchers trying to determine how changes to the technology, architecture, or compiler affect the system's performance. No such standard exists for systems deployed into high radiation environments, making it difficult to assess whether changes in the fabrication process, circuitry, architecture, or software affect reliability or radiation sensitivity. In this paper, we propose a benchmark suite for high-reliability systems that is designed for field-programmable gate arrays and microprocessors. We describe the development process and report neutron test data for the hardware and software benchmarks.
Erik Nelson; Stephen Polasky; David J. Lewis; Andrew J. Plantinga; Eric Lonsdorf; Denis White; David Bael; Joshua Lawler
2008-01-01
We develop an integrated model to predict private land-use decisions in response to policy incentives designed to increase the provision of carbon sequestration and species conservation across heterogeneous landscapes. Using data from the Willamette Basin, Oregon, we compare the provision of carbon sequestration and species conservation under five simple policies that...
Today's PTA Advocate: Speak Up to Stop Sequestration
ERIC Educational Resources Information Center
Chevalier, Jacque
2012-01-01
The word sequestration has been in the news lately when talking about the federal budget. Sequestration refers to across-the-board cuts, and depending on where one lives and the amount of federal aid one's community receives, those cuts could amount to as much as 17 percent. That spells bad news for schools unless parents, educators, and other…
Chris A. Maier; Kurt H. Johnsen
2010-01-01
Intensive pine plantation management may provide opportunities to increase carbon sequestration in the Southeastern United States. Developing management options that increase fiber production and soil carbon sequestration requires an understanding of the biological and edaphic processes that control soil carbon turnover. Belowground carbon resides primarily in three...
Soil carbon sequestration potential in semi-arid grasslands in the conservation reserve program
USDA-ARS?s Scientific Manuscript database
The Conservation Reserve Program (CRP) in the USA plays a major role in carbon (C) sequestration to help mitigate rising CO2 levels and climate change. The Southern High Plains (SHP) region contains >900,000 ha enrolled in CRP, but a regionally specific C sequestration rate has not been studied, and...
Using silviculture to influence carbon sequestration in southern Appalachian spruce-fir forests
Patrick T. Moore; R. Justin DeRose; James N. Long; Helga van Miegroet
2012-01-01
Enhancement of forest growth through silvicultural modification of stand density is one strategy for increasing carbon (C) sequestration. Using the Fire and Fuels Extension of the Forest Vegetation Simulator, the effects of even-aged, uneven-aged and no-action management scenarios on C sequestration in a southern Appalachian red spruce-Fraser fir forest were modeled....
Litynski, John T; Klara, Scott M; McIlvried, Howard G; Srivastava, Rameshwar D
2006-01-01
This paper reviews the Regional Carbon Sequestration Partnerships (RCSP) concept, which is a first attempt to bring the U.S. Department of Energy's (DOE) carbon sequestration program activities into the "real world" by using a geographically-disposed-system type approach for the U.S. Each regional partnership is unique, covers a unique section of the U.S., and is tasked with determining how the research and development activities of DOE's carbon sequestration program can best be implemented in its region of the country. Although there is no universal agreement on the cause, it is generally understood that global warming is occurring, and many climate scientists believe that this is due, in part, to the buildup of carbon dioxide (CO2) in the atmosphere. This is evident from the findings presented in the National Academy of Sciences Report to the President on Climate Change, which stated "Greenhouse gases are accumulating in Earth's atmosphere as a result of human activities, causing surface air temperatures and subsurface ocean temperatures to rise. Temperatures are, in fact, rising. The changes observed over the last several decades are likely mostly due to human activities, ...". In the United States, emissions of CO2 originate mainly from the combustion of fossil fuels for energy production, transportation, and other industrial processes. Roughly one third of U.S. anthropogenic CO2 emissions come from power plants. Reduction of CO2 emissions through sequestration of carbon either in geologic formations or in terrestrial ecosystems can be part of the solution to the problem of global warming. However, a number of steps must be accomplished before sequestration can become a reality. Cost-effective capture and separation technology must be developed, tested, and demonstrated; a database of potential sequestration sites must be established; and techniques must be developed to measure, monitor, and verify the sequestered CO2. Geographical differences in fossil fuel use, the industries present, and potential sequestration sinks across the United States dictate the use of a regional approach to address the sequestration of CO2. To accommodate these differences, the DOE has created a nationwide network of seven Regional Carbon Sequestration Partnerships (RCSP) to help determine and implement the carbon sequestration technologies, infrastructure, and regulations most appropriate to promote CO2 sequestration in different regions of the nation. These partnerships currently represent 40 states, three Indian Nations, four Canadian Provinces, and over 200 organizations, including academic institutions, research institutions, coal companies, utilities, equipment manufacturers, forestry and agricultural representatives, state and local governments, non-governmental organizations, and national laboratories. These partnerships are dedicated to developing the necessary infrastructure and validating the carbon sequestration technologies that have emerged from DOE's core R&D and other programs to mitigate emissions of CO2, a potent greenhouse gas. The partnerships provide a critical link to DOE's plans for FutureGen, a highly efficient and technologically sophisticated coal-fired power plant that will produce both hydrogen and electricity with near-zero emissions. Though limited to the situation in the U.S., the paper describes for the international scientific community the approach being taken by the U.S. to prepare for carbon sequestration, should that become necessary.
Carbon sequestration potential estimates with changes in land use and tillage practice in Ohio, USA
Tan, Z.; Lal, R.
2005-01-01
Soil C sequestration through changes in land use and management is one of the important strategies to mitigate the global greenhouse effect. This study was conducted to estimate the C sequestration potential of the top 20 cm of soil for two scenarios in Ohio, USA: (1) reforestation of both current cropland and grassland where SOC pools are less than the baseline SOC pool under current forest; and (2) adoption of NT on all current cropland. Based on the Ohio Soil Survey Characterization Database and long-term experimental data comparing paired conventional tillage (CT) versus no-till (NT), we specified spatial variations of current SOC pools and C sequestration potentials associated with soil taxa within each major land resource area (MLRA). For scenario 1, there would be 4.56 Mha of cropland having an average SOC sequestration capacity of 1.55 kg C m^-2 and 0.80 Mha of grassland with a capacity of 1.35 kg C m^-2. Of the total potential area, 73% is associated with Alfisols and 15% with Mollisols, but the achievable potential could vary significantly among individual MLRAs. Alternately, an average SOC sequestration rate of 62 g C m^-2 yr^-1 was estimated for conversion from CT to NT on cultivated Alfisols, by which a cumulative increase of 71 Tg C (the amount attainable by reforestation of cropland) could be realized in 25 years. Soils with lower antecedent C contents have higher C sequestration rates. In comparison with the results obtained at the state scale, the estimates of SOC sequestration potentials taxonomically associated with each specific MLRA may be more useful to the formulation of C credit trading programs.
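The abstract's figures are internally consistent, as the short check below shows: 62 g C m^-2 yr^-1 sustained for 25 years equals the 1.55 kg C m^-2 reforestation capacity, and applied over the 4.56 Mha of eligible cropland gives roughly 71 Tg C. All numbers are quoted from the abstract.

```python
# Consistency check of the quoted numbers.
rate = 62e-3            # kg C per m^2 per year (62 g C m^-2 yr^-1)
years = 25
area = 4.56e6 * 1e4     # 4.56 Mha in m^2

per_area = rate * years             # kg C/m^2
total_tg = per_area * area / 1e9    # 1 Tg = 1e9 kg
print(per_area)   # 1.55 kg C/m^2, matching the reforestation capacity
print(total_tg)   # ~70.7 Tg C, matching the reported 71 Tg
```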
Mewes, André; Hensen, Bennet; Wacker, Frank; Hansen, Christian
2017-02-01
In this article, we systematically examine the current state of research on systems that focus on touchless human-computer interaction in operating rooms and interventional radiology suites. We further discuss the drawbacks of current solutions and underline promising technologies for future development. A systematic literature search was performed for scientific papers that deal with touchless control of medical software in the immediate environment of the operating room and interventional radiology suite. This includes methods for touchless gesture interaction, voice control and eye tracking. Fifty-five research papers were identified and analyzed in detail, including 33 journal publications. Most of the identified literature (62%) deals with the control of medical image viewers. The others present interaction techniques for laparoscopic assistance (13%), telerobotic assistance and operating room control (9% each) as well as for robotic operating room assistance and intraoperative registration (3.5% each). Only 8 systems (14.5%) were tested in a real clinical environment, and 7 (12.7%) were not evaluated at all. In the last 10 years, many advancements have led to robust touchless interaction approaches. However, only a few have been systematically evaluated in real operating room settings. Further research is required to cope with the current limitations of touchless software interfaces in clinical environments. The main challenges for future research are the improvement and evaluation of the usability and intuitiveness of touchless human-computer interaction, full integration into productive systems, reduction of the necessary interaction steps, and further development of hands-free interaction.
NEDE: an open-source scripting suite for developing experiments in 3D virtual environments.
Jangraw, David C; Johri, Ansh; Gribetz, Meron; Sajda, Paul
2014-09-30
As neuroscientists endeavor to understand the brain's response to ecologically valid scenarios, many are leaving behind hyper-controlled paradigms in favor of more realistic ones. This movement has made the use of 3D rendering software an increasingly compelling option. However, mastering such software and scripting rigorous experiments requires a daunting amount of time and effort. To reduce these startup costs and make virtual environment studies more accessible to researchers, we demonstrate a naturalistic experimental design environment (NEDE) that allows experimenters to present realistic virtual stimuli while still providing tight control over the subject's experience. NEDE is a suite of open-source scripts built on the widely used Unity3D game development software, giving experimenters access to powerful rendering tools while interfacing with eye tracking and EEG, randomizing stimuli, and providing custom task prompts. Researchers using NEDE can present a dynamic 3D virtual environment in which randomized stimulus objects can be placed, allowing subjects to explore in search of these objects. NEDE interfaces with a research-grade eye tracker in real time to maintain precise timing records and sync with EEG or other recording modalities. Python-based packages offer an alternative for experienced programmers who are comfortable mastering and integrating the various toolboxes available; NEDE combines many of these capabilities with an easy-to-use interface and, through Unity's extensive user base, a much more substantial body of assets and tutorials. Our flexible, open-source experimental design system lowers the barrier to entry for neuroscientists interested in developing experiments in realistic virtual environments. Copyright © 2014 Elsevier B.V. All rights reserved.
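NEDE itself consists of C# scripts for Unity3D, but the randomize-and-log pattern the abstract describes is language-agnostic. Purely as an illustration of that pattern, here is a minimal hypothetical sketch in Python; none of these names come from NEDE:

```python
import random
import time

# Minimal, hypothetical sketch of the randomize-and-log pattern the abstract
# describes: place stimulus objects at random positions and keep precise
# timestamps for later synchronization with EEG or eye-tracking streams.
# All names are invented; NEDE itself is a suite of Unity3D (C#) scripts.

def place_stimuli(object_ids, bounds, seed=None):
    rng = random.Random(seed)            # seeded for reproducible trials
    (xmin, xmax), (zmin, zmax) = bounds  # ground-plane extent of the scene
    placements = []
    for obj in object_ids:
        pos = (rng.uniform(xmin, xmax), rng.uniform(zmin, zmax))
        placements.append({"object": obj, "position": pos,
                           "t_placed": time.monotonic()})
    return placements

events = place_stimuli(["car", "sign", "tree"], ((0, 50), (0, 50)), seed=42)
for e in events:
    print(e)  # one timestamped record per stimulus, ready to sync offline
```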
DIRART: A software suite for deformable image registration and adaptive radiotherapy research
Yang, Deshan; Brame, Scott; El Naqa, Issam; Apte, Aditya; Wu, Yu; Murty Goddu, S.; Mutic, Sasa; Deasy, Joseph O.; Low, Daniel A.
2011-01-01
Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse-consistency algorithms, which provide consistent displacement vector fields in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a wide range of options for DIR results visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research. PMID:21361176
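The inverse-consistency idea mentioned in the Methods can be made concrete with a toy example: for a forward displacement field u and a backward field v, composing the two should return every point to its origin. The following 1-D NumPy sketch is a generic illustration of that criterion, not DIRART's MATLAB implementation:

```python
import numpy as np

# Generic 1-D illustration of the inverse-consistency criterion used in
# deformable image registration (not DIRART's actual MATLAB code): for a
# forward displacement u and a backward displacement v, the composition
# should return each point to where it started, i.e. u(x) + v(x + u(x)) ~ 0.

x = np.linspace(0.0, 1.0, 101)
u = 0.05 * np.sin(2 * np.pi * x)  # toy forward displacement field
v = -np.interp(x, x + u, u)       # backward field built to be consistent

def inverse_consistency_error(x, u, v):
    v_at_mapped = np.interp(x + u, x, v)  # sample v at forward-mapped points
    return np.abs(u + v_at_mapped)        # per-point composition residual

err = inverse_consistency_error(x, u, v)
print(f"max inverse-consistency error: {err.max():.2e}")  # ~0 if consistent
```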
Test Driven Development of Scientific Models
NASA Technical Reports Server (NTRS)
Clune, Thomas L.
2012-01-01
Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By their own testimony, many developers find TDD addictive after only a few days of exposure and consider it unthinkable to return to their previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit-testing framework I co-developed to support test-driven development of parallel Fortran applications.
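pFUnit targets Fortran, but the alternating test-then-code rhythm described above is easiest to show in a few lines of Python's built-in unittest. This is a generic illustration of the TDD cycle, not code from the talk; the function under test is invented for the example:

```python
import math
import unittest

# TDD in miniature: the tests below are (notionally) written first, pinning
# down the behaviour we want; the function is then written to make them pass.

def saturation_vapor_pressure(t_celsius):
    """Magnus approximation over water, in hPa (hypothetical example code)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

class TestSaturationVaporPressure(unittest.TestCase):
    def test_freezing_point(self):
        # Tabulated saturation vapor pressure at 0 degrees C is ~6.11 hPa.
        self.assertAlmostEqual(saturation_vapor_pressure(0.0), 6.112, places=3)

    def test_monotonic_in_temperature(self):
        # Warmer air must hold more water vapor.
        self.assertGreater(saturation_vapor_pressure(20.0),
                           saturation_vapor_pressure(10.0))

if __name__ == "__main__":
    unittest.main()
```

Running the suite before the function exists fails ("red"); writing the minimal implementation turns it green, after which the code can be refactored with the tests as a safety net.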
Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base
NASA Technical Reports Server (NTRS)
Bryant, Richard B., Jr.; Carrelli, David J.
2006-01-01
The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has presented many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base, which was used primarily for safety loads prediction, design of the closed-loop compensator, and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to form a closed-loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters; it includes a user interface for controlling time history displays, strip chart displays, data storage, and initialization of the function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together as an integrated package that supports normal operations of the motion base, simulates end-to-end operation of the motion base system with facilities for software-in-the-loop testing, visualizes mechanical geometry and sensor data, and supports function generator setup and evaluation.
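The closed-loop system model described above is, in essence, a digital control law running against a plant model so the embedded logic can be exercised without hardware. The sketch below shows that software-in-the-loop idea in miniature; the first-order actuator and the PI gains are illustrative stand-ins, not the CMF motion base's actual dynamics or compensator:

```python
# Software-in-the-loop in miniature: the same discrete control law that would
# run on the embedded target is closed around a simulated plant. The
# first-order actuator model and gains below are illustrative only.

DT = 0.01          # control period, s
TAU = 0.5          # actuator time constant, s (assumed)
KP, KI = 4.0, 5.0  # proportional-integral gains (assumed)

def control_law(error, integ):
    integ += error * DT                 # discrete integrator state
    return KP * error + KI * integ, integ

pos, integ, log = 0.0, 0.0, []
for step in range(500):                 # 5 s of simulated time
    cmd, integ = control_law(1.0 - pos, integ)  # track a unit step command
    pos += DT * (cmd - pos) / TAU               # Euler step of the plant
    log.append(pos)                             # strip-chart style record

print(f"position after 5 s: {log[-1]:.3f}")     # settles near 1.0
```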
LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors
NASA Astrophysics Data System (ADS)
Snider, E. L.; Petrillo, G.
2017-10-01
LArSoft is a set of detector-independent software tools for the simulation, reconstruction, and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies, and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab, and associated software projects that cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and its interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development among the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.
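The division of labour sketched in the abstract, with shared algorithms on one side and experiment-supplied geometry on the other, is essentially an interface pattern. The Python sketch below is a hypothetical illustration of that pattern; LArSoft's real interfaces are C++ classes, and the names and numerical values here are placeholders of ours:

```python
from abc import ABC, abstractmethod

# Hypothetical illustration of LArSoft-style detector independence: shared
# algorithms code against an abstract geometry interface, and each experiment
# supplies a concrete implementation. Names and values are ours, not LArSoft's.

class DetectorGeometry(ABC):
    @abstractmethod
    def drift_length_cm(self) -> float: ...

class MicroBooNELikeGeometry(DetectorGeometry):
    def drift_length_cm(self) -> float:
        return 256.0                    # placeholder drift length

def max_drift_time_us(geom: DetectorGeometry, v_drift_cm_per_us: float = 0.16):
    # Shared "algorithm": valid for any detector implementing the interface.
    return geom.drift_length_cm() / v_drift_cm_per_us

print(f"max drift time: {max_drift_time_us(MicroBooNELikeGeometry()):.0f} us")
```

Swapping in a different geometry class changes the numbers but not the algorithm, which is the property that lets the same reconstruction code serve detectors of very different size and configuration.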
Identification of a Platelet Membrane Glycoprotein as a Falciparum Malaria Sequestration Receptor
NASA Astrophysics Data System (ADS)
Ockenhouse, Christian F.; Tandon, Narendra N.; Magowan, Cathleen; Jamieson, G. A.; Chulay, Jeffrey D.
1989-03-01
Infections with the human malaria parasite Plasmodium falciparum are characterized by sequestration of erythrocytes infected with mature forms of the parasite. Sequestration of infected erythrocytes appears to be critical for survival of the parasite and to mediate immunopathological abnormalities in severe malaria. A leukocyte differentiation antigen (CD36) was previously suggested to have a role in sequestration of malaria-infected erythrocytes. CD36 was purified from platelets, where it is known as GPIV, and was shown to be a receptor for binding of infected erythrocytes. Infected erythrocytes adhered to CD36 immobilized on plastic; purified CD36 exhibited saturable, specific binding to infected erythrocytes; and purified CD36 or antibodies to CD36 inhibited and reversed binding of infected erythrocytes to cultured endothelial cells and melanoma cells in vitro. The portion of the CD36 molecule that reverses cytoadherence may be useful therapeutically for rapid reversal of sequestration in cerebral malaria.
Submicron structures provide preferential spots for carbon and nitrogen sequestration in soils
Vogel, Cordula; Mueller, Carsten W.; Höschen, Carmen; Buegger, Franz; Heister, Katja; Schulz, Stefanie; Schloter, Michael; Kögel-Knabner, Ingrid
2014-01-01
The sequestration of carbon and nitrogen by clay-sized particles in soils is well established, and clay content or mineral surface area has been used to estimate the sequestration potential of soils. Here, via incubation of a sieved (<2 mm) topsoil with labelled litter, we find that only some of the clay-sized surfaces bind organic matter (OM). Surprisingly, <19% of the visible mineral areas show OM attachment. OM is preferentially associated with organo-mineral clusters with rough surfaces. By combining nano-scale secondary ion mass spectrometry and isotopic tracing, we distinguish between newly added labelled OM and pre-existing OM and show that new OM attaches preferentially to already present organo-mineral clusters. These results, which provide evidence that only a limited proportion of the clay-sized surfaces contribute to OM sequestration, revolutionize our view of carbon sequestration in soils and the widely used carbon saturation estimates. PMID:24399306