Sample records for processing system modaps

  1. Producing Global Science Products for the Moderate Resolution Imaging Spectroradiometer (MODIS) in MODAPS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward J.; Tilmes, Curt A.; Ye, Gang; Devine, Neal; Smith, David E. (Technical Monitor)

    2000-01-01

    The MODerate resolution Imaging Spectroradiometer (MODIS) was launched on NASA's EOS-Terra spacecraft in December 1999. With 36 spectral bands covering the visible, near-infrared, and shortwave infrared, MODIS produces over 40 global science data products, including sea surface temperature, ocean color, cloud properties, vegetation indices, land surface temperature, and land cover change. The MODIS Data Processing System (MODAPS) produces 400 GB/day of global MODIS science products from calibrated radiances generated in the Earth Observing System Data and Information System (EOSDIS). The science products are shipped to the EOSDIS for archiving and distribution to the public. An additional 200 GB of products are shipped each day to MODIS team members for quality assurance and validation of their products. In the sections that follow, we describe the architecture of MODAPS, identify processing bottlenecks encountered in scaling MODAPS from a 50 GB/day backup system to a 400 GB/day production system, and discuss how these were handled.
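
    A rough, hedged sketch of the sustained data rates implied by the scaling described above; only the 50, 400, and 200 GB/day figures come from the abstract, while the binary-gigabyte convention and the 24-hour averaging window are assumptions.

    ```python
    # Back-of-envelope sustained transfer rates for the daily volumes quoted above.
    GIB = 1024 ** 3            # bytes per gigabyte, binary convention (assumption)
    SECONDS_PER_DAY = 86_400

    def sustained_rate_mib_s(gb_per_day: float) -> float:
        """Average rate (MiB/s) needed to move a daily volume within 24 hours."""
        return gb_per_day * GIB / SECONDS_PER_DAY / (1024 ** 2)

    if __name__ == "__main__":
        # 50 GB/day backup system, 400 GB/day production system,
        # and production plus the 200 GB/day shipped for QA and validation.
        for volume in (50, 400, 600):
            print(f"{volume:4d} GB/day -> {sustained_rate_mib_s(volume):6.2f} MiB/s sustained")
    ```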

  2. Uncoupling File System Components for Bridging Legacy and Modern Storage Architectures

    NASA Astrophysics Data System (ADS)

    Golpayegani, N.; Halem, M.; Tilmes, C.; Prathapan, S.; Earp, D. N.; Ashkar, J. S.

    2016-12-01

    Long running Earth Science projects can span decades of architectural changes in both processing and storage environments. As storage architecture designs change over decades, such projects need to adjust their tools, systems, and expertise to properly integrate new technologies with their legacy systems. Traditional file systems lack the support needed to accommodate such hybrid storage infrastructures, forcing more complex tool development to encompass all of the storage architectures used by a project. The MODIS Adaptive Processing System (MODAPS) and the Level 1 and Atmospheres Archive and Distribution System (LAADS) are an example of a project spanning several decades which has evolved into a hybrid storage architecture. MODAPS/LAADS has developed the Lightweight Virtual File System (LVFS), which ensures seamless integration of all the different storage architectures, from standard block-based, POSIX-compliant storage disks to object-based architectures such as the S3-compliant HGST Active Archive System and Seagate Kinetic disks using the Kinetic protocol. With LVFS, all analysis and processing tools used for the project continue to function unmodified regardless of the underlying storage architecture, enabling MODAPS/LAADS to integrate any new storage architecture without the costly need to modify existing tools to use such new systems. Most file systems are designed as a single application responsible for using metadata to organize the data into a tree, determine where data are stored, and provide a method of data retrieval. We will show how LVFS' unique approach of treating these components in a loosely coupled fashion enables it to merge different storage architectures into a single uniform storage system which bridges the underlying hybrid architecture.
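
    The sketch below is illustrative only, not LVFS source code: it shows the loose coupling the abstract describes, with a thin catalog mapping logical file names to (backend, key) pairs so the same tool-facing call works whether the bytes sit on a POSIX disk, an S3-style object store, or a key-value drive. All class and method names are hypothetical.

    ```python
    # Minimal decoupled-storage sketch: namespace/metadata on one side,
    # interchangeable storage backends on the other.
    from abc import ABC, abstractmethod
    from pathlib import Path


    class StorageBackend(ABC):
        """Contract every backend must satisfy."""

        @abstractmethod
        def get(self, key: str) -> bytes: ...

        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...


    class PosixBackend(StorageBackend):
        """Block-based, POSIX-compliant disk."""

        def __init__(self, root: str):
            self.root = Path(root)

        def get(self, key: str) -> bytes:
            return (self.root / key).read_bytes()

        def put(self, key: str, data: bytes) -> None:
            path = self.root / key
            path.parent.mkdir(parents=True, exist_ok=True)
            path.write_bytes(data)


    class ObjectStoreBackend(StorageBackend):
        """Stand-in for an S3- or Kinetic-style object/key-value store."""

        def __init__(self):
            self._objects: dict[str, bytes] = {}  # in-memory placeholder

        def get(self, key: str) -> bytes:
            return self._objects[key]

        def put(self, key: str, data: bytes) -> None:
            self._objects[key] = data


    class VirtualFileSystem:
        """Maps logical names to (backend, key) pairs via a metadata catalog,
        so analysis tools never see which architecture holds the bytes."""

        def __init__(self):
            self._catalog: dict[str, tuple[StorageBackend, str]] = {}

        def register(self, logical_name: str, backend: StorageBackend, key: str) -> None:
            self._catalog[logical_name] = (backend, key)

        def read(self, logical_name: str) -> bytes:
            backend, key = self._catalog[logical_name]
            return backend.get(key)
    ```

    A tool that calls read() on a logical granule name behaves the same whichever backend the catalog points it at, which is the property that lets existing tools run unmodified when new storage hardware is added.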

  3. A Disk-Based System for Producing and Distributing Science Products from MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Wolfe, Robert; Sinno, Scott; Ye, Gang; Teague, Michael

    2007-01-01

    Since beginning operations in 1999, the MODIS Adaptive Processing System (MODAPS) has evolved to take advantage of trends in information technology, such as the falling cost of computing cycles and disk storage and the availability of high quality open-source software (Linux, Apache and Perl), to achieve substantial gains in processing and distribution capacity and throughput while driving down the cost of system operations.

  4. Quality Assessment of Collection 6 MODIS Atmospheric Science Products

    NASA Astrophysics Data System (ADS)

    Manoharan, V. S.; Ridgway, B.; Platnick, S. E.; Devadiga, S.; Masuoka, E.

    2015-12-01

    Since the launch of the NASA Terra and Aqua satellites in December 1999 and May 2002, respectively, atmosphere and land data acquired by the MODIS (Moderate Resolution Imaging Spectroradiometer) sensor on-board these satellites have been reprocessed five times at the MODAPS (MODIS Adaptive Processing System) located at NASA GSFC. The global land and atmosphere products use science algorithms developed by the NASA MODIS science team investigators. MODAPS completed Collection 6 reprocessing of MODIS Atmosphere science data products in April 2015 and is currently generating the Collection 6 products using the latest version of the science algorithms. This reprocessing has generated one of the longest time series of consistent data records for understanding cloud, aerosol, and other constituents in the Earth's atmosphere. It is important to carefully evaluate and assess the quality of these data and remove any artifacts to maintain a useful climate data record. Quality Assessment (QA) is an integral part of the processing chain at MODAPS. This presentation will describe the QA approaches and tools adopted by the MODIS Land/Atmosphere Operational Product Evaluation (LDOPE) team to assess the quality of MODIS operational Atmospheric products produced at MODAPS. Some of the tools include global high-resolution images, time series analysis, and statistical QA metrics. The new high-resolution global browse images with pan and zoom have provided the ability to perform QA of products in real time through synoptic QA on the web. This global browse generation has been useful in identifying production errors, data loss, and data quality issues arising from calibration errors, geolocation errors, and algorithm performance. A time series analysis for various science datasets in the Level-3 monthly product was recently developed for assessing any long-term drifts in the data arising from instrument errors or other artifacts. This presentation will describe and discuss some test cases from the recently processed C6 products. We will also describe the various tools and approaches developed to verify and assess the algorithm changes implemented by the science team to address known issues in the products and improve the quality of the products.
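
    As a hedged illustration of the time-series QA idea mentioned above, the sketch below fits a linear trend to a synthetic Level-3 monthly series and reports the drift per decade; the synthetic series and any acceptance threshold are assumptions, not LDOPE's actual criteria.

    ```python
    # Flagging long-term drift in a monthly science dataset with a least-squares trend.
    import numpy as np

    def drift_per_decade(monthly_values: np.ndarray) -> float:
        """Least-squares slope of a monthly series, expressed as change per decade."""
        months = np.arange(monthly_values.size)
        slope, _intercept = np.polyfit(months, monthly_values, deg=1)
        return slope * 120.0  # 120 months per decade

    rng = np.random.default_rng(0)
    # 15 years of synthetic monthly means with a small artificial drift added.
    series = 0.25 + 1e-4 * np.arange(180) + rng.normal(0.0, 0.01, 180)
    print(f"estimated drift: {drift_per_decade(series):+.4f} per decade")
    ```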

  5. The Development of Two Science Investigator-led Processing Systems (SIPS) for NASA's Earth Observation System (EOS)

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt

    2004-01-01

    In 2001, NASA Goddard Space Flight Center's Laboratory for Terrestrial Physics (LTP) started the construction of a Science Investigator-led Processing System (SIPS) for processing data from the Ozone Monitoring Instrument (OMI), which will launch on the Aura platform in mid-2004. OMI is a contribution of the Netherlands Agency for Aerospace Programs (NIVR), in collaboration with the Finnish Meteorological Institute (FMI), to the Earth Observing System (EOS) Aura mission. It will continue the Total Ozone Mapping Spectrometer (TOMS) record for total ozone and other atmospheric parameters related to ozone chemistry and climate. OMI measurements will be highly synergistic with the other instruments on the EOS Aura platform. The LTP previously developed the Moderate Resolution Imaging Spectroradiometer (MODIS) Data Processing System (MODAPS), which has been in full operations since the launches of the Terra and Aqua spacecraft in December 1999 and May 2002, respectively. During that time, it has continually evolved to better support the needs of the MODIS team. We now run multiple instances of the system managing faster-than-real-time reprocessings of the data as well as continuing forward processing. The new OMI Data Processing System (OMIDAPS) was adapted from the MODAPS. It will ingest raw data from the satellite ground station and process it to produce calibrated, geolocated higher-level data products. These data products will be transmitted to the Goddard Distributed Active Archive Center (GDAAC) instance of the Earth Observing System (EOS) Data and Information System (EOSDIS) for long-term archive and distribution to the public. The OMIDAPS will also provide data distribution to the OMI Science Team for quality assessment, algorithm improvement, calibration, etc. We have taken advantage of lessons learned from the MODIS experience and software already developed for MODIS. We made some changes in the hardware system organization, database, and software to adapt the system for OMI. We replaced the fundamental database system, Sybase, with an open-source RDBMS called PostgreSQL, and based the entire OMIDAPS on a cluster of Linux-based commodity computers rather than the large SGI servers that MODAPS uses. Rather than relying on a central I/O server host, the new system distributes its data archive among multiple server hosts in the cluster. The OMI team is also customizing the graphical user interfaces and reporting structure to more closely meet the needs of the OMI Science Team. Prior to 2003, simulated OMI data and the science algorithms were not ready for production testing. We initially constructed a prototype system and tested it using a 25-year dataset of Total Ozone Mapping Spectrometer (TOMS) and Solar Backscatter Ultraviolet Instrument (SBUV) data. This prototype system provided a platform to support the adaptation of the algorithms for OMI, and its reprocessing of the historical data aided in their analysis. In a recent reanalysis of the TOMS data, the OMIDAPS processed 108,000 full orbits of data through 4 processing steps per orbit, producing about 800,000 files (400 GiB) of Level 2 and higher data. More recently we have installed two instances of the OMIDAPS for integration and testing of OMI science processes as they are delivered by the Science Team. A test instance of the OMIDAPS has also supported a series of "Interface Confidence Tests" (ICTs) and End-to-End Ground System tests to ensure the launch readiness of the system.
This paper will discuss the high-level hardware, software, and database organization of the OMIDAPS and how it builds on the MODAPS heritage system. It will also provide an overview of the testing and implementation of the production OMIDAPS.

  6. Global Agricultural Monitoring (GLAM) using MODAPS and LANCE Data Products

    NASA Astrophysics Data System (ADS)

    Anyamba, A.; Pak, E. E.; Majedi, A. H.; Small, J. L.; Tucker, C. J.; Reynolds, C. A.; Pinzon, J. E.; Smith, M. M.

    2012-12-01

    The Global Inventory Modeling and Mapping Studies / Global Agricultural Monitoring (GIMMS GLAM) system is a web-based geographic application that offers Moderate Resolution Imaging Spectroradiometer (MODIS) imagery and user interface tools to query data and plot MODIS NDVI time series. The system processes near real-time and science-quality Terra and Aqua MODIS 8-day composited datasets. These datasets are derived from the MOD09 and MYD09 surface reflectance products, which are generated and provided by the NASA/GSFC Land and Atmosphere Near Real-time Capability for EOS (LANCE) and the NASA/GSFC MODIS Adaptive Processing System (MODAPS). The GIMMS GLAM system is developed and provided by the NASA/GSFC GIMMS group for the U.S. Department of Agriculture / Foreign Agricultural Service / International Production Assessment Division (USDA/FAS/IPAD) Global Agricultural Monitoring (GLAM) project. The USDA/FAS/IPAD mission is to provide objective, timely, and regular assessment of the global agricultural production outlook and conditions affecting global food security. This system was developed to improve USDA/FAS/IPAD capabilities for making operational quantitative estimates of crop production and yield based on satellite-derived data. The GIMMS GLAM system offers 1) web map imagery including Terra and Aqua MODIS 8-day composited NDVI, NDVI percent anomaly, and SWIR-NIR-Red band combinations, 2) web map overlays including administrative and 0.25-degree Land Information System (LIS) shape boundaries and crop land cover masks, and 3) user interface tools to select features, query data, plot, and download MODIS NDVI time series.
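
    The sketch below shows the kind of arithmetic behind the "NDVI percent anomaly" layer mentioned above: each 8-day composite is compared with a multi-year mean for the same compositing period. The arrays are synthetic placeholders, not actual MOD09/MYD09-derived values.

    ```python
    # Percent anomaly of an 8-day NDVI composite relative to its climatology.
    import numpy as np

    def percent_anomaly(ndvi: np.ndarray, climatology: np.ndarray) -> np.ndarray:
        """Percent departure of each composite from its climatological mean."""
        return 100.0 * (ndvi - climatology) / climatology

    periods = 46  # 8-day compositing periods per year
    climatology = 0.3 + 0.2 * np.sin(np.linspace(0.0, 2.0 * np.pi, periods))
    this_year = climatology * (1.0 + np.random.default_rng(1).normal(0.0, 0.05, periods))
    print(np.round(percent_anomaly(this_year, climatology)[:5], 1))
    ```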

  7. Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS)

    NASA Technical Reports Server (NTRS)

    Masek, Jeffrey G.

    2006-01-01

    The Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) project is creating a record of forest disturbance and regrowth for North America from the Landsat satellite record, in support of the carbon modeling activities. LEDAPS relies on the decadal Landsat GeoCover data set supplemented by dense image time series for selected locations. Imagery is first atmospherically corrected to surface reflectance, and then change detection algorithms are used to extract disturbance area, type, and frequency. Reuse of the MODIS Land processing system (MODAPS) architecture allows rapid throughput of over 2200 MSS, TM, and ETM+ scenes. Initial ("Beta") surface reflectance products are currently available for testing, and initial continental disturbance products will be available by the middle of 2006.

  8. Production and Distribution of NASA MODIS Remote Sensing Products

    NASA Technical Reports Server (NTRS)

    Wolfe, Robert

    2007-01-01

    The two Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on-board NASA's Earth Observing System (EOS) Terra and Aqua satellites make key measurements for understanding the Earth's terrestrial ecosystems. Global time-series of terrestrial geophysical parameters have been produced from MODIS/Terra for over 7 years and from MODIS/Aqua for more than 4 1/2 years. These well-calibrated instruments, a team of scientists, and a large data production, archive, and distribution system have allowed for the development of a new suite of high-quality product variables at spatial resolutions as fine as 250 m in support of global change research and natural resource applications. This talk describes the MODIS Science Team's products, with a focus on the terrestrial (land) products, the data processing approach, and the process for monitoring and improving product quality. The original MODIS science team was formed in 1989. The team's primary role is the development and implementation of the geophysical algorithms. In addition, the team provided feedback on the design and pre-launch testing of the instrument and helped guide the development of the data processing system. The key challenges the science team dealt with before launch were developing algorithms for a new instrument and providing guidance for the large and complex multi-discipline processing system. Land, Ocean, and Atmosphere discipline teams drove the processing system requirements, particularly in the area of the processing loads and volumes needed to produce daily geophysical maps of the Earth at resolutions as fine as 250 m. The processing system had to handle a large number of data products, large data volumes and processing loads, and complex processing requirements. Prior to MODIS, daily global maps from heritage instruments, such as the Advanced Very High Resolution Radiometer (AVHRR), were not produced at resolutions finer than 5 km. The processing solution evolved into a combination of processing the lower-level (Level 1) products and the higher-level, discipline-specific Land and Atmosphere products in the MODIS Science Investigator-led Processing System (SIPS), the MODIS Adaptive Processing System (MODAPS), with archive and distribution of the Land products to the user community handled by two of NASA's EOS Distributed Active Archive Centers (DAACs). Recently, a part of MODAPS, the Level 1 and Atmosphere Archive and Distribution System (LAADS), took over the role of archiving and distributing the Level 1 and Atmosphere products to the user community.
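
    The jump in processing load mentioned above is easy to quantify: moving from 5 km heritage maps to 250 m global maps multiplies the cell count by (5000/250)^2 = 400. Only the two resolutions come from the abstract; the sketch below is just that arithmetic.

    ```python
    # Ratio of grid cells between a 250 m and a 5 km global grid.
    coarse_m, fine_m = 5000, 250
    ratio = (coarse_m / fine_m) ** 2
    print(f"A 250 m global grid has {ratio:.0f}x more cells than a 5 km grid.")
    ```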

  9. Improvements to the MODIS Land Products in Collection Version 6

    NASA Astrophysics Data System (ADS)

    Wolfe, R. E.; Devadiga, S.; Masuoka, E. J.; Running, S. W.; Vermote, E.; Giglio, L.; Wan, Z.; Riggs, G. A.; Schaaf, C.; Myneni, R. B.; Friedl, M. A.; Wang, Z.; Sulla-menashe, D. J.; Zhao, M.

    2013-12-01

    The MODIS (Moderate Resolution Imaging Spectroradiometer) Adaptive Processing System (MODAPS), housed at the NASA Goddard Space Flight Center (GSFC), has been processing the earth-view data acquired by the MODIS instruments aboard the Terra (EOS AM) and Aqua (EOS PM) satellites to generate a suite of land and atmosphere data products using the science algorithms developed by the MODIS Science Team. These data products are used by a diverse set of users in research and other applications from both government and non-government agencies around the world. These validated global products are also being used in interactive Earth system models able to predict global change accurately enough to assist policy makers in making sound decisions concerning the protection of our environment. Hence an increased emphasis is being placed on the generation of high-quality, consistent data records from the MODIS data through reprocessing of the records using improved science algorithms. Since the launch of Terra in December 1999, MODIS land data records have been reprocessed four times. The Collection Version 6 (C6) reprocessing of MODIS Land and Atmosphere products is scheduled to start in Fall 2013 and is expected to complete in Spring 2014. This presentation will describe changes made to the C6 science algorithms to correct issues in the C5 products, additional improvements made to the products as deemed necessary by the data users and science teams, and new products introduced in this reprocessing. In addition to the improvements from product-specific changes to algorithms, the C6 products will also see significant improvement from the MODIS Calibration Science Team's (MCST) calibration of the C6 L1B Top of the Atmosphere (TOA) reflectance and radiance product, more accurate geolocation, and an improved land-water mask. For the a priori land cover input, this reprocessing will use the multi-year land cover product generated with three years of MODIS data, as opposed to the single land cover product used for the entire mission in the C5 reprocessing. The C6 products are expected to be released from the Distributed Active Archive Center (DAAC) soon after the reprocessing begins. To help users become acquainted with products from the new version and to allow independent evaluation of C6 by comparison of the two versions, MODAPS plans to continue generating products from both versions for at least a year after completion of the C6 reprocessing, after which C5 processing will be discontinued.

  10. VIIRS Land Science Investigator-Led Processing System

    NASA Astrophysics Data System (ADS)

    Devadiga, S.; Masuoka, E.; Roman, M. O.; Wolfe, R. E.; Kalb, V.; Davidson, C. C.; Ye, G.

    2015-12-01

    The objective of NASA's Suomi National Polar-orbiting Partnership (S-NPP) Land Science Investigator-led Processing System (Land SIPS), housed at the NASA Goddard Space Flight Center (GSFC), is to produce high-quality land products from the Visible Infrared Imaging Radiometer Suite (VIIRS) to extend the Earth System Data Records (ESDRs) developed from NASA's heritage Earth Observing System (EOS) Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the EOS Terra and Aqua satellites. In this paper we will present the functional description and capabilities of the S-NPP Land SIPS, including system development phases and production schedules, the timeline for processing, and the delivery of land science products based on coordination with the S-NPP Land science team members. The Land SIPS processing stream is expected to be operational by December 2016, generating land products using either the NASA science team-delivered algorithms or the "best-of" science algorithms currently in operation at NASA's Land Product Evaluation and Algorithm Testing Element (PEATE). In addition to generating the standard land science products through processing of NASA's VIIRS Level 0 data record, the Land SIPS processing system is also used to produce a suite of near-real-time products for NASA's application community. Land SIPS will also deliver the standard products, ancillary data sets, software, and supporting documentation (ATBDs) to the assigned Distributed Active Archive Centers (DAACs) for archival and distribution. Quality assessment and validation will be an integral part of the Land SIPS processing system, with the former performed at the Land Data Operational Product Evaluation (LDOPE) facility and the latter conducted under the auspices of the CEOS Working Group on Calibration & Validation (WGCV) Land Product Validation (LPV) Subgroup, adopting the best practices and tools used to assess the quality of heritage EOS-MODIS products generated at the MODIS Adaptive Processing System (MODAPS).

  11. Equivalent Sensor Radiance Generation and Remote Sensing from Model Parameters. Part 1; Equivalent Sensor Radiance Formulation

    NASA Technical Reports Server (NTRS)

    Wind, Galina; DaSilva, Arlindo M.; Norris, Peter M.; Platnick, Steven E.

    2013-01-01

    In this paper we describe a general procedure for calculating equivalent sensor radiances from variables output from a global atmospheric forecast model. In order to take proper account of the discrepancies between model resolution and sensor footprint, the algorithm takes explicit account of the model subgrid variability, in particular its description of the probability density function of total water (vapor and cloud condensate). The equivalent sensor radiances are then substituted into an operational remote sensing algorithm processing chain to produce a variety of remote sensing products that would normally be produced from actual sensor output. This output can then be used for a wide variety of purposes such as model parameter verification, remote sensing algorithm validation, testing of new retrieval methods, and future sensor studies. We show a specific implementation using the GEOS-5 model, the MODIS instrument, and the MODIS Adaptive Processing System (MODAPS) Data Collection 5.1 operational remote sensing cloud algorithm processing chain (including the cloud mask, cloud top properties, and cloud optical and microphysical properties products). We focus on clouds and cloud/aerosol interactions, because they are very important to model development and improvement.
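
    As a hedged illustration of the subgrid treatment described above, the sketch below draws subcolumn samples from an assumed gamma PDF of total water and flags the saturated ones as cloudy, which is one simple way to bridge a coarse model grid cell and a finer sensor footprint. The gamma form, the parameter values, and the saturation threshold are illustrative assumptions, not the GEOS-5/MODAPS formulation.

    ```python
    # Subcolumn sampling from an assumed total-water PDF (ICA-style illustration).
    import numpy as np

    def cloudy_subcolumn_fraction(mean_qt: float, variance: float, q_sat: float,
                                  n_sub: int = 1000, seed: int = 0) -> float:
        """Fraction of sampled subcolumns whose total water exceeds saturation."""
        shape = mean_qt ** 2 / variance          # gamma shape parameter
        scale = variance / mean_qt               # gamma scale parameter
        samples = np.random.default_rng(seed).gamma(shape, scale, n_sub)
        return float(np.mean(samples > q_sat))

    # Example grid cell: mean total water 8 g/kg, variance 4 (g/kg)^2, q_sat 9 g/kg.
    print(cloudy_subcolumn_fraction(mean_qt=8.0, variance=4.0, q_sat=9.0))
    ```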

  12. Multi-sensor Cloud Retrieval Simulator and Remote Sensing from Model Parameters. Pt. 1; Synthetic Sensor Radiance Formulation

    NASA Technical Reports Server (NTRS)

    Wind, G.; DaSilva, A. M.; Norris, P. M.; Platnick, S.

    2013-01-01

    In this paper we describe a general procedure for calculating synthetic sensor radiances from variables output from a global atmospheric forecast model. In order to take proper account of the discrepancies between model resolution and sensor footprint, the algorithm takes explicit account of the model subgrid variability, in particular its description of the probability density function of total water (vapor and cloud condensate). The simulated sensor radiances are then substituted into an operational remote sensing algorithm processing chain to produce a variety of remote sensing products that would normally be produced from actual sensor output. This output can then be used for a wide variety of purposes such as model parameter verification, remote sensing algorithm validation, testing of new retrieval methods, and future sensor studies. We show a specific implementation using the GEOS-5 model, the MODIS instrument, and the MODIS Adaptive Processing System (MODAPS) Data Collection 5.1 operational remote sensing cloud algorithm processing chain (including the cloud mask, cloud top properties, and cloud optical and microphysical properties products). We focus on clouds because they are very important to model development and improvement.

  13. Federated Giovanni

    NASA Technical Reports Server (NTRS)

    Lynnes, C.

    2014-01-01

    Federated Giovanni is a NASA-funded ACCESS project to extend the scope of the GES DISC Giovanni online analysis tool to 4 other Distributed Active Archive Centers within EOSDIS: OBPG, LP-DAAC, MODAPS and PO.DAAC. As such, it represents a significant instance of sharing technology across the DAACs. We also touch on several sub-areas that are also sharable, such as Giovanni URLs, workflows and OGC-accessible services.

  14. LVFS: A Scalable Petabyte/Exabyte Data Storage System

    NASA Astrophysics Data System (ADS)

    Golpayegani, N.; Halem, M.; Masuoka, E. J.; Ye, G.; Devine, N. K.

    2013-12-01

    Managing petabytes of data with hundreds of millions of files is the first step necessary towards an effective big data computing and collaboration environment in a distributed system. We describe here the MODAPS LAADS Virtual File System (LVFS), a new storage architecture which replaces the previous MODAPS operational Level 1 and Atmosphere Archive and Distribution System (LAADS) NFS-based approach to storing and distributing datasets from several instruments, such as MODIS, MERIS, and VIIRS. LAADS is responsible for the distribution of over 4 petabytes of data and over 300 million files across more than 500 disks. We present here the first LVFS big data comparative performance results and new capabilities not previously possible with the LAADS system. We consider two aspects in addressing inefficiencies of massive scales of data. The first is dealing in a reliable and resilient manner with the volume and quantity of files in such a dataset; the second is minimizing the discovery and lookup times for accessing files in such large datasets. There are several popular file systems that successfully deal with the first aspect of the problem. Their solution, in general, is through distribution, replication, and parallelism of the storage architecture. The Hadoop Distributed File System (HDFS), the Parallel Virtual File System (PVFS), and Lustre are examples of such file systems that deal with petabyte data volumes. The second aspect deals with data discovery among billions of files, the largest bottleneck in reducing access time. The metadata of a file, generally represented in a directory layout, is stored in ways that are not readily scalable. This is true for HDFS, PVFS, and Lustre as well. Recent experimental file systems, such as Spyglass or Pantheon, have attempted to address this problem through a redesign of the metadata directory architecture. LVFS takes a radically different architectural approach by eliminating the need for a separate directory within the file system. The LVFS system replaces the NFS disk-mounting approach of LAADS and utilizes the already existing, highly optimized metadata database server, an approach applicable to most scientific big-data-intensive compute systems. Thus, LVFS ties the existing storage system to the existing metadata infrastructure, which we believe leads to a scalable exabyte virtual file system. The uniqueness of the implemented design is not limited to LAADS but can be employed with most scientific data processing systems. By utilizing Filesystem in Userspace (FUSE), a kernel module available in many operating systems, LVFS was able to replace the NFS system while staying POSIX compliant. As a result, the LVFS system becomes scalable to exabyte sizes owing to the use of highly scalable database servers optimized for metadata storage. The flexibility of the LVFS design allows it to organize data on the fly in different ways, such as by region, date, instrument, or product, without the need for duplication, symbolic links, or any other replication methods. We propose here a strategic reference architecture that addresses the inefficiencies of scientific petabyte/exabyte file system access through the dynamic integration of the observing system's large metadata file.
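
    The sketch below illustrates the namespace idea described above: the "directory tree" is just a query over a metadata catalog, so the same granules can be presented by date, by product, or by any other attribute without symbolic links or duplication. The schema, file names, and storage locations are hypothetical, and the FUSE layer is omitted entirely.

    ```python
    # Virtual directory views generated on the fly from a metadata catalog.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE granules
                    (name TEXT, product TEXT, date TEXT, location TEXT)""")
    conn.executemany("INSERT INTO granules VALUES (?, ?, ?, ?)", [
        ("MOD021KM.A2013001.hdf", "MOD021KM", "2013-01-01", "s3://bucket/a1"),
        ("MOD04.A2013001.hdf",    "MOD04",    "2013-01-01", "posix:/disk7/a2"),
        ("MOD021KM.A2013002.hdf", "MOD021KM", "2013-01-02", "kinetic://drive3/a3"),
    ])

    def virtual_listing(group_by: str) -> dict[str, list[str]]:
        """Return {virtual directory: [granule names]} for an arbitrary grouping column."""
        rows = conn.execute(f"SELECT {group_by}, name FROM granules ORDER BY 1").fetchall()
        tree: dict[str, list[str]] = {}
        for key, name in rows:
            tree.setdefault(key, []).append(name)
        return tree

    print(virtual_listing("date"))     # the archive viewed by acquisition date
    print(virtual_listing("product"))  # the same granules, viewed by product
    ```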

  15. Federated Giovanni: A Distributed Web Service for Analysis and Visualization of Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2014-01-01

    The Geospatial Interactive Online Visualization and Analysis Interface (Giovanni) is a popular tool for users of the Goddard Earth Sciences Data and Information Services Center (GES DISC) and has been in use for over a decade. It provides a wide variety of algorithms and visualizations to explore large remote sensing datasets without having to download the data and without having to write readers and visualizers for it. Giovanni is now being extended to enable its capabilities at other data centers within the Earth Observing System Data and Information System (EOSDIS). This Federated Giovanni will allow four other data centers to add and maintain their data within Giovanni on behalf of their user community. Those data centers are the Physical Oceanography Distributed Active Archive Center (PO.DAAC), MODIS Adaptive Processing System (MODAPS), Ocean Biology Processing Group (OBPG), and Land Processes Distributed Active Archive Center (LP DAAC). Three tiers are supported: Tier 1 (GES DISC-hosted) gives the remote data center a data management interface to add and maintain data, which are provided through the Giovanni instance at the GES DISC. Tier 2 packages Giovanni up as a virtual machine for distribution to and deployment by the other data centers. Data variables are shared among data centers by sharing documents from the Solr database that underpins Giovanni's data management capabilities. However, each data center maintains their own instance of Giovanni, exposing the variables of most interest to their user community. Tier 3 is a Shared Source model, in which the data centers cooperate to extend the infrastructure by contributing source code.
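
    The sketch below is a hedged illustration of the document-sharing mechanism described above: a data variable is represented as a document that one center indexes into Solr and another center's Giovanni instance can query. The URL, core name, and field names are hypothetical; only the /update and /select endpoints are standard Solr.

    ```python
    # Indexing and querying a shared data-variable document in a Solr core.
    import json
    import urllib.request

    SOLR = "http://localhost:8983/solr/giovanni_variables"  # hypothetical core

    variable_doc = {
        "id": "example_chlorophyll_variable",   # hypothetical variable id
        "dataCenter": "OBPG",                   # hypothetical field name
        "longName": "Chlorophyll a concentration",
        "units": "mg m-3",
    }

    def index_document(doc: dict) -> None:
        """POST one document to Solr's JSON update handler and commit it."""
        req = urllib.request.Request(
            f"{SOLR}/update?commit=true",
            data=json.dumps([doc]).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

    def variables_for(center: str) -> bytes:
        """Query the variables exposed by one data center."""
        return urllib.request.urlopen(f"{SOLR}/select?q=dataCenter:{center}").read()
    ```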

  16. Spatial and Temporal Distribution of Cloud Properties Observed by MODIS: Preliminary Level-3 Results from the Collection 5 Reprocessing

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Hubanks, Paul; Pincus, Robert

    2006-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) was developed by NASA and launched onboard the Terra spacecraft on December 18, 1999 and the Aqua spacecraft on May 4, 2002. It achieved its final orbit and began Earth observations on February 24, 2000 for Terra and June 24, 2002 for Aqua. A comprehensive set of operational algorithms for the retrieval of cloud physical and optical properties (optical thickness, effective particle radius, water path, thermodynamic phase) has recently been updated and is being used in the new "Collection 5" processing stream produced by the MODIS Adaptive Processing System (MODAPS) at NASA GSFC. All Terra and Aqua data are undergoing Collection 5 reprocessing with an expected completion date by the end of 2006. The archived products from these algorithms include 1 km pixel-level (Level-2) and global gridded Level-3 products. The cloud products have applications in climate change studies, climate modeling, and numerical weather prediction, as well as fundamental atmospheric research. In this talk, we will summarize the available Level-3 cloud properties and their associated statistical data sets, and show preliminary Terra and Aqua results from the ongoing Collection 5 reprocessing effort. Anticipated results include the latitudinal distribution of cloud optical and radiative properties for both liquid water and ice clouds, as well as joint histograms of cloud optical thickness and effective radius for selected geographical locations around the world.
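
    As a hedged sketch of one Level-3 statistic mentioned above, the code below aggregates synthetic pixel-level retrievals into a joint histogram of cloud optical thickness and effective radius; the bin edges and pixel values are illustrative, not the operational Collection 5 definitions.

    ```python
    # Joint histogram of cloud optical thickness vs. effective radius for one grid cell.
    import numpy as np

    rng = np.random.default_rng(42)
    optical_thickness = rng.lognormal(mean=2.0, sigma=0.8, size=10_000)  # unitless
    effective_radius = rng.normal(loc=14.0, scale=4.0, size=10_000)      # micrometers

    tau_bins = np.array([0, 2, 5, 10, 20, 50, 100])
    re_bins = np.arange(4, 34, 4)

    joint, _, _ = np.histogram2d(optical_thickness, effective_radius,
                                 bins=[tau_bins, re_bins])
    print(joint.astype(int))  # counts per (optical thickness, effective radius) bin
    ```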

  17. Atmospheric Correction at AERONET Locations: A New Science and Validation Data Set

    NASA Technical Reports Server (NTRS)

    Wang, Yujie; Lyapustin, Alexei; Privette, Jeffery L.; Morisette, Jeffery T.; Holben, Brent

    2008-01-01

    This paper describes an AERONET-based Surface Reflectance Validation Network (ASRVN) and its dataset of spectral surface bidirectional reflectance and albedo based on MODIS Terra and Aqua data. The ASRVN is an operational data collection and processing system. It receives 50 x 50 km subsets of MODIS L1B data from MODAPS along with AERONET aerosol and water vapor information. It then performs an accurate atmospheric correction for about 100 AERONET sites based on accurate radiative transfer theory with high quality control of the input data. The ASRVN processing software consists of an L1B data gridding algorithm, a new cloud mask algorithm based on time series analysis, and an atmospheric correction algorithm. The atmospheric correction is achieved by fitting the MODIS top-of-atmosphere measurements, accumulated over a 16-day interval, with theoretical reflectance parameterized in terms of the coefficients of the LSRT BRF model. The ASRVN takes several steps to ensure high quality of results: 1) the cloud mask algorithm filters opaque clouds; 2) an aerosol filter has been developed to remove residual semi-transparent and sub-pixel clouds, as well as cases with high inhomogeneity of aerosols in the processing area; 3) imposing a requirement of consistency of the new solution with previously retrieved BRF and albedo; 4) rapid adjustment of the 16-day retrieval to surface changes using the last day of measurements; and 5) development of a seasonal back-up spectral BRF database to increase data coverage. The ASRVN provides gapless or near-gapless coverage for the processing area. The gaps, caused by clouds, are filled most naturally with the latest solution for a given pixel. The ASRVN products include three parameters of the LSRT model (k(sup L), k(sup G), k(sup V)), surface albedo, NBRF (a normalized BRF computed for a standard viewing geometry, VZA=0 deg., SZA=45 deg.), and IBRF (instantaneous, or one-angle, BRF value derived from the last day of MODIS measurement for a specific viewing geometry) for MODIS 500 m bands 1-7. The results are produced daily at a resolution of 1 km in gridded format. We also provide a cloud mask, quality flag, and a browse bitmap image. The new dataset can be used for a wide range of applications including validation analysis and science research.
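
    The sketch below is a hedged illustration of the fitting step described above: a kernel-based BRF model that is linear in its three coefficients (k(sup L), k(sup G), k(sup V)) can be fit to a 16-day accumulation of atmospherically corrected reflectances by ordinary least squares. The kernel values and reflectances are synthetic placeholders; computing the actual geometric and volumetric kernels from the Sun-view geometry is omitted.

    ```python
    # Least-squares fit of a linear-in-coefficients BRF model to 16 days of reflectances.
    import numpy as np

    rng = np.random.default_rng(7)
    n_obs = 16                              # one observation per day in the period
    f_geo = rng.uniform(-0.3, 0.1, n_obs)   # geometric kernel values (placeholder)
    f_vol = rng.uniform(-0.1, 0.4, n_obs)   # volumetric kernel values (placeholder)

    k_true = np.array([0.05, 0.02, 0.03])   # synthetic "true" kL, kG, kV
    design = np.column_stack([np.ones(n_obs), f_geo, f_vol])
    reflectance = design @ k_true + rng.normal(0.0, 0.002, n_obs)

    coeffs, *_ = np.linalg.lstsq(design, reflectance, rcond=None)
    kL, kG, kV = coeffs
    print(f"kL={kL:.4f}  kG={kG:.4f}  kV={kV:.4f}")
    ```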

  18. The Generation of Near-Real Time Data Products for MODIS

    NASA Astrophysics Data System (ADS)

    Teague, M.; Schmaltz, J. E.; Ilavajhala, S.; Ye, G.; Masuoka, E.; Murphy, K. J.; Michael, K.

    2010-12-01

    The GSFC Terrestrial Information Systems Branch (614.5) operates the Land and Atmospheres Near-real-time Capability for EOS (LANCE-MODIS) system. Other LANCE elements include LANCE-AIRS, -MLS, -OMI, and -AMSR-E. LANCE-MODIS incorporates the former Rapid Response system and will, in early 2011, include the Fire Information for Resource Management System (FIRMS). The purpose of LANCE is to provide applications users with a variety of products on a near-real-time basis. The LANCE-MODIS data products include Level 1 (L1), L2 fire, snow, sea ice, cloud mask/profiles, aerosols, clouds, land surface reflectance, land surface temperature, and L2G and L3 gridded, daily, land surface reflectance products. Data are available either by ftp access (pull) or by subscription (push), and the L1 and L2 data products are available within an average of 2.5 hours of the observation time. The use of ancillary data products input to the standard science algorithms has been modified in order to obtain these latencies. The resulting products have been approved for applications use by the MODIS Science Team. The http://lance.nasa.gov site provides registration information and extensive information concerning the MODIS data products and imagery, including a comparison between the LANCE-MODIS and the standard science-quality products generated by the MODAPS system. The LANCE-MODIS system includes a variety of tools that enable users to manipulate the data products, including: parameter, band, and geographic subsetting, re-projection, mosaicking, and generation of data in the GeoTIFF format. In most instances the data resulting from use of these tools have a latency of less than 3 hours. Access to these tools is available through a Web Coverage Service. A Google Earth/Web Mapping Service is available to access image products. LANCE-MODIS supports a wide variety of applications users in civilian, military, and foreign agencies as well as universities and the private sector. Examples of applications are: Flood Mapping, Famine Relief, Food and Agriculture, Hazards and Disasters, and Weather.

  19. Multi-sensor cloud and aerosol retrieval simulator and remote sensing from model parameters - Part 2: Aerosols

    NASA Astrophysics Data System (ADS)

    Wind, Galina; da Silva, Arlindo M.; Norris, Peter M.; Platnick, Steven; Mattoo, Shana; Levy, Robert C.

    2016-07-01

    The Multi-sensor Cloud Retrieval Simulator (MCRS) produces a "simulated radiance" product from any high-resolution general circulation model with interactive aerosol as if a specific sensor such as the Moderate Resolution Imaging Spectroradiometer (MODIS) were viewing a combination of the atmospheric column and land-ocean surface at a specific location. Previously the MCRS code only included contributions from atmosphere and clouds in its radiance calculations and did not incorporate properties of aerosols. In this paper we added a new aerosol properties module to the MCRS code that allows users to insert a mixture of up to 15 different aerosol species in any of 36 vertical layers. This new MCRS code is now known as MCARS (Multi-sensor Cloud and Aerosol Retrieval Simulator). Inclusion of an aerosol module into MCARS not only allows for extensive, tightly controlled testing of various aspects of satellite operational cloud and aerosol properties retrieval algorithms, but also provides a platform for comparing cloud and aerosol models against satellite measurements. This kind of two-way platform can improve the efficacy of model parameterizations of measured satellite radiances, allowing the assessment of model skill consistently with the retrieval algorithm. The MCARS code provides dynamic controls for the appearance of cloud and aerosol layers, so that detailed quantitative studies of the impacts of various atmospheric components can be controlled. In this paper we illustrate the operation of MCARS by deriving simulated radiances from various data fields output by the Goddard Earth Observing System version 5 (GEOS-5) model. The model aerosol fields are prepared for translation to simulated radiance using the same model subgrid variability parameterizations as are used for cloud and atmospheric properties profiles, namely the ICA technique. After MCARS computes modeled sensor radiances equivalent to their observed counterparts, these radiances are presented as input to operational remote-sensing algorithms. Specifically, the MCARS-computed radiances are input into the processing chain used to produce the MODIS Data Collection 6 aerosol product (M{O/Y}D04). The M{O/Y}D04 product is of course normally produced from the M{O/Y}D021KM MODIS Level-1B radiance product directly acquired by the MODIS instrument. MCARS matches the format and metadata of a M{O/Y}D021KM product. The resulting MCARS output can be directly provided to MODAPS (MODIS Adaptive Processing System) as input to various operational atmospheric retrieval algorithms. Thus the operational algorithms can be tested directly without needing to make any software changes to accommodate an alternative input source. We show a direct application of this synthetic product in an analysis of the performance of the MOD04 operational algorithm. We use biomass-burning case studies over Amazonia employed in a recent Working Group on Numerical Experimentation (WGNE)-sponsored study of aerosol impacts on numerical weather prediction (Freitas et al., 2015). We demonstrate that a known low bias in retrieved MODIS aerosol optical depth appears to be due to a disconnect between the actual column relative humidity and the value assumed by the MODIS aerosol product.
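
    The sketch below illustrates the design point made above in hedged form: because the simulator writes output in the same format as a Level-1B granule, downstream retrieval code need not know whether its input is observed or simulated. The granule class and the retrieval function are hypothetical stand-ins, not MODAPS code.

    ```python
    # Observed and simulated granules flow through the same retrieval interface.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class Level1BGranule:
        """Minimal stand-in for a calibrated-radiance granule."""
        radiance: np.ndarray  # shape (bands, along_track, across_track)
        source: str           # "observed" or "simulated"

    def retrieve_aerosol_optical_depth(granule: Level1BGranule) -> np.ndarray:
        """Placeholder retrieval: any algorithm written against the granule
        interface runs identically on observed and simulated input."""
        return granule.radiance.mean(axis=0) * 0.01  # not a real retrieval

    observed = Level1BGranule(np.random.default_rng(0).random((3, 8, 8)), "observed")
    simulated = Level1BGranule(np.random.default_rng(1).random((3, 8, 8)), "simulated")
    for granule in (observed, simulated):
        print(granule.source, retrieve_aerosol_optical_depth(granule).shape)
    ```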

  20. Multi-Sensor Cloud and Aerosol Retrieval Simulator and Remote Sensing from Model Parameters. Part 2; Aerosols

    NASA Technical Reports Server (NTRS)

    Wind, Galina; Da Silva, Arlindo M.; Norris, Peter M.; Platnick, Steven; Mattoo, Shana; Levy, Robert C.

    2016-01-01

    The Multi-sensor Cloud Retrieval Simulator (MCRS) produces a simulated radiance product from any high-resolution general circulation model with interactive aerosol as if a specific sensor such as the Moderate Resolution Imaging Spectroradiometer (MODIS) were viewing a combination of the atmospheric column and land-ocean surface at a specific location. Previously the MCRS code only included contributions from atmosphere and clouds in its radiance calculations and did not incorporate properties of aerosols. In this paper we added a new aerosol properties module to the MCRS code that allows users to insert a mixture of up to 15 different aerosol species in any of 36 vertical layers. This new MCRS code is now known as MCARS (Multi-sensor Cloud and Aerosol Retrieval Simulator). Inclusion of an aerosol module into MCARS not only allows for extensive, tightly controlled testing of various aspects of satellite operational cloud and aerosol properties retrieval algorithms, but also provides a platform for comparing cloud and aerosol models against satellite measurements. This kind of two-way platform can improve the efficacy of model parameterizations of measured satellite radiances, allowing the assessment of model skill consistently with the retrieval algorithm. The MCARS code provides dynamic controls for the appearance of cloud and aerosol layers, so that detailed quantitative studies of the impacts of various atmospheric components can be controlled. In this paper we illustrate the operation of MCARS by deriving simulated radiances from various data fields output by the Goddard Earth Observing System version 5 (GEOS-5) model. The model aerosol fields are prepared for translation to simulated radiance using the same model subgrid variability parameterizations as are used for cloud and atmospheric properties profiles, namely the ICA technique. After MCARS computes modeled sensor radiances equivalent to their observed counterparts, these radiances are presented as input to operational remote-sensing algorithms. Specifically, the MCARS-computed radiances are input into the processing chain used to produce the MODIS Data Collection 6 aerosol product (MOYD04). The MOYD04 product is of course normally produced from the MOYD021KM MODIS Level-1B radiance product directly acquired by the MODIS instrument. MCARS matches the format and metadata of a MOYD021KM product. The resulting MCARS output can be directly provided to MODAPS (MODIS Adaptive Processing System) as input to various operational atmospheric retrieval algorithms. Thus the operational algorithms can be tested directly without needing to make any software changes to accommodate an alternative input source. We show a direct application of this synthetic product in an analysis of the performance of the MOD04 operational algorithm. We use biomass-burning case studies over Amazonia employed in a recent Working Group on Numerical Experimentation (WGNE)-sponsored study of aerosol impacts on numerical weather prediction (Freitas et al., 2015). We demonstrate that a known low bias in retrieved MODIS aerosol optical depth appears to be due to a disconnect between the actual column relative humidity and the value assumed by the MODIS aerosol product.

  1. The Use of LANCE Imagery Products to Investigate Hazards and Disasters

    NASA Astrophysics Data System (ADS)

    Schmaltz, J. E.; Teague, M.; Conover, H.; Regner, K.; Masuoka, E.; Vollmer, B. E.; Durbin, P.; Murphy, K. J.; Boller, R. A.; Davies, D.; Ilavajhala, S.; Thompson, C. K.; Bingham, A.; Rao, S.

    2011-12-01

    The NASA/GSFC Land Atmospheres Near-real time Capability for EOS (LANCE) has endeavored to integrate a variety of products from the Terra, Aqua, and Aura missions to assist in meeting the needs of the applications user community. This community has a need for imagery products to support the investigation of a wide variety of phenomena, including hazards and disasters. The Eyjafjallajökull eruption, the tsunami and flooding in Japan, and the Gulf of Mexico oil spill are recent examples of applications benefiting from the timely and synoptic view afforded by LANCE data. Working with the instrument science teams and the applications community, LANCE has identified 14 applications categories and the LANCE products that will support their investigation. The categories are: Smoke Plumes, Ash Plumes, Dust Storms, Pollution, Severe Storms, Shipping Hazards, Fishery Hazards, Land Transportation, Fires, Floods, Drought, Vegetation, Agriculture, and Oil Spills. Forty products from AMSR-E, MODIS, AIRS, and OMI have been identified to support analyses and investigations of these phenomena. In each case multiple products from two or more instruments are available, which gives a more complete picture of the evolving hazard or disaster. All Level 2 (L2) products are available within 2.5 hours of observation at the spacecraft, and the daily L3 products are updated incrementally as new data become available. LANCE provides user access to imagery using two systems: a Web Mapping Service (WMS) and a Google Earth-based interface known as the State of the Earth (SOTE). The latter has resulted from a partnership between LANCE and the Physical Oceanography Distributed Active Archive Center (PO DAAC). When the user selects one of the 14 categories, the relevant products are established within the WMS (http://lance2.modaps.eosdis.nasa.gov/wms/). For each application, population density data are available for densities in excess of 100 people per square kilometer with user-defined opacity. These data are provided by the EOSDIS Socio-Economic Data and Applications Center (SEDAC). Certain users may not want to be constrained by the pre-defined categories and related products, and all 40 products may be added as potential overlays. The most recent 10 days of near-real-time data are available through the WMS. The SOTE provides an interface to the products grouped in the same fashion as the WMS. The SOTE servers stream imagery and data in the OGC KML format, and these feeds can be visualized through the Google Earth browser plug-in. SOTE provides visualization through a virtual globe environment by allowing users to interact with the globe via zooming, rotating, and tilting.

  2. An Integrated Approach for Accessing Multiple Datasets through LANCE

    NASA Astrophysics Data System (ADS)

    Murphy, K. J.; Teague, M.; Conover, H.; Regner, K.; Beaumont, B.; Masuoka, E.; Vollmer, B.; Theobald, M.; Durbin, P.; Michael, K.; Boller, R. A.; Schmaltz, J. E.; Davies, D.; Horricks, K.; Ilavajhala, S.; Thompson, C. K.; Bingham, A.

    2011-12-01

    The NASA/GSFC Land Atmospheres Near-real time Capability for EOS (LANCE) provides imagery for approximately 40 data products from MODIS, AIRS, AMSR-E, and OMI to support the applications community in the study of a variety of phenomena. Thirty-six of these products are available within 2.5 hours of observation at the spacecraft. The data set includes the population density data provided by the EOSDIS Socio-Economic Data and Applications Center (SEDAC). The purpose of this paper is to describe the variety of tools that have been developed by LANCE to support user access to the imagery. The long-standing Rapid Response system has been integrated into LANCE and is a major vehicle for the distribution of the imagery to end users. There are presently approximately 10,000 anonymous users per month accessing this imagery. The products are grouped into 14 applications categories such as Smoke Plumes, Pollution, Fires, and Agriculture, and the selection of any category will make relevant subsets of the 40 products available as possible overlays in an interactive Web Client utilizing a Web Mapping Service (WMS) to support user investigations (http://lance2.modaps.eosdis.nasa.gov/wms/). For example, selecting Severe Storms will include 6 products from MODIS, OMI, AIRS, and AMSR-E plus the SEDAC population density data. The client and WMS were developed using open-source technologies such as OpenLayers and MapServer and provide uniform, browser-based access to data products. All overlays are downloadable in PNG, JPEG, or GeoTIFF form up to 200 MB per request. The WMS was beta-tested with the user community and substantial performance improvements were made through the use of techniques such as tile caching. LANCE established a partnership with the Physical Oceanography Distributed Active Archive Center (PO DAAC) to develop an alternative presentation for the 40 data products known as the State of the Earth (SOTE). This provides a Google Earth-based interface to the products grouped in the same fashion as the WMS. The SOTE servers stream imagery and data in the OGC KML format, and these feeds can be visualized through the Google Earth browser plug-in. SOTE provides visualization through a virtual globe environment by allowing users to interact with the globe via zooming, rotating, and tilting. In addition, SOTE also allows adding custom KML feeds. LANCE also provides datacasting feeds to facilitate user access to imagery for the 40 products and the related HDF-EOS products (available in a variety of formats). These XML-based data feeds contain data attribute and geolocation information, and metadata including an identification of the related application category. Users can subscribe to any of the feeds through the LANCE web site and use the PO DAAC Feed Reader to filter and view the content. The WMS, SOTE, and datacasting tools can be accessed through http://lance.nasa.gov.
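
    The sketch below builds a GetMap request like the ones the browser client above issues; the endpoint URL is quoted from the abstract, while the layer name, image size, and styles are hypothetical. The parameter names follow the standard WMS 1.1.1 GetMap interface.

    ```python
    # Constructing a WMS 1.1.1 GetMap request URL for a LANCE-style layer.
    from urllib.parse import urlencode

    WMS_ENDPOINT = "http://lance2.modaps.eosdis.nasa.gov/wms/"  # from the abstract

    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "MODIS_Terra_CorrectedReflectance",  # hypothetical layer name
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": "-180,-90,180,90",
        "WIDTH": 1024,
        "HEIGHT": 512,
        "FORMAT": "image/png",
    }
    print(f"{WMS_ENDPOINT}?{urlencode(params)}")
    ```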

  3. Forest fires and PM10 pollution: the March 2012 case in Northern Spain

    NASA Astrophysics Data System (ADS)

    Rasilla Álvarez, Domingo; García Codron, Juan Carlos; Carracedo Martín, Virginia

    2016-04-01

    Forest fires are one of the largest sources of particulate matter, carbon monoxide, volatile organic compounds, and other pollutants at the regional scale. They significantly impact local air quality and human health, even far from their original sources. March 2012 saw one of the most active late winter and early spring fire seasons across northern Spain and Portugal. Official statistics from the Spanish and Portuguese authorities show that, during that month, approximately 35,000 ha were burned, making it the top March season in Cantabria (N. Spain) and the northern distritos of Portugal since 1981, with most fires occurring in the mountainous areas, as depicted in the FIRMS database (https://firms.modaps.eosdis.nasa.gov/). At the same time, an analysis of the pollution data (Airbase dataset; http://www.eea.europa.eu/) shows an increase in PM10 average values and exceedances of the limit values across the same area simultaneously with or immediately after the main fire activity episodes. A comprehensive analysis of this fire and pollution event was undertaken to analyze the possible contribution of forest fires and other sources of PM10 to the high levels of this pollutant at ground level. Besides statistics of fire activity, satellite "hot spots", and ground-level pollution data, we have included in our analysis meteorological records (synoptic stations, upper-air soundings), back-trajectories (http://ready.arl.noaa.gov/HYSPLIT.php), and dust forecast models (https://www.bsc.es/earth-sciences/mineral-dust/catalogo-datos-dust). The results show good agreement between the spatial and temporal variability of the levels of PM10 and the direction of the pollution plumes downwind of the forest fires. The fire activity was mostly concentrated in 3 events: the first from February 25th to March 3rd, the second spanning from the 10th to the 17th, and the last one, the most severe of the three, at the end of March. The climatological background was favourable, because most of the Iberian Peninsula recorded severe moisture deficits at the end of the winter, as shown by the drought indices. At the synoptic time scale, the episodes of generalized burning coincided with warmer and drier than usual conditions, although wind speed was low, in agreement with the prevailing stable atmosphere. Saharan dust advections seem to have made an indirect contribution to the high levels of PM10, probably by resuspension of old air masses. Moreover, the possible advection of old polluted layers from Eastern Europe, through a European blocking circulation (cut-off high), is also considered.

  4. Impact of Radio Frequency Identification (RFID) on the Marine Corps’ Supply Process

    DTIC Science & Technology

    2006-09-01

    Table of contents excerpt: "Hypothetical Improvement Using a Real-Time Order Processing System Vice a Batch Order Processing System"; "As-Is: The Current... Processing System"; "V. Results"; "A. Simulation... Time".

  5. [Dual process in large number estimation under uncertainty].

    PubMed

    Matsumuro, Miki; Miwa, Kazuhisa; Terai, Hitoshi; Yamada, Kento

    2016-08-01

    According to dual process theory, there are two systems in the mind: an intuitive and automatic System 1 and a logical and effortful System 2. While many previous studies about number estimation have focused on simple heuristics and automatic processes, the deliberative System 2 process has not been sufficiently studied. This study focused on the System 2 process for large number estimation. First, we described an estimation process based on participants’ verbal reports. The task, corresponding to the problem-solving process, consisted of creating subgoals, retrieving values, and applying operations. Second, we investigated the influence of such deliberative process by System 2 on intuitive estimation by System 1, using anchoring effects. The results of the experiment showed that the System 2 process could mitigate anchoring effects.

  6. System and method for deriving a process-based specification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  7. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an automated application processing and information retrieval system (APIRS), or the system, means a system of...

  8. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an automated application processing and information retrieval system (APIRS), or the system, means a system of...

  9. 45 CFR 205.35 - Mechanized claims processing and information retrieval systems; definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... claims processing and information retrieval systems; definitions. Section 205.35 through 205.38 contain...: (a) A mechanized claims processing and information retrieval system, hereafter referred to as an automated application processing and information retrieval system (APIRS), or the system, means a system of...

  10. 21 CFR 111.55 - What are the requirements to implement a production and process control system?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... production and process control system? 111.55 Section 111.55 Food and Drugs FOOD AND DRUG ADMINISTRATION... to Establish a Production and Process Control System § 111.55 What are the requirements to implement a production and process control system? You must implement a system of production and process...

  11. Intelligent process development of foam molding for the Thermal Protection System (TPS) of the space shuttle external tank

    NASA Technical Reports Server (NTRS)

    Bharwani, S. S.; Walls, J. T.; Jackson, M. E.

    1987-01-01

    A knowledge-based system to assist process engineers in evaluating the processability and moldability of poly-isocyanurate (PIR) formulations for the thermal protection system of the Space Shuttle external tank (ET) is discussed. The Reaction Injection Molding-Process Development Advisor (RIM-PDA) is a coupled system which takes advantage of both symbolic and numeric processing techniques. This system will aid the process engineer in identifying a startup set of mold schedules and in refining the mold schedules to remedy specific process problems diagnosed by the system.

  12. A Model of the Base Civil Engineering Work Request/Work Order Processing System.

    DTIC Science & Technology

    1979-09-01

    changes to the work order processing system. This research identifies the variables that significantly affect the accomplishment time and proposes a... order processing system and its behavior with respect to work order processing time. A conceptual model was developed to describe the work request...work order processing system as a stochastic queueing system in which the processing times and the various distributions are treated as random variables
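
    As an illustration of the queueing view described in this record, the following minimal Python sketch simulates a single work-order desk; the exponential arrival and service distributions (an M/M/1 assumption) and all rate values are illustrative choices, not taken from the report.

    # Minimal single-server queueing sketch of a work order processing system.
    # Assumption (not from the record): Poisson arrivals and exponential service
    # times (M/M/1); the record only says times are treated as random variables.
    import random

    def simulate_work_orders(arrival_rate=0.8, service_rate=1.0, n_orders=10_000, seed=1):
        random.seed(seed)
        clock = 0.0            # arrival clock
        server_free_at = 0.0   # time the single processing desk becomes free
        total_time_in_system = 0.0
        for _ in range(n_orders):
            clock += random.expovariate(arrival_rate)          # next work order arrives
            start = max(clock, server_free_at)                 # waits if the desk is busy
            finish = start + random.expovariate(service_rate)  # processing time
            server_free_at = finish
            total_time_in_system += finish - clock             # accomplishment time
        return total_time_in_system / n_orders

    if __name__ == "__main__":
        print(f"mean order accomplishment time: {simulate_work_orders():.2f} time units")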

  13. Defining and reconstructing clinical processes based on IHE and BPMN 2.0.

    PubMed

    Strasser, Melanie; Pfeifer, Franz; Helm, Emmanuel; Schuler, Andreas; Altmann, Josef

    2011-01-01

    This paper describes the current status and the results of our process management system for defining and reconstructing clinical care processes, which makes it possible to compare, analyze, and evaluate clinical processes and to identify high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At its heart is BPMN, a modeling notation and specification language that allows the definition and execution of clinical processes. The system provides functionality to define clinical core processes independently of the healthcare information system and to execute the processes in a workflow engine. Furthermore, the reconstruction of clinical processes is done by evaluating an IHE audit log database, which records patient movements within a healthcare facility. The main goal of the system is to assist hospital operators and clinical process managers in detecting discrepancies between defined and actual clinical processes and in identifying the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.
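
    The reconstruction step described above lends itself to a simple directly-follows analysis of an audit log. The sketch below is an illustrative Python example, not the paper's implementation; the event fields, the sample log, and the defined care process are invented.

    # Illustrative sketch: reconstruct a directly-follows graph from an IHE-style
    # audit log of patient movements and compare it with a defined care process.
    from collections import defaultdict

    audit_log = [  # (patient_id, timestamp, department) -- hypothetical records
        ("p1", 1, "admission"), ("p1", 2, "radiology"), ("p1", 3, "ward"), ("p1", 4, "discharge"),
        ("p2", 1, "admission"), ("p2", 2, "ward"), ("p2", 3, "radiology"), ("p2", 4, "discharge"),
    ]

    defined_process = [("admission", "radiology"), ("radiology", "ward"), ("ward", "discharge")]

    def directly_follows(log):
        """Count how often department B directly follows department A per patient."""
        traces = defaultdict(list)
        for patient, ts, dept in sorted(log, key=lambda e: (e[0], e[1])):
            traces[patient].append(dept)
        counts = defaultdict(int)
        for steps in traces.values():
            for a, b in zip(steps, steps[1:]):
                counts[(a, b)] += 1
        return counts

    observed = directly_follows(audit_log)
    deviations = [edge for edge in observed if edge not in defined_process]
    print("observed transitions:", dict(observed))
    print("transitions not in the defined process:", deviations)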

  14. Working on the Boundaries: Philosophies and Practices of the Design Process

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.

    1996-01-01

    While the systems engineering process is a formal, contractually binding program management technique, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are system and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, while formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting system and discipline analyses and integrations, and illustrates the application of the process in experienced aerostructural designs.

  15. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Basile, Lisa R.; Kelly, Angelita C.

    1987-01-01

    The Spacelab Data Processing Facility (SLDPF) is an integral part of the Space Shuttle data network for missions that involve attached scientific payloads. Expert system prototypes were developed to aid in performing the quality assurance function for processed Spacelab and/or Attached Shuttle Payloads telemetry data. The Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS), two expert systems, were developed to determine their feasibility and potential in the quality assurance of processed telemetry data. The capabilities and performance of these systems are discussed.

  16. 75 FR 71376 - Simplified Network Application Processing System, On-Line Registration and Account Maintenance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-23

    ...-02] RIN 0694-AE98 Simplified Network Application Processing System, On-Line Registration and Account...'') electronically via BIS's Simplified Network Application Processing (SNAP-R) system. Currently, parties must... Network Applications Processing System (SNAP-R) in October 2006. The SNAP-R system provides a Web based...

  17. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  18. Integrating system safety into the basic systems engineering process

    NASA Technical Reports Server (NTRS)

    Griswold, J. W.

    1971-01-01

    The basic elements of a systems engineering process are given along with a detailed description of what the safety system requires from the systems engineering process. Also discussed is the safety that the system provides to other subfunctions of systems engineering.

  19. AIRSAR Automated Web-based Data Processing and Distribution System

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
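
    A hedged sketch of the end-to-end flow named in this record (request intake, job processing, quality analysis, delivery, and archiving) is given below in Python; the function names, job dictionary, and placeholder QA check are assumptions for illustration, not the AIRSAR system's actual interfaces.

    # Illustrative end-to-end processing pipeline: accept a request, submit a job,
    # run quality analysis, then deliver and archive the product.
    from queue import Queue

    def process_request(request):
        return {"request": request, "product": f"SAR_image_for_{request['scene']}"}

    def quality_analysis(job):
        job["qa_passed"] = bool(job["product"])   # placeholder automated check
        return job

    def deliver_and_archive(job, archive):
        if job["qa_passed"]:
            archive.append(job["product"])
            return f"delivered {job['product']} to {job['request']['user']}"
        return "QA failed; operator review required"

    def run_pipeline(requests):
        archive, results = [], []
        pending = Queue()
        for r in requests:
            pending.put(r)                        # external users submit requests
        while not pending.empty():
            job = process_request(pending.get())  # submit processing job
            job = quality_analysis(job)           # automated QA step
            results.append(deliver_and_archive(job, archive))
        return results, archive

    print(run_pipeline([{"user": "alice", "scene": "42"}]))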

  20. Systemic safety project selection tool.

    DOT National Transportation Integrated Search

    2013-07-01

    "The Systemic Safety Project Selection Tool presents a process for incorporating systemic safety planning into traditional safety management processes. The Systemic Tool provides a step-by-step process for conducting systemic safety analysis; conside...

  1. Design of distributed systems of hydrolithosphere processes management. A synthesis of distributed management systems

    NASA Astrophysics Data System (ADS)

    Pershin, I. M.; Pervukhin, D. A.; Ilyushin, Y. V.; Afanaseva, O. V.

    2017-10-01

    The paper considers the important problem of designing distributed systems for managing hydrolithosphere processes. The control actions on the hydrolithosphere processes under consideration are implemented by a set of extraction wells. The article presents a method for defining approximation links that describe the dynamic characteristics of hydrolithosphere processes. The structure of the distributed regulators used in the management systems for the processes under consideration is presented. The paper analyses the results of synthesizing the distributed management system and of modelling the closed-loop control system with respect to the parameters of the hydrolithosphere process.

  2. On-Line Real-Time Management Information Systems and Their Impact Upon User Personnel and Organizational Structure in Aviation Maintenance Activities.

    DTIC Science & Technology

    1979-12-01

    the functional management level, a real-time production control system and an order processing system at the operational level. SIDMS was designed...at any one time. An overview of the major software systems in operation is listed below: a. Major Software Systems: Order processing system - Order ... processing for the supply support center/AWP locker. - Order processing for the airwing squadron material controls. - Order processing for the IMA

  3. Low level image processing techniques using the pipeline image processing engine in the flight telerobotic servicer

    NASA Technical Reports Server (NTRS)

    Nashman, Marilyn; Chaconas, Karen J.

    1988-01-01

    The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined, and in particular the image processing hardware and software used to extract features at low levels of sensory processing are described for tasks representative of those envisioned for the Space Station, such as assembly and maintenance.
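
    To make the three-way split concrete, the following simplified Python sketch runs one control cycle through sensory processing, world modeling, and task decomposition; the toy feature extractor, world-model dictionary, and action names are illustrative assumptions, not the NASREM implementation.

    # Simplified sketch of one control cycle through the three parallel subsystems.
    def sensory_processing(image_row):
        """Low-level feature extraction: report indices of 'edge' pixels."""
        return [i for i, (a, b) in enumerate(zip(image_row, image_row[1:])) if abs(a - b) > 10]

    def world_modeling(world, features):
        """Update the internal model with the latest extracted features."""
        world = dict(world)
        world["edges"] = features
        return world

    def task_decomposition(world, goal):
        """Turn a high-level goal into primitive actions using the world model."""
        if goal == "grasp" and world.get("edges"):
            return [f"move_to_pixel_{world['edges'][0]}", "close_gripper"]
        return ["search"]

    world_model = {}
    features = sensory_processing([0, 0, 50, 50, 0])
    world_model = world_modeling(world_model, features)
    print(task_decomposition(world_model, "grasp"))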

  4. An Internationally Consented Standard for Nursing Process-Clinical Decision Support Systems in Electronic Health Records.

    PubMed

    Müller-Staub, Maria; de Graaf-Waar, Helen; Paans, Wolter

    2016-11-01

    Nurses are accountable for applying the nursing process, which is key for patient care: It is a problem-solving process providing the structure for care plans and documentation. The state-of-the-art nursing process is based on classifications that contain standardized concepts, and therefore, it is named Advanced Nursing Process. It contains valid assessments, nursing diagnoses, interventions, and nursing-sensitive patient outcomes. Electronic decision support systems can assist nurses to apply the Advanced Nursing Process. However, nursing decision support systems are missing, and no "gold standard" is available. The aim of this study is to develop a valid Nursing Process-Clinical Decision Support System Standard to guide future developments of clinical decision support systems. In a multistep approach, a Nursing Process-Clinical Decision Support System Standard with 28 criteria was developed. After pilot testing (N = 29 nurses), the criteria were reduced to 25. The Nursing Process-Clinical Decision Support System Standard was then presented to eight internationally known experts, who performed qualitative interviews according to Mayring. Fourteen categories demonstrate expert consensus on the Nursing Process-Clinical Decision Support System Standard and its content validity. All experts agreed the Advanced Nursing Process should be the centerpiece for the Nursing Process-Clinical Decision Support System and should suggest research-based, predefined nursing diagnoses and correct linkages between diagnoses, evidence-based interventions, and patient outcomes.

  5. [Modality specific systems of representation and processing of information. Superfluous images, useful representations, necessary evil or inevitable consequences of optimal stimulus processing].

    PubMed

    Zimmer, H D

    1993-01-01

    The paper discusses what underlies the assumption of modality-specific processing systems and representations. Starting from the information processing approach, relevant aspects of mental representations and their physiological realizations are discussed. Three different forms of modality-specific systems are then distinguished: as stimulus-specific processing, as specific informational formats, and as modular subsystems. In parallel, three kinds of analogue systems are differentiated: as holding an analogue relation, as having a specific informational format, and as a set of specific processing constraints. These different aspects of the assumption of modality-specific systems are demonstrated using the example of visual and spatial information processing. It is concluded that postulating information-specific systems is not a superfluous assumption but a necessary one, and more likely still an inevitable consequence of optimizing stimulus processing.

  6. On the Risk Management and Auditing of SOA Based Business Processes

    NASA Astrophysics Data System (ADS)

    Orriens, Bart; Heuvel, Willem-Jan V./D.; Papazoglou, Mike

    SOA-enabled business processes stretch across many cooperating and coordinated systems, possibly crossing organizational boundaries, and technologies like XML and Web services are used to make system-to-system interactions commonplace. Business processes form the foundation for all organizations and, as such, are impacted by industry regulations. This requires organizations to review their business processes and ensure that they meet the compliance standards set forth in legislation. In this paper we sketch a SOA-based service risk management and auditing methodology, including a compliance enforcement and verification system, that assures verifiable business process compliance. This is done on the basis of a knowledge-based system that allows internal control systems to be integrated into business processes in conformance with pre-defined compliance rules, monitors both the normal process behavior and that of the control systems during process execution, and logs these behaviors to facilitate retrospective auditing.

  7. Containerless automated processing of intermetallic compounds and composites

    NASA Technical Reports Server (NTRS)

    Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.

    1993-01-01

    An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The use of image processing for automated control negates the need for temperature measurements in process control. Recently processed systems include aluminides containing Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr. Possible uses of the system, process control approaches, and the properties and structures of recently processed intermetallics are reviewed.

  8. Apparatus and Method for Assessing Vestibulo-Ocular Function

    NASA Technical Reports Server (NTRS)

    Shelhamer, Mark J. (Inventor)

    2015-01-01

    A system for assessing vestibulo-ocular function includes a motion sensor system adapted to be coupled to a user's head; a data processing system configured to communicate with the motion sensor system to receive the head-motion signals; a visual display system configured to communicate with the data processing system to receive image signals from the data processing system; and a gain control device arranged to be operated by the user and to communicate gain adjustment signals to the data processing system.
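
    A minimal sketch of the signal path described in the patent abstract is shown below, assuming a simple model in which head velocity from the motion sensor is integrated and the displayed image is shifted by a user-adjustable gain; all numbers and the model itself are illustrative.

    # Illustrative signal path: head-motion samples, a user-set gain, and the
    # resulting image displacement sent to the display.
    def image_offsets(head_velocities_deg_s, user_gain, dt=0.01):
        """Integrate head velocity and shift the displayed image by -gain * angle."""
        angle = 0.0
        offsets = []
        for v in head_velocities_deg_s:         # samples from the head-mounted motion sensor
            angle += v * dt                     # head orientation (degrees)
            offsets.append(-user_gain * angle)  # image displacement commanded to the display
        return offsets

    head_motion = [10.0] * 100                  # steady 10 deg/s head turn for 1 s
    for gain in (0.8, 1.0, 1.2):                # user operates the gain control device
        final = image_offsets(head_motion, gain)[-1]
        print(f"gain {gain}: image displaced {final:.1f} deg after a 10 deg head turn")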

  9. Using ERP and WfM Systems for Implementing Business Processes: An Empirical Study

    NASA Astrophysics Data System (ADS)

    Aversano, Lerina; Tortorella, Maria

    The software systems most often considered by enterprises for business process automation fall into two categories: Workflow Management Systems (WfMS) and Enterprise Resource Planning (ERP) systems. The wider diffusion of ERP systems tends to favour this solution, but most ERP systems have several limitations for automating business processes. This paper reports an empirical study comparing the ability of ERP systems and WfMSs to implement business processes. Two different case studies were considered in the empirical study. It evaluates and analyses the correctness and completeness of the process models implemented using ERP and WfM systems.

  10. Image data processing system requirements study. Volume 1: Analysis. [for Earth Resources Survey Program

    NASA Technical Reports Server (NTRS)

    Honikman, T.; Mcmahon, E.; Miller, E.; Pietrzak, L.; Yorsz, W.

    1973-01-01

    Digital image processing, image recorders, high-density digital data recorders, and data system element processing for use in an Earth Resources Survey image data processing system are studied. Loading to various ERS systems is also estimated by simulation.

  11. Thermal Storage Process and Components Laboratory | Energy Systems

    Science.gov Websites

    The Energy Systems Integration Facility's Thermal Systems Process and Components Laboratory supports research and development, testing, and evaluation of new thermal energy storage systems.

  12. On Cognition, Structured Sequence Processing, and Adaptive Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Petersson, Karl Magnus

    2008-11-01

    Cognitive neuroscience approaches the brain as a cognitive system: a system that is functionally conceptualized in terms of information processing. We outline some aspects of this concept and consider a physical system to be an information processing device when a subclass of its physical states can be viewed as representational/cognitive and transitions between these can be conceptualized as a process operating on these states by implementing operations on the corresponding representational structures. We identify a generic and fundamental problem in cognition: sequentially organized structured processing. Structured sequence processing provides the brain, in an essential sense, with its processing logic. In an approach addressing this problem, we illustrate how to integrate levels of analysis within a framework of adaptive dynamical systems. We note that the dynamical system framework lends itself to a description of asynchronous event-driven devices, which is likely to be important in cognition because the brain appears to be an asynchronous processing system. We use the human language faculty and natural language processing as a concrete example throughout.

  13. Automated method for the systematic interpretation of resonance peaks in spectrum data

    DOEpatents

    Damiano, B.; Wood, R.T.

    1997-04-22

    A method is described for spectral signature interpretation. The method includes the creation of a mathematical model of a system or process. A neural network training set is then developed based upon the mathematical model. The neural network training set is developed by using the mathematical model to generate measurable phenomena of the system or process based upon model input parameters that correspond to the physical condition of the system or process. The neural network training set is then used to adjust internal parameters of a neural network. The physical condition of an actual system or process represented by the mathematical model is then monitored by extracting spectral features from measured spectra of the actual process or system. The spectral features are then input into said neural network to determine the physical condition of the system or process represented by the mathematical model. More specifically, the neural network correlates the spectral features (i.e. measurable phenomena) of the actual process or system with the corresponding model input parameters. The model input parameters relate to specific components of the system or process, and, consequently, correspond to the physical condition of the process or system. 1 fig.
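
    The training-set idea can be illustrated with a small Python sketch, shown below, in which a toy resonance model generates spectra for known parameters, a peak-frequency feature is extracted, and a least-squares fit (standing in for the neural network) recovers the parameter from a noisy "measured" spectrum; the Lorentzian model and the feature choice are assumptions for illustration, not the patented method.

    # Hedged sketch: model-generated spectra form the training set, spectral
    # features are extracted, and a fitted mapping recovers the parameter.
    import numpy as np

    freqs = np.linspace(0.0, 100.0, 1001)

    def model_spectrum(stiffness):
        """Toy model: resonance peak whose frequency scales with a physical parameter."""
        peak_freq = 10.0 * np.sqrt(stiffness)
        return 1.0 / (1.0 + ((freqs - peak_freq) / 2.0) ** 2)   # Lorentzian peak

    def spectral_feature(spectrum):
        """Feature extraction: frequency of the dominant resonance peak."""
        return freqs[np.argmax(spectrum)]

    # Build the training set from the model (parameter -> feature).
    params = np.linspace(1.0, 25.0, 50)
    features = np.array([spectral_feature(model_spectrum(p)) for p in params])

    # Fit feature -> parameter (a quadratic least-squares fit replaces the neural network).
    coeffs = np.polyfit(features, params, deg=2)

    # "Measured" spectrum from the actual system, with a little noise.
    measured = model_spectrum(9.0) + 0.01 * np.random.default_rng(0).standard_normal(freqs.size)
    estimate = np.polyval(coeffs, spectral_feature(measured))
    print(f"estimated parameter: {estimate:.2f} (true value 9.0)")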

  14. Automated method for the systematic interpretation of resonance peaks in spectrum data

    DOEpatents

    Damiano, Brian; Wood, Richard T.

    1997-01-01

    A method for spectral signature interpretation. The method includes the creation of a mathematical model of a system or process. A neural network training set is then developed based upon the mathematical model. The neural network training set is developed by using the mathematical model to generate measurable phenomena of the system or process based upon model input parameters that correspond to the physical condition of the system or process. The neural network training set is then used to adjust internal parameters of a neural network. The physical condition of an actual system or process represented by the mathematical model is then monitored by extracting spectral features from measured spectra of the actual process or system. The spectral features are then input into said neural network to determine the physical condition of the system or process represented by the mathematical model. More specifically, the neural network correlates the spectral features (i.e. measurable phenomena) of the actual process or system with the corresponding model input parameters. The model input parameters relate to specific components of the system or process, and, consequently, correspond to the physical condition of the process or system.

  15. IDAPS (Image Data Automated Processing System) System Description

    DTIC Science & Technology

    1988-06-24

    This document describes the physical configuration and components used in the image processing system referred to as IDAPS (Image Data Automated ... Processing System). This system was developed by the Environmental Research Institute of Michigan (ERIM) for Eglin Air Force Base. The system is designed

  16. 42 CFR 431.834 - Access to records: Claims processing assessment systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.834 Access to records: Claims processing assessment systems. The agency, upon written request, must provide HHS staff with access to all records pertaining to its MQC claims processing assessment system reviews...

  17. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    NASA Astrophysics Data System (ADS)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

    Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This is a research and development study. The main steps of the study are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing the feasibility of the spreadsheet-based integrated transaction processing systems and financial reporting systems. Technical feasibility covers the ability of the hardware and operating systems to run the accounting application, together with its simplicity and ease of use. Operational feasibility covers the ability of users to use the accounting application, the ability of the application to produce information, and the control features of the application. The instrument used to assess the technical and operational feasibility of the systems is an expert perception questionnaire. The instrument uses a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the number of answers within one item with the ideal number of answers within that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing systems to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from the technical aspect (87.50%) and the operational aspect (84.17%).
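
    The percentage analysis described above amounts to a simple ratio of obtained to ideal scores; the short Python example below shows the arithmetic with hypothetical ratings (the actual questionnaire data are not reproduced in this record).

    # Worked example of the percentage analysis with made-up expert ratings:
    # each item is rated 1-4, the ideal answer per item is 4, and feasibility is
    # the ratio of obtained to ideal scores.
    technical_scores = [4, 3, 4, 3, 4]                  # hypothetical ratings for technical items
    ideal_per_item = 4
    percentage = 100 * sum(technical_scores) / (ideal_per_item * len(technical_scores))
    print(f"technical feasibility: {percentage:.2f}%")  # 90.00% for these sample ratings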

  18. User's manual for the National Water Information System of the U.S. Geological Survey: Automated Data Processing System (ADAPS)

    USGS Publications Warehouse

    ,

    2003-01-01

    The Automated Data Processing System (ADAPS) was developed for the processing, storage, and retrieval of water data, and is part of the National Water Information System (NWIS) developed by the U.S. Geological Survey. NWIS is a distributed water database in which data can be processed over a network of computers at U.S. Geological Survey offices throughout the United States. NWIS comprises four subsystems: ADAPS, the Ground-Water Site Inventory System (GWSI), the Water-Quality System (QWDATA), and the Site-Specific Water-Use Data System (SWUDS). This section of the NWIS User's Manual describes the automated data processing of continuously recorded water data, which primarily are surface-water data; however, the system also allows for the processing of water-quality and ground-water data. This manual describes various components and features of the ADAPS, and provides an overview of the data processing system and a description of the system framework. The components and features included are: (1) data collection and processing, (2) ADAPS menus and programs, (3) command line functions, (4) steps for processing station records, (5) postprocessor programs control files, (6) the standard format for transferring and entering unit and daily values, and (7) relational database (RDB) formats.

  19. Overcoming Intuition: Metacognitive Difficulty Activates Analytic Reasoning

    ERIC Educational Resources Information Center

    Alter, Adam L.; Oppenheimer, Daniel M.; Epley, Nicholas; Eyre, Rebecca N.

    2007-01-01

    Humans appear to reason using two processing styles: System 1 processes that are quick, intuitive, and effortless, and System 2 processes that are slow, analytical, and deliberate and that occasionally correct the output of System 1. Four experiments suggest that System 2 processes are activated by metacognitive experiences of difficulty or disfluency…

  20. Application of the informational reference system OZhUR to the automated processing of data from satellites of the Kosmos series

    NASA Technical Reports Server (NTRS)

    Pokras, V. M.; Yevdokimov, V. P.; Maslov, V. D.

    1978-01-01

    The structure and potential of the information reference system OZhUR, designed for the automated data processing systems of scientific space vehicles (SV), are considered. The OZhUR system ensures control of the extraction phase of processing with respect to a concrete SV and the exchange of data between phases. The practical application of the OZhUR system is exemplified by the construction of a data processing system for satellites of the Cosmos series. As a result of automating the operations of exchange and control, the volume of manual data preparation is significantly reduced, and there is no longer any need for individual logs recording the status of data processing. The OZhUR system is included in the automated data processing system Nauka, which is implemented in the PL-1 language on an electronic computer running under BOS OS.

  1. Design of penicillin fermentation process simulation system

    NASA Astrophysics Data System (ADS)

    Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi

    2011-10-01

    Real-time monitoring of batch processes attracts increasing attention. It can ensure safety and provide products of consistent quality. The design of a simulation system for batch process fault diagnosis is therefore of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual, human-machine interactive simulation software system based on the Windows operating system is developed. The simulation system can provide an effective platform for research on batch process fault diagnosis.

  2. Programmable partitioning for high-performance coherence domains in a multiprocessor system

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Salapura, Valentina [Chappaqua, NY

    2011-01-25

    A multiprocessor computing system and a method of logically partitioning a multiprocessor computing system are disclosed. The multiprocessor computing system comprises a multitude of processing units, and a multitude of snoop units. Each of the processing units includes a local cache, and the snoop units are provided for supporting cache coherency in the multiprocessor system. Each of the snoop units is connected to a respective one of the processing units and to all of the other snoop units. The multiprocessor computing system further includes a partitioning system for using the snoop units to partition the multitude of processing units into a plurality of independent, memory-consistent, adjustable-size processing groups. Preferably, when the processor units are partitioned into these processing groups, the partitioning system also configures the snoop units to maintain cache coherency within each of said groups.
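
    The partitioning idea can be sketched conceptually as snoop units that forward invalidations only within their configured group; the Python classes and method names below are illustrative stand-ins, not the patented hardware design.

    # Conceptual sketch: each processing unit has a snoop unit, and snoop units
    # forward cache invalidations only to members of their configured group,
    # keeping coherence local to a partition.
    class SnoopUnit:
        def __init__(self, unit_id):
            self.unit_id = unit_id
            self.group = {unit_id}          # coherence domain this unit belongs to
            self.local_cache = {}

        def configure_group(self, members):
            self.group = set(members)

        def broadcast_invalidate(self, all_units, address):
            """Invalidate an address, but only within this unit's coherence group."""
            for unit in all_units:
                if unit.unit_id in self.group and unit.unit_id != self.unit_id:
                    unit.local_cache.pop(address, None)

    units = [SnoopUnit(i) for i in range(4)]
    for u in units:
        u.local_cache[0x100] = "stale"
    for u in units[:2]:
        u.configure_group({0, 1})           # partition {0, 1}; units 2 and 3 stay separate
    units[0].broadcast_invalidate(units, 0x100)
    print([0x100 in u.local_cache for u in units])   # [True, False, True, True]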

  3. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  4. An intelligent factory-wide optimal operation system for continuous production process

    NASA Astrophysics Data System (ADS)

    Ding, Jinliang; Chai, Tianyou; Wang, Hongfeng; Wang, Junwei; Zheng, Xiuping

    2016-03-01

    In this study, a novel intelligent factory-wide operation system for a continuous production process is designed to optimise the entire production process, which consists of multiple units; furthermore, this system is developed using process operational data to avoid the complexity of mathematical modelling of the continuous production process. The data-driven approach aims to specify the structure of the optimal operation system; in particular, the operational data of the process are used to formulate each part of the system. In this context, the domain knowledge of process engineers is utilised, and a closed-loop dynamic optimisation strategy, which combines feedback, performance prediction, feed-forward, and dynamic tuning schemes into a framework, is employed. The effectiveness of the proposed system has been verified using industrial experimental results.

  5. Using a Radiofrequency Identification System for Improving the Patient Discharge Process: A Simulation Study.

    PubMed

    Shim, Sung J; Kumar, Arun; Jiao, Roger

    2016-01-01

    A hospital is considering deploying a radiofrequency identification (RFID) system and setting up a new "discharge lounge" to improve the patient discharge process. This study uses computer simulation to model and compare the current process and the new process, and it assesses the impact of the RFID system and the discharge lounge on the process in terms of resource utilization and time taken in the process. The simulation results regarding resource utilization suggest that the RFID system can slightly relieve the burden on all resources, whereas the RFID system and the discharge lounge together can significantly mitigate the nurses' tasks. The simulation results in terms of the time taken demonstrate that the RFID system can shorten patient wait times, staff busy times, and bed occupation times. The results of the study could prove helpful to others who are considering the use of an RFID system in the patient discharge process in hospitals or similar processes.

  6. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    PubMed

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system, viz. the Advanced Microscale Bioreactor (ambr15(TM)), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.

  7. 21 CFR 864.9900 - Cord blood processing system and storage container.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Cord blood processing system and storage container... Manufacture Human Cells, Tissues, and Cellular and Tissue-Based Products (HCT/Ps) § 864.9900 Cord blood processing system and storage container. (a) Identification. A cord blood processing system and storage...

  8. 21 CFR 864.9900 - Cord blood processing system and storage container.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Cord blood processing system and storage container... Manufacture Human Cells, Tissues, and Cellular and Tissue-Based Products (HCT/Ps) § 864.9900 Cord blood processing system and storage container. (a) Identification. A cord blood processing system and storage...

  9. 21 CFR 864.9900 - Cord blood processing system and storage container.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Cord blood processing system and storage container... Manufacture Human Cells, Tissues, and Cellular and Tissue-Based Products (HCT/Ps) § 864.9900 Cord blood processing system and storage container. (a) Identification. A cord blood processing system and storage...

  10. 21 CFR 864.9900 - Cord blood processing system and storage container.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Cord blood processing system and storage container... Manufacture Human Cells, Tissues, and Cellular and Tissue-Based Products (HCT/Ps) § 864.9900 Cord blood processing system and storage container. (a) Identification. A cord blood processing system and storage...

  11. 21 CFR 864.9900 - Cord blood processing system and storage container.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Cord blood processing system and storage container... Manufacture Human Cells, Tissues, and Cellular and Tissue-Based Products (HCT/Ps) § 864.9900 Cord blood processing system and storage container. (a) Identification. A cord blood processing system and storage...

  12. Development of integrated control system for smart factory in the injection molding process

    NASA Astrophysics Data System (ADS)

    Chung, M. J.; Kim, C. Y.

    2018-03-01

    In this study, we propose an integrated control system for automating the injection molding process, as required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory can be kept low.
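
    The cycle listed in this record can be sketched as a simple state machine driven by the integrated controller, as in the Python example below; the logging hook and take-out callback are hypothetical placeholders.

    # Simple sketch of the injection molding cycle as a controller-driven state machine.
    CYCLE = ["heating", "tool close", "injection", "cooling", "tool open", "take-out"]

    def run_cycle(log_process_data, take_out_robot):
        for step in CYCLE:
            log_process_data(step)                 # process data acquisition interface
            if step == "take-out":
                take_out_robot()                   # take-out robot controller
        return "cycle complete"

    records = []
    print(run_cycle(records.append, lambda: records.append("robot: part removed")))
    print(records)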

  13. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
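
    A hedged sketch of the file-to-object conversion described in the patent is given below in Python; the CloudStore class is a stand-in for a real object-storage client, and the chunking and key-naming scheme are illustrative choices rather than PLFS behavior.

    # Illustrative middleware flow: gather checkpoint files from the parallel job,
    # convert each to named objects, and hand the objects to a cloud object store.
    import os

    class CloudStore:                       # placeholder for an S3-like client
        def __init__(self):
            self.objects = {}
        def put_object(self, key, data):
            self.objects[key] = data

    def files_to_objects(checkpoint_paths, job_id, store, chunk_size=4 * 1024 * 1024):
        """Convert each checkpoint file into one or more cloud objects."""
        for path in checkpoint_paths:
            with open(path, "rb") as f:
                part = 0
                while chunk := f.read(chunk_size):
                    key = f"{job_id}/{os.path.basename(path)}/part-{part:05d}"
                    store.put_object(key, chunk)    # object written to the cloud store
                    part += 1

    store = CloudStore()
    # Example usage (hypothetical paths):
    # files_to_objects(["/scratch/ckpt.0", "/scratch/ckpt.1"], "job42", store)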

  14. Enhancement of the Acquisition Process for a Combat System-A Case Study to Model the Workflow Processes for an Air Defense System Acquisition

    DTIC Science & Technology

    2009-12-01

    Business Process Modeling BPMN Business Process Modeling Notation SoA Service-oriented Architecture UML Unified Modeling Language CSP...system developers. Supporting technologies include Business Process Modeling Notation ( BPMN ), Unified Modeling Language (UML), model-driven architecture

  15. Problems of Automation and Management Principles Information Flow in Manufacturing

    NASA Astrophysics Data System (ADS)

    Grigoryuk, E. N.; Bulkin, V. V.

    2017-07-01

    Automated process control systems are complex systems characterized by elements with a common purpose, the systemic nature of the algorithms implemented for the exchange and processing of information, and a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems and draws parallels between them by identifying their strengths and weaknesses. A non-standard process control system is also proposed.

  16. A Recommended Framework for the Network-Centric Acquisition Process

    DTIC Science & Technology

    2009-09-01

    ISO/IEC 12207, Systems and Software Engineering-Software Life-Cycle Processes; ANSI/EIA 632, Processes for Engineering a System. There are...engineering [46]. Some of the process models presented in the DAG are: ISO/IEC 15288, Systems and Software Engineering-System Life-Cycle Processes...(e.g., ISO, IA, Security, etc.). Vetting developers helps ensure that they are using industry best practices and maximize the IA compliance

  17. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

    A data acquisition and signal processing system is being developed for a 2-micron airborne wind-profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a compact PCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates data acquisition and processing, and displays selected wind profile parameters such as Doppler shift, power distribution, and wind directions and velocities. The Doppler shift created by aircraft motion is derived from an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from the wind measurements. A general overview of the system and the DAPS, as well as the coherent Doppler lidar system, is presented in this paper.
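
    The aircraft-motion correction mentioned above can be illustrated with the standard coherent-lidar relation f = 2v/lambda: the aircraft's line-of-sight velocity, computed from INS/GPS data and the beam geometry, is subtracted from the total line-of-sight velocity implied by the measured Doppler shift. The Python sketch below uses made-up numbers and a simplified east-north-up geometry and sign convention; it is not the DAPS implementation.

    # Sketch of removing the aircraft-induced Doppler component from a lidar measurement.
    import math

    WAVELENGTH_M = 2.05e-6                      # 2-micron lidar wavelength

    def los_unit_vector(azimuth_deg, elevation_deg):
        """Beam direction in east-north-up coordinates (azimuth measured from north)."""
        az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
        return (math.cos(el) * math.sin(az), math.cos(el) * math.cos(az), math.sin(el))

    def wind_los_velocity(measured_doppler_hz, aircraft_velocity_ms, azimuth_deg, elevation_deg):
        """Return the wind's line-of-sight velocity with aircraft motion removed."""
        los = los_unit_vector(azimuth_deg, elevation_deg)
        aircraft_los = sum(v * u for v, u in zip(aircraft_velocity_ms, los))
        total_los = measured_doppler_hz * WAVELENGTH_M / 2.0   # f = 2 v / lambda
        return total_los - aircraft_los

    # Aircraft flying north at 200 m/s, beam pointed forward and 60 deg below horizontal.
    print(wind_los_velocity(measured_doppler_hz=1.024e8,
                            aircraft_velocity_ms=(0.0, 200.0, 0.0),
                            azimuth_deg=0.0, elevation_deg=-60.0))
    # ~5 m/s of wind remains after removing ~100 m/s of aircraft line-of-sight motion.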

  18. JPRS Report, Science & Technology, Japan, High Temperature Materials

    DTIC Science & Technology

    1990-11-09

    This restriction is heavy. The inconvenience that the material powder of the imido thermal composition method, for example, which shows the best...procedures, system composition, features of formability, and forming characteristic of forming samples using alumina material will be made. 2...Osmotic V Process Forming System 2.1 System Composition of Process A system block diagram of the process is shown in Figure 1. The V process forming system

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.

    This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.

  20. Instrumentation for optimizing an underground coal-gasification process

    NASA Astrophysics Data System (ADS)

    Seabaugh, W.; Zielinski, R. E.

    1982-06-01

    While the United States has a coal resource base of 6.4 trillion tons, only seven percent is presently recoverable by mining. The process of in-situ gasification can recover another twenty-eight percent of this vast resource; however, viable technology must be developed for effective in-situ recovery. The key to this technology is a system that can optimize and control the process in real time. An instrumentation system is described that optimizes the composition of the injection gas, controls the in-situ process, and conditions the product gas for maximum utilization. The key elements of this system are Monsanto PRISM systems, a real-time analytical system, and a real-time data acquisition and control system. This system provides for complete automation of the process but can easily be overridden by manual control. The use of this cost-effective system can provide process optimization and is an effective element in developing a viable in-situ technology.

  1. On the use of multi-agent systems for the monitoring of industrial systems

    NASA Astrophysics Data System (ADS)

    Rezki, Nafissa; Kazar, Okba; Mouss, Leila Hayet; Kahloul, Laid; Rezki, Djamil

    2016-03-01

    The objective of the current paper is to present an intelligent system for complex process monitoring, based on artificial intelligence technologies. This system aims to carry out all of the complex process monitoring tasks: detection, diagnosis, identification, and reconfiguration. For this purpose, the development of a multi-agent system that combines multiple intelligences, such as multivariate control charts, neural networks, Bayesian networks, and expert systems, has become a necessity. The proposed system is evaluated on the monitoring of the Tennessee Eastman process, a complex benchmark process.

  2. Systems and Methods for Radar Data Communication

    NASA Technical Reports Server (NTRS)

    Bunch, Brian (Inventor); Szeto, Roland (Inventor); Miller, Brad (Inventor)

    2013-01-01

    A radar information processing system is operable to process high bandwidth radar information received from a radar system into low bandwidth radar information that may be communicated to a low bandwidth connection coupled to an electronic flight bag (EFB). An exemplary embodiment receives radar information from a radar system, the radar information communicated from the radar system at a first bandwidth; processes the received radar information into processed radar information, the processed radar information configured for communication over a connection operable at a second bandwidth, the second bandwidth lower than the first bandwidth; and communicates the processed radar information over the connection operable at the second bandwidth.
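
    One simple way to realize the bandwidth reduction described in the claim is to average blocks of high-rate samples before sending them over the slow link; the patent does not specify this particular scheme, so the Python sketch below, with its made-up data and reduction factor, is purely illustrative.

    # Illustrative bandwidth reduction: block-average a high-rate sample stream.
    def reduce_bandwidth(samples, factor):
        """Average every `factor` consecutive samples into one output value."""
        return [sum(samples[i:i + factor]) / len(samples[i:i + factor])
                for i in range(0, len(samples), factor)]

    high_rate = [float(i % 16) for i in range(1024)]   # pretend reflectivity samples
    low_rate = reduce_bandwidth(high_rate, factor=64)  # 64x fewer values per update
    print(len(high_rate), "->", len(low_rate), "samples per update")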

  3. Development of the Diagnostic Expert System for Tea Processing

    NASA Astrophysics Data System (ADS)

    Yoshitomi, Hitoshi; Yamaguchi, Yuichi

    A diagnostic expert system for tea processing, which can infer the cause of defects in processed tea, was developed to contribute to the improvement of tea processing. The system, which consists of several programs, can be used through the Internet. The inference engine, the core of the system, adopts the production system approach widely used in artificial intelligence and is coded in Prolog, an artificial-intelligence-oriented language. At present, 176 rules for inference have been registered in the system. The system will be able to make better inferences as more rules are added.
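
    The original system is written in Prolog; the short Python sketch below merely illustrates the production-system idea, with two invented rules that match observed facts about the processed tea and assert a presumed cause of the defect.

    # Tiny forward-chaining production-system stand-in; rules and facts are invented.
    RULES = [
        ({"burnt smell", "high roasting temperature"}, "cause: over-firing during roasting"),
        ({"reddish leaf color", "long withering time"}, "cause: excessive withering"),
    ]

    def forward_chain(facts):
        """Fire every rule whose conditions are all present in the observed facts."""
        conclusions = set()
        for conditions, conclusion in RULES:
            if conditions <= facts:
                conclusions.add(conclusion)
        return conclusions or {"no matching rule; add more rules to the knowledge base"}

    print(forward_chain({"burnt smell", "high roasting temperature", "dull luster"}))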

  4. Process Based on SysML for New Launchers System and Software Developments

    NASA Astrophysics Data System (ADS)

    Hiron, Emmanuel; Miramont, Philippe

    2010-08-01

    The purpose of this paper is to present the Astrium-ST engineering process based on SysML. This process is currently being set up in the frame of joint CNES/Astrium-ST R&T studies related to the Ariane 5 electrical system and flight software modelling. The tool used to set up this process is Rhapsody release 7.3 from IBM Software [1]. The process focuses on the system engineering phase dedicated to software, with the objective of generating both system documents (sequential system design and flight control) and software specifications.

  5. Process control integration requirements for advanced life support systems applicable to manned space missions

    NASA Technical Reports Server (NTRS)

    Spurlock, Paul; Spurlock, Jack M.; Evanich, Peggy L.

    1991-01-01

    An overview of recent developments in process-control technology which might have applications in future advanced life support systems for long-duration space operations is presented. Consideration is given to design criteria related to control system selection and optimization, and process-control interfacing methodology. Attention is also given to current life support system process control strategies, innovative sensors, instrumentation and control, and innovations in process supervision.

  6. Aircraft Alerting Systems Standardization Study. Phase IV. Accident Implications on Systems Design.

    DTIC Science & Technology

    1982-06-01

    computing and processing to assimilate and process status information using...provided with capabilities in computing and processing, sensing, interfacing, and controlling and displaying. o Computing and Processing - Algorithms...alerting system to perform a flight status monitor function would require additional sensing, computing and processing, interfacing, and controlling

  7. Passive serialization in a multitasking environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hennessey, J.P.; Osisek, D.L.; Seigh, J.W. II

    1989-02-28

    In a multiprocessing system having a control program in which data objects are shared among processes, this patent describes a method for serializing references to a data object by the processes so as to prevent invalid references to the data object by any process when an operation requiring exclusive access is performed by another process, comprising the steps of: permitting the processes to reference data objects on a shared access basis without obtaining a shared lock; monitoring a point of execution of the control program which is common to all processes in the system, which occurs regularly in the process's execution and across which no references to any data object can be maintained by any process, except references using locks; establishing a system reference point which occurs after each process in the system has passed the point of execution at least once since the last such system reference point; requesting an operation requiring exclusive access on a selected data object; preventing subsequent references by other processes to the selected data object; waiting until two of the system reference points have occurred; and then performing the requested operation.
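
    The claimed steps can be illustrated with a small, non-threaded Python sketch: readers periodically pass a common execution point, and a writer that has hidden the object waits for two system reference points before performing the exclusive operation. The counter bookkeeping below is an illustrative simplification, not the patented mechanism.

    # Sketch of the grace-period idea: track how often each process passes the
    # common execution point, and wait for two system reference points.
    class GracePeriodTracker:
        def __init__(self, process_ids):
            self.passes = {pid: 0 for pid in process_ids}

        def pass_common_point(self, pid):          # called by each process regularly
            self.passes[pid] += 1

        def snapshot(self):
            return dict(self.passes)

        def reference_point_reached(self, since):  # every process advanced since `since`
            return all(self.passes[p] > since[p] for p in since)

    tracker = GracePeriodTracker(["reader-1", "reader-2"])
    shared = {"obj": "data"}

    removed = shared.pop("obj")                    # step 1: prevent new references
    snap = tracker.snapshot()
    points_seen = 0
    while points_seen < 2:                         # step 2: wait for two reference points
        for pid in tracker.passes:                 # (simulated reader activity)
            tracker.pass_common_point(pid)
        if tracker.reference_point_reached(snap):
            points_seen += 1
            snap = tracker.snapshot()
    print(f"safe to perform the exclusive operation on '{removed}' after {points_seen} system reference points")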

  8. 45 CFR 205.38 - Federal financial participation (FFP) for establishing a statewide mechanized system.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...

  9. 45 CFR 205.38 - Federal financial participation (FFP) for establishing a statewide mechanized system.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...

  10. 45 CFR 205.38 - Federal financial participation (FFP) for establishing a statewide mechanized system.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...

  11. 45 CFR 205.38 - Federal financial participation (FFP) for establishing a statewide mechanized system.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...

  12. 45 CFR 205.38 - Federal financial participation (FFP) for establishing a statewide mechanized system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...

  13. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  14. Polymer Waveguide Fabrication Techniques

    NASA Astrophysics Data System (ADS)

    Ramey, Delvan A.

    1985-01-01

    The ability of integrated optic systems to compete in signal processing applications with more traditional analog and digital electronic systems is discussed. The Acousto-Optic Spectrum Analyzer is an example which motivated the particular work discussed herein. Provided real-time processing is more critical than absolute accuracy, such integrated optic systems fulfill a design need. Fan-out waveguide arrays allow crosstalk in system detector arrays to be controlled without directly limiting system resolution. A polyurethane pattern definition process was developed in order to demonstrate fan-out arrays. This novel process is discussed, along with further research needs. Integrated optic system market penetration would be enhanced by development of commercial processes of this type.

  15. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
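
    To make the process-mining step concrete, the sketch below builds a directly-follows graph from an event log, which is the starting point of many discovery algorithms. The event log, case identifiers, and activity names are invented examples, not data from the cited hospital information system.

        from collections import Counter, defaultdict

        # Toy event log: (case id, activity), assumed ordered in time per case.
        event_log = [
            ("case1", "admission"), ("case1", "triage"), ("case1", "treatment"),
            ("case2", "admission"), ("case2", "treatment"), ("case2", "discharge"),
        ]

        traces = defaultdict(list)
        for case_id, activity in event_log:
            traces[case_id].append(activity)

        # Directly-follows counts: how often activity a is immediately followed by b.
        directly_follows = Counter()
        for activities in traces.values():
            for a, b in zip(activities, activities[1:]):
                directly_follows[(a, b)] += 1

        for (a, b), n in directly_follows.items():
            print(f"{a} -> {b}: {n}")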

  16. Logistics Control Facility: A Normative Model for Total Asset Visibility in the Air Force Logistics System

    DTIC Science & Technology

    1994-09-01

    Issue: Computers, information systems, and communication systems are being increasingly used in transportation, warehousing, order processing, materials...inventory levels, reduced order processing times, reduced order processing costs, and increased customer satisfaction. While purchasing and transportation...process, the speed with which orders are processed would increase significantly. Lowering the order processing time in turn lowers the lead time, which in

  17. Advanced Information Processing System (AIPS) proof-of-concept system functional design I/O network system services

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The functional design of the Input/Output (I/O) services for the Advanced Information Processing System (AIPS) proof-of-concept system is described. The data flow diagrams, which show the functional processes in I/O services and the data that flow among them, are included. A complete list of the data identified on the data flow diagrams and in the process descriptions is provided.

  18. Reprocessing system with nuclide separation based on chromatography in hydrochloric acid solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suzuki, Tatsuya; Tachibana, Yu; Koyama, Shi-ichi

    2013-07-01

    We have proposed the reprocessing system with nuclide separation processes based on the chromatographic technique in the hydrochloric acid solution system. Our proposed system consists of the dissolution process, the reprocessing process, the minor actinide separation process, and nuclide separation processes. In the reprocessing and separation processes, the pyridine resin is used as a main separation media. It was confirmed that the dissolution in the hydrochloric acid solution is easily achieved by the plasma voloxidation and by the addition of oxygen peroxide into the hydrochloric acid solution.

  19. A methodology for evaluation of an interactive multispectral image processing system

    NASA Technical Reports Server (NTRS)

    Kovalick, William M.; Newcomer, Jeffrey A.; Wharton, Stephen W.

    1987-01-01

    Because of the considerable cost of an interactive multispectral image processing system, an evaluation of a prospective system should be performed to ascertain if it will be acceptable to the anticipated users. Evaluation of a developmental system indicated that the important system elements include documentation, user friendliness, image processing capabilities, and system services. The criteria and evaluation procedures for these elements are described herein. The following factors contributed to the success of the evaluation of the developmental system: (1) careful review of documentation prior to program development, (2) construction and testing of macromodules representing typical processing scenarios, (3) availability of other image processing systems for referral and verification, and (4) use of testing personnel with an applications perspective and experience with other systems. This evaluation was done in addition to and independently of program testing by the software developers of the system.

  20. Prototype architecture for a VLSI level zero processing system. [Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Shi, Jianfei; Grebowsky, Gerald J.; Horner, Ward P.; Chesney, James R.

    1989-01-01

    The prototype architecture and implementation of a high-speed level zero processing (LZP) system are discussed. Due to the new processing algorithm and VLSI technology, the prototype LZP system features compact size, low cost, high processing throughput, easy maintainability, and increased reliability. Though extensive control functions are implemented in hardware, the programmability of processing tasks makes it possible to adapt the system to different data formats and processing requirements. It is noted that the LZP system can handle up to 8 virtual channels and 24 sources with a combined data volume of 15 Gbytes per orbit. For greater demands, multiple LZP systems can be configured in parallel, each called a processing channel and assigned a subset of the virtual channels. The telemetry data stream is steered into the different processing channels in accordance with the virtual channel IDs, so this super system can cope with a virtually unlimited number of virtual channels and sources. In the near future, it is expected that new disk farms with data rates exceeding 150 Mbps will be available from commercial vendors due to advances in disk drive technology.
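
    The virtual-channel steering described above amounts to demultiplexing frames onto parallel processing channels by their channel ID. The Python sketch below illustrates that routing idea only; the frame layout, VCID field position, and channel count are invented and do not reflect the VLSI implementation.

        from collections import defaultdict

        NUM_PROCESSING_CHANNELS = 4

        def virtual_channel_id(frame: bytes) -> int:
            # Assume the VCID sits in the low 6 bits of the second header byte
            # (a CCSDS-like layout chosen purely for illustration).
            return frame[1] & 0x3F

        def steer(frames):
            channels = defaultdict(list)
            for frame in frames:
                vcid = virtual_channel_id(frame)
                channels[vcid % NUM_PROCESSING_CHANNELS].append(frame)
            return channels

        # Three fake frames with VCIDs 1, 6, and 11 land on different channels.
        frames = [bytes([0x00, vc]) + b"payload" for vc in (0x01, 0x06, 0x0B)]
        print({ch: len(fs) for ch, fs in steer(frames).items()})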

  1. Systems Thinking for the Enterprise: A Thought Piece

    NASA Astrophysics Data System (ADS)

    Rebovich, George

    This paper suggests a way of managing the acquisition of capabilities for large-scale government enterprises that is different from traditional "specify and build" approaches commonly employed by U.S. government agencies in acquiring individual systems or systems of systems (SoS). Enterprise capabilities evolve through the emergence and convergence of information and other technologies and their integration into social, institutional and operational organizations and processes. Enterprise capabilities evolve whether or not the enterprise has processes in place to actively manage them. Thus the critical role of enterprise system engineering (ESE) processes should be to shape, enhance and accelerate the "natural" evolution of enterprise capabilities. ESE processes do not replace or add a layer to traditional system engineering (TSE) processes used in developing individual systems or SoS. ESE processes should complement TSE processes by shaping outcome spaces and stimulating interactions among enterprise participants through market-like mechanisms to reward those that create innovation which moves and accelerates the evolution of the enterprise.

  2. Allostasis and Addiction: Role of the Dopamine and Corticotropin-Releasing Factor Systems

    PubMed Central

    George, Olivier; Le Moal, Michel; Koob, George F.

    2011-01-01

    Allostasis, originally conceptualized to explain persistent morbidity of arousal and autonomic function, is defined as the process of achieving stability through physiological or behavioral change. Two types of biological processes have been proposed to describe the mechanisms underlying allostasis in drug addiction, a within-system adaptation and a between-system adaptation. In the within-system process, the drug elicits an opposing, neutralizing reaction within the same system in which the drug elicits its primary and unconditioned reinforcing actions, while in the between-system process, different neurobiological systems than the one initially activated by the drug are recruited. In this review, we will focus our interest on alterations in the dopaminergic and corticotropin-releasing factor systems as within-system and between-system neuroadaptations, respectively, that underlie the opponent process to drugs of abuse. We hypothesize that repeated compromised activity in the dopaminergic system and sustained activation of the CRF-CRF1R system with withdrawal episodes may lead to an allostatic load contributing significantly to the transition to drug addiction. PMID:22108506

  3. Recent developments in processing systems for cell and tissue cultures toward therapeutic application.

    PubMed

    Kino-oka, Masahiro; Taya, Masahito

    2009-10-01

    Innovative techniques of cell and tissue processing, based on tissue engineering, have been developed for therapeutic applications. Cell expansion and tissue reconstruction through ex vivo cultures are core processes used to produce engineered tissues with sufficient structural integrity and functionality. In manufacturing, strict management of contamination and human error is required, owing to the direct use of non-sterilizable products and the laboriousness of culture operations, respectively. Therefore, the development of processing systems for cell and tissue cultures is one of the critical issues for ensuring a stable process and the quality of therapeutic products. However, the siting criterion of culture systems has to date not been made clear. This review article classifies some of the known processing systems into 'sealed-chamber' and 'sealed-vessel' culture systems based on the difference in their aseptic spaces, and describes the potential advantages of these systems and the current state of culture systems, especially those established by Japanese companies. Moreover, on the basis of the guidelines for isolator systems used in aseptic processing for healthcare products, which are issued by the International Organization for Standardization, the siting criterion of processing systems for cell and tissue cultures is discussed from the perspective of manufacturing therapeutic products in consideration of the regulations according to Good Manufacturing Practice.

  4. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
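
    The core idea of the claim (reasoning modules that each process only abstractions whose individuals match a given ontology classification type) can be sketched as a simple dispatch loop. The classes, classification types, and rules below are hypothetical illustrations, not the patented implementation.

        class Abstraction:
            def __init__(self, individual, classification_type):
                self.individual = individual
                self.classification_type = classification_type

        class ReasoningModule:
            def __init__(self, classification_type, rule):
                self.classification_type = classification_type
                self.rule = rule
            def process(self, abstraction):
                # Each module handles only its own classification type.
                if abstraction.classification_type == self.classification_type:
                    return self.rule(abstraction.individual)
                return None

        working_memory = [Abstraction("alice", "Person"), Abstraction("server42", "Host")]
        modules = [
            ReasoningModule("Person", lambda ind: f"inferred role for {ind}"),
            ReasoningModule("Host", lambda ind: f"inferred exposure for {ind}"),
        ]

        for abstraction in working_memory:
            for module in modules:
                result = module.process(abstraction)
                if result:
                    print(result)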

  5. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  6. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  7. Social network supported process recommender system.

    PubMed

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling to assist process modeling. However, most of the existing technologies only use process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies for process recommendation and builds a social network system of processes based on feature similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations and future work are introduced.
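
    One way to picture a "process matching degree" is as a set similarity over process features, used to connect similar processes into a network. The sketch below uses Jaccard similarity with invented processes, features, and threshold; the paper defines three measurements of its own, which are not reproduced here.

        from itertools import combinations

        processes = {
            "order_handling":  {"create_order", "check_stock", "ship"},
            "return_handling": {"create_order", "inspect", "refund"},
            "procurement":     {"check_stock", "issue_po", "receive_goods"},
        }

        def matching_degree(a, b):
            # Jaccard similarity of two feature sets.
            return len(a & b) / len(a | b)

        edges = []
        for (p1, f1), (p2, f2) in combinations(processes.items(), 2):
            degree = matching_degree(f1, f2)
            if degree >= 0.2:                 # similarity threshold (assumed)
                edges.append((p1, p2, round(degree, 2)))

        print(edges)   # edges of the process "social network"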

  8. Tailoring Systems Engineering Processes in a Conceptual Design Environment: A Case Study at NASA Marshall Spaceflight Center's ACO

    NASA Technical Reports Server (NTRS)

    Mulqueen, John; Maples, C. Dauphne; Fabisinski, Leo, III

    2012-01-01

    This paper provides an overview of Systems Engineering as it is applied in a conceptual design space systems department at the National Aeronautics and Space Administration (NASA) Marshall Spaceflight Center (MSFC) Advanced Concepts Office (ACO). Engineering work performed in the NASA MSFC's ACO is targeted toward the Exploratory Research and Concepts Development life cycle stages, as defined in the International Council on Systems Engineering (INCOSE) Systems Engineering Handbook. This paper addresses three ACO Systems Engineering tools that correspond to three INCOSE Technical Processes: Stakeholder Requirements Definition, Requirements Analysis, and Integration, as well as one Project Process, Risk Management. These processes are used to facilitate, streamline, and manage systems engineering processes tailored for the earliest two life cycle stages, which is the environment in which ACO engineers work. The role of systems engineers and systems engineering as performed in ACO is explored in this paper. The need for tailoring Systems Engineering processes, tools, and products in the ever-changing engineering services ACO provides to its customers is addressed.

  9. Microphone Array Phased Processing System (MAPPS): Version 4.0 Manual

    NASA Technical Reports Server (NTRS)

    Watts, Michael E.; Mosher, Marianne; Barnes, Michael; Bardina, Jorge

    1999-01-01

    A processing system has been developed to meet increasing demands for detailed noise measurement of individual model components. The Microphone Array Phased Processing System (MAPPS) uses graphical user interfaces to control all aspects of data processing and visualization. The system uses networked parallel computers to provide noise maps at selected frequencies in a near real-time testing environment. The system has been successfully used in the NASA Ames 7- by 10-Foot Wind Tunnel.

  10. A conceptual persistent healthcare quality improvement process for software development management.

    PubMed

    Lin, Jen-Chiun; Su, Mei-Ju; Cheng, Po-Hsun; Weng, Yung-Chien; Chen, Sao-Jie; Lai, Jin-Shin; Lai, Feipei

    2007-01-01

    This paper illustrates a sustained conceptual service quality improvement process for the management of software development within a healthcare enterprise. Our proposed process is revised from Niland's healthcare quality information system (HQIS). This process includes functions to survey the satisfaction of system functions, describe the operation bylaws on-line, and provide on-demand training. To achieve these goals, we integrate five information systems in National Taiwan University Hospital, including healthcare information systems, health quality information system, requirement management system, executive information system, and digital learning system, to form a full Deming cycle. A preliminary user satisfaction survey showed that our outpatient information system scored an average of 71.31 in 2006.

  11. AVIRIS ground data-processing system

    NASA Technical Reports Server (NTRS)

    Reimer, John H.; Heyada, Jan R.; Carpenter, Steve C.; Deich, William T. S.; Lee, Meemong

    1987-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has been under development at JPL for the past four years. During this time, a dedicated ground data-processing system has been designed and implemented to store and process the large amounts of data expected. This paper reviews the objectives of this ground data-processing system and describes the hardware. An outline of the data flow through the system is given, and the software and incorporated algorithms developed specifically for the systematic processing of AVIRIS data are described.

  12. Long-term care information systems: an overview of the selection process.

    PubMed

    Nahm, Eun-Shim; Mills, Mary Etta; Feege, Barbara

    2006-06-01

    Under the current Medicare Prospective Payment System method and the ever-changing managed care environment, a long-term care information system is vital to providing quality care and to surviving in business. The system selection process should be an interdisciplinary effort involving all necessary stakeholders for the proposed system. The system selection process can be modeled following the Systems Development Life Cycle: identifying problems, opportunities, and objectives; determining information requirements; analyzing system needs; designing the recommended system; and developing and documenting software.

  13. Industrial process system assessment: bridging process engineering and life cycle assessment through multiscale modeling.

    EPA Science Inventory

    The Industrial Process System Assessment (IPSA) methodology is a multiple step allocation approach for connecting information from the production line level up to the facility level and vice versa using a multiscale model of process systems. The allocation procedure assigns inpu...

  14. APPLEPIPS /Apple Personal Image Processing System/ - An interactive digital image processing system for the Apple II microcomputer

    NASA Technical Reports Server (NTRS)

    Masuoka, E.; Rose, J.; Quattromani, M.

    1981-01-01

    Recent developments related to microprocessor-based personal computers have made low-cost digital image processing systems a reality. Image analysis systems built around these microcomputers provide color image displays for images as large as 256 by 240 pixels in sixteen colors. Descriptive statistics can be computed for portions of an image, and supervised image classification can be obtained. The systems support Basic, Fortran, Pascal, and assembler language. A description is provided of a system which is representative of the new microprocessor-based image processing systems currently on the market. While small systems may never be truly independent of larger mainframes, because they lack 9-track tape drives, the independent processing power of the microcomputers will help alleviate some of the turn-around time problems associated with image analysis and display on the larger multiuser systems.

  15. System-Level Shared Governance Structures and Processes in Healthcare Systems With Magnet®-Designated Hospitals: A Descriptive Study.

    PubMed

    Underwood, Carlisa M; Hayne, Arlene N

    The purpose was to identify and describe structures and processes of best practices for system-level shared governance in healthcare systems. Currently, more than 64.6% of US community hospitals are part of a system. System chief nurse executives (SCNEs) are challenged to establish leadership structures and processes that effectively and efficiently disseminate best practices for patients and staff across complex organizations, geographically dispersed locations, and populations. Eleven US healthcare SCNEs from the American Nurses Credentialing Center's repository of Magnet®-designated facilities participated in a 35-multiquestion interview based on Kanter's Theory of Organizational Empowerment. Most SCNEs reported the presence of more than 50% of the empowerment structures and processes in system-level shared governance. Despite the difficulties and complexities of growing health systems, SCNEs have replicated empowerment characteristics of hospital shared governance structures and processes at the system level.

  16. Engineering Supply Management System: The Next Generation

    DTIC Science & Technology

    1991-09-01

    Partial receipts, automatic inventory update, discrepant material, order processing requirements, transaction reversal capability...August 1991. The system's modules that support the DEH's needs are the Sales Order Processing, Register Sales, Purchase Order Processing, Inventory...modular system developed by PIC Business Systems, Incorporated. This system possesses Order Processing, Inventory Management, Purchase Orders, and

  17. Event Processing and Variable Part of Sample Period Determining in Combined Systems Using GA

    NASA Astrophysics Data System (ADS)

    Strémy, Maximilián; Závacký, Pavol; Jedlička, Martin

    2011-01-01

    This article deals with combined dynamic systems and usage of modern techniques in dealing with these systems, focusing particularly on sampling period design, cyclic processing tasks and related processing algorithms in the combined event management systems using genetic algorithms.

  18. 42 CFR 431.832 - Reporting requirements for claims processing assessment systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... assessment systems. 431.832 Section 431.832 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... GENERAL ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.832 Reporting requirements for claims processing assessment systems. (a) The agency must submit...

  19. System safety management lessons learned from the US Army acquisition process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piatt, J.A.

    1989-05-01

    The Assistant Secretary of the Army for Research, Development and Acquisition directed the Army Safety Center to provide an audit of the causes of accidents and safety of use restrictions on recently fielded systems by tracking residual hazards back through the acquisition process. The objective was to develop "lessons learned" that could be applied to the acquisition process to minimize mishaps in fielded systems. System safety management lessons learned are defined as Army practices or policies, derived from past successes and failures, that are expected to be effective in eliminating or reducing specific systemic causes of residual hazards. They are broadly applicable and supportive of the Army structure and acquisition objectives. Pacific Northwest Laboratory (PNL) was given the task of conducting an independent, objective appraisal of the Army's system safety program in the context of the Army materiel acquisition process by focusing on four fielded systems which are products of that process. These systems included the Apache helicopter, the Bradley Fighting Vehicle (BFV), the Tube Launched, Optically Tracked, Wire Guided (TOW) Missile and the High Mobility Multipurpose Wheeled Vehicle (HMMWV). The objective of this study was to develop system safety management lessons learned associated with the acquisition process. The first step was to identify residual hazards associated with the selected systems. Since it was impossible to track all residual hazards through the acquisition process, certain well-known, high visibility hazards were selected for detailed tracking. These residual hazards illustrate a variety of systemic problems. Systemic or process causes were identified for each residual hazard and analyzed to determine why they exist. System safety management lessons learned were developed to address related systemic causal factors. 29 refs., 5 figs.

  20. Hadoop-based implementation of processing medical diagnostic records for visual patient system

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Shi, Liehang; Xie, Zhe; Zhang, Jianguo

    2018-03-01

    We introduced the Visual Patient (VP) concept and a method to visually represent and index patient imaging diagnostic records (IDR) at last year's SPIE Medical Imaging conference (SPIE MI 2017); the VP enables a doctor to review a large amount of IDR for a patient in a limited appointed time slot. In this presentation, we present a new approach to the design of the data processing architecture of the VP system (VPS), which acquires, processes, and stores various kinds of IDR to build a VP instance for each patient in a hospital environment on top of a Hadoop distributed processing structure. We designed this system architecture, called the Medical Information Processing System (MIPS), as a combination of the Hadoop batch processing architecture and the Storm stream processing architecture. The MIPS implements parallel processing of various kinds of clinical data with high efficiency, drawn from disparate hospital information systems such as PACS, RIS, LIS, and HIS.
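
    The batch/stream split described above can be pictured as a lambda-style architecture in which each incoming record is both appended to a batch store for later bulk processing and handed to a stream worker for low-latency handling. The Python sketch below is a generic illustration of that routing; it uses plain queues and threads rather than the Hadoop and Storm APIs of the MIPS implementation.

        import queue
        import threading

        batch_store = []                 # records kept for periodic batch reprocessing
        stream_queue = queue.Queue()     # records handed to the low-latency path

        def stream_worker():
            while True:
                record = stream_queue.get()
                if record is None:       # sentinel: shut the worker down
                    break
                print("stream-processed:", record["id"])  # e.g. update a VP instance

        def ingest(record):
            batch_store.append(record)   # batch layer
            stream_queue.put(record)     # speed layer

        worker = threading.Thread(target=stream_worker)
        worker.start()
        for i in range(3):
            ingest({"id": i, "source": "PACS"})
        stream_queue.put(None)
        worker.join()
        print("batch store holds", len(batch_store), "records")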

  1. Modeling Business Processes in Public Administration

    NASA Astrophysics Data System (ADS)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is the methodology that postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  2. An open system approach to process reengineering in a healthcare operational environment.

    PubMed

    Czuchry, A J; Yasin, M M; Norris, J

    2000-01-01

    The objective of this study is to examine the applicability of process reengineering in a healthcare operational environment. The intake process of a mental healthcare service delivery system is analyzed systematically to identify process-related problems. A methodology that couples an open system orientation with process reengineering is used to overcome operational and patient-related problems associated with the pre-reengineered intake process. The systematic redesign of the intake process resulted in performance improvements in terms of cost, quality, service, and timing.

  3. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  4. Evolution of the archaeal and mammalian information processing systems: towards an archaeal model for human disease.

    PubMed

    Lyu, Zhe; Whitman, William B

    2017-01-01

    Current evolutionary models suggest that Eukaryotes originated from within Archaea instead of being a sister lineage. To test this model of ancient evolution, we review recent studies and compare the three major information processing subsystems of replication, transcription and translation in the Archaea and Eukaryotes. Our hypothesis is that if the Eukaryotes arose within the archaeal radiation, their information processing systems will appear to be of a kind with the archaeal systems and not wholly original. Within the Eukaryotes, the mammalian or human systems are emphasized because of their importance in understanding health. Biochemical as well as genetic studies provide strong evidence for the functional similarity of archaeal homologs to the mammalian information processing system and their dissimilarity to the bacterial systems. In many independent instances, a simple archaeal system is functionally equivalent to more elaborate eukaryotic homologs, suggesting that evolution of complexity is likely a central feature of the eukaryotic information processing system. Because fewer components are often involved, biochemical characterizations of the archaeal systems are often easier to interpret. Similarly, the archaeal cell provides a genetically and metabolically simpler background, enabling convenient studies on the complex information processing system. Therefore, Archaea could serve as a parsimonious and tractable host for studying human diseases that arise in the information processing systems.

  5. The analysis of magnesium oxide hydration in three-phase reaction system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Xiaojia; Guo, Lin; Chen, Chen

    In order to investigate the magnesium oxide hydration process in a gas–liquid–solid (three-phase) reaction system, magnesium hydroxide was prepared by magnesium oxide hydration in liquid–solid (two-phase) and three-phase reaction systems. A semi-empirical model and the classical shrinking core model were used to fit the experimental data. The fitting result shows that both models describe the hydration process in the three-phase system well, while only the semi-empirical model is suitable for the hydration process in the two-phase system. Characterization of the hydration product using X-ray diffraction (XRD) and scanning electron microscopy (SEM) was performed. The XRD and SEM results show that the hydration process in the two-phase system follows a common dissolution/precipitation mechanism, while in the three-phase system the hydration process undergoes MgO dissolution, Mg(OH)2 precipitation, Mg(OH)2 peeling off from the MgO particle, and exposure of a fresh MgO surface. - Graphical abstract: There is a peeling-off process in the gas–liquid–solid (three-phase) MgO hydration system. - Highlights: • Magnesium oxide hydration in a gas–liquid–solid system was investigated. • The experimental data in the three-phase system could be fitted well by two models. • The morphology analysis suggested the existence of a peel-off process.
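
    For reference, the classical shrinking core model mentioned above is commonly quoted, for a particle whose conversion is controlled by the surface reaction, as

        \[ 1 - (1 - X)^{1/3} = k_r\, t \]

    where X is the fractional conversion of MgO, k_r is an apparent rate constant, and t is the hydration time. The abstract does not state which rate-limiting form was fitted, so this expression is shown only as the most common variant.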

  6. Proposed Framework for the Evaluation of Standalone Corpora Processing Systems: An Application to Arabic Corpora

    PubMed Central

    Al-Thubaity, Abdulmohsen; Alqifari, Reem

    2014-01-01

    Despite the accessibility of numerous online corpora, students and researchers engaged in the fields of Natural Language Processing (NLP), corpus linguistics, and language learning and teaching may encounter situations in which they need to develop their own corpora. Several commercial and free standalone corpora processing systems are available to process such corpora. In this study, we first propose a framework for the evaluation of standalone corpora processing systems and then use it to evaluate seven freely available systems. The proposed framework considers the usability, functionality, and performance of the evaluated systems while taking into consideration their suitability for Arabic corpora. While the results show that most of the evaluated systems exhibited comparable usability scores, the scores for functionality and performance were substantially different with respect to support for the Arabic language and N-grams profile generation. The results of our evaluation will help potential users of the evaluated systems to choose the system that best meets their needs. More importantly, the results will help the developers of the evaluated systems to enhance their systems and developers of new corpora processing systems by providing them with a reference framework. PMID:25610910
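
    N-gram profile generation, one of the functionality criteria named above, can be illustrated with a few lines of Python. The whitespace tokenizer and English sample text are simplified placeholders; a real tool for Arabic corpora would need proper tokenization and normalization.

        from collections import Counter

        def ngram_profile(tokens, n=2):
            # Count contiguous n-grams over a token sequence.
            return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

        text = "the corpus processing system builds the corpus profile"
        tokens = text.split()
        for gram, count in ngram_profile(tokens, n=2).most_common(3):
            print(" ".join(gram), count)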

  7. Proposed framework for the evaluation of standalone corpora processing systems: an application to Arabic corpora.

    PubMed

    Al-Thubaity, Abdulmohsen; Al-Khalifa, Hend; Alqifari, Reem; Almazrua, Manal

    2014-01-01

    Despite the accessibility of numerous online corpora, students and researchers engaged in the fields of Natural Language Processing (NLP), corpus linguistics, and language learning and teaching may encounter situations in which they need to develop their own corpora. Several commercial and free standalone corpora processing systems are available to process such corpora. In this study, we first propose a framework for the evaluation of standalone corpora processing systems and then use it to evaluate seven freely available systems. The proposed framework considers the usability, functionality, and performance of the evaluated systems while taking into consideration their suitability for Arabic corpora. While the results show that most of the evaluated systems exhibited comparable usability scores, the scores for functionality and performance were substantially different with respect to support for the Arabic language and N-grams profile generation. The results of our evaluation will help potential users of the evaluated systems to choose the system that best meets their needs. More importantly, the results will help the developers of the evaluated systems to enhance their systems and developers of new corpora processing systems by providing them with a reference framework.

  8. Deployment of ERP Systems at Automotive Industries, Security Inspection (Case Study: IRAN KHODRO Automotive Company)

    NASA Astrophysics Data System (ADS)

    Ali, Hatamirad; Hasan, Mehrjerdi

    Car production is one of the most complex and largest-scale production processes in industry. Today, information technology (IT) and ERP systems support a large portion of these production processes; without an integrated system such as ERP, the production and supply chain processes become tangled. ERP systems, the latest generation of MRP systems, simplify the production and sales processes of these industries and are a major factor in their development. Many large-scale companies are now developing and deploying ERP systems, which facilitate many organizational processes and increase efficiency. Security is a very important part of an organization's ERP strategy; because ERP systems are integrated and extensive, security matters more for them than for local and legacy systems, and disregarding it can determine the success or failure of such systems. IRANKHODRO is the biggest automotive factory in the Middle East, with an annual production of over 600,000 cars. This paper presents ERP security deployment experience at the IRANKHODRO Company, which recently took a major step forward by launching ERP systems.

  9. Power processing systems for ion thrusters.

    NASA Technical Reports Server (NTRS)

    Herron, B. G.; Garth, D. R.; Finke, R. C.; Shumaker, H. A.

    1972-01-01

    The proposed use of ion thrusters to fulfill various communication satellite propulsion functions such as east-west and north-south stationkeeping, attitude control, station relocation and orbit raising, naturally leads to the requirement for lightweight, efficient and reliable thruster power processing systems. Collectively, the propulsion requirements dictate a wide range of thruster power levels and operational lifetimes, which must be matched by the power processing. This paper will discuss the status of such power processing systems, present system design alternatives and project expected near future power system performance.

  10. Selective catalytic reduction system and process using a pre-sulfated zirconia binder

    DOEpatents

    Sobolevskiy, Anatoly; Rossin, Joseph A.

    2010-06-29

    A selective catalytic reduction (SCR) process with a palladium catalyst for reducing NOx in a gas, using hydrogen as a reducing agent, is provided. The process comprises contacting the gas stream with a catalyst system, the catalyst system comprising (ZrO2)SO4, palladium, and a pre-sulfated zirconia binder. The inclusion of a pre-sulfated zirconia binder substantially increases the durability of a Pd-based SCR catalyst system. A system for implementing the disclosed process is further provided.

  11. The application of intelligent process control to space based systems

    NASA Technical Reports Server (NTRS)

    Wakefield, G. Steve

    1990-01-01

    The application of Artificial Intelligence to electronic and process control can help attain the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed along with associated issues for implementing an intelligent process control system.

  12. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  13. Selected Systems Engineering Process Deficiencies and Their Consequences

    NASA Technical Reports Server (NTRS)

    Thomas, Lawrence Dale

    2006-01-01

    The systems engineering process is well established and well understood. While this statement could be argued in light of the many systems engineering guidelines that have been developed, comparative review of these respective descriptions reveals that they differ primarily in the number of discrete steps or other nuances, and are at their core essentially common. Likewise, the systems engineering textbooks differ primarily in the context for application of systems engineering or in the utilization of evolved tools and techniques, not in the basic method. Thus, failures in systems engineering cannot credibly be attributed to implementation of the wrong systems engineering process among alternatives. However, numerous system failures can be attributed to deficient implementation of the systems engineering process. What may clearly be perceived as a systems engineering deficiency in retrospect can appear to be a well considered systems engineering efficiency in real time - an efficiency taken to reduce cost or meet a schedule, or more often both. Typically these efficiencies are grounded on apparently solid rationale, such as reuse of heritage hardware or software. Over time, unintended consequences of a systems engineering process deficiency may begin to be realized, and unfortunately often the consequence is system failure. This paper describes several actual cases of system failures that resulted from deficiencies in their systems engineering process implementation, including the Ariane 5 and the Hubble Space Telescope.

  14. Selected systems engineering process deficiencies and their consequences

    NASA Astrophysics Data System (ADS)

    Thomas, L. Dale

    2007-06-01

    The systems engineering process is well established and well understood. While this statement could be argued in light of the many systems engineering guidelines that have been developed, comparative review of these respective descriptions reveals that they differ primarily in the number of discrete steps or other nuances, and are at their core essentially common. Likewise, the systems engineering textbooks differ primarily in the context for application of systems engineering or in the utilization of evolved tools and techniques, not in the basic method. Thus, failures in systems engineering cannot credibly be attributed to implementation of the wrong systems engineering process among alternatives. However, numerous system failures can be attributed to deficient implementation of the systems engineering process. What may clearly be perceived as a systems engineering deficiency in retrospect can appear to be a well considered systems engineering efficiency in real time - an efficiency taken to reduce cost or meet a schedule, or more often both. Typically these efficiencies are grounded on apparently solid rationale, such as reuse of heritage hardware or software. Over time, unintended consequences of a systems engineering process deficiency may begin to be realized, and unfortunately often the consequence is system failure. This paper describes several actual cases of system failures that resulted from deficiencies in their systems engineering process implementation, including the Ariane 5 and the Hubble Space Telescope.

  15. Water recovery and solid waste processing for aerospace and domestic applications. Volume 1: Final report

    NASA Technical Reports Server (NTRS)

    Murray, R. W.

    1973-01-01

    A comprehensive study of advanced water recovery and solid waste processing techniques employed in both aerospace and domestic or commercial applications is reported. A systems approach was used to synthesize a prototype system design of an advanced water treatment/waste processing system. Household water use characteristics were studied and modified through the use of low water use devices and a limited amount of water reuse. This modified household system was then used as a baseline system for development of several water treatment waste processing systems employing advanced techniques. A hybrid of these systems was next developed and a preliminary design was generated to define system and hardware functions.

  16. Is clinical cognition binary or continuous?

    PubMed

    Norman, Geoffrey; Monteiro, Sandra; Sherbino, Jonathan

    2013-08-01

    A dominant theory of clinical reasoning is the so-called "dual processing theory," in which the diagnostic process may proceed through a rapid, unconscious, intuitive process (System 1) or a slow, conceptual, analytical process (System 2). Diagnostic errors are thought to arise primarily from cognitive biases originating in System 1. In this issue, Custers points out that this model is unnecessarily restrictive and that it is more likely that diagnostic tasks may proceed through a variety of mental strategies ranging from "analytical" to "intuitive." The authors of this commentary agree that the notion that System 1 and System 2 processes are somehow in competition and will necessarily lead to different conclusions is unnecessarily restrictive. On the other hand, they argue that there is substantial evidence in support of a dual processing model, and that most objections to dual processing theory can be easily accommodated by simply presuming that both processes operate in concert and that solving any task may rely to varying degrees on both processes.

  17. Social Network Supported Process Recommender System

    PubMed Central

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling to assist process modeling. However, most of the existing technologies only use process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies for process recommendation and builds a social network system of processes based on feature similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations and future work are introduced. PMID:24672309

  18. Analyses of requirements for computer control and data processing experiment subsystems. Volume 2: ATM experiment S-056 image data processing system software development

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The IDAPS (Image Data Processing System) is a user-oriented, computer-based, language and control system, which provides a framework or standard for implementing image data processing applications, simplifies set-up of image processing runs so that the system may be used without a working knowledge of computer programming or operation, streamlines operation of the image processing facility, and allows multiple applications to be run in sequence without operator interaction. The control system loads the operators, interprets the input, constructs the necessary parameters for each application, and calls the application. The overlay feature of the IBSYS loader (IBLDR) provides the means of running multiple operators which would otherwise overflow core storage.
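
    The control loop described above (read a run description, build the parameters for each operator, and call the application without operator interaction) can be sketched as a small dispatcher. The operator names, parameters, and registry below are invented for illustration and are unrelated to the original IBSYS/IBLDR implementation.

        # Registry of image-processing "applications" keyed by operator name.
        APPLICATIONS = {
            "HISTOGRAM": lambda params: print("histogram of", params["input"]),
            "STRETCH":   lambda params: print("contrast stretch", params),
        }

        run_description = [
            {"operator": "HISTOGRAM", "input": "frame_001.img"},
            {"operator": "STRETCH", "input": "frame_001.img", "low": 5, "high": 250},
        ]

        def run(description):
            for step in description:
                operator = step["operator"]
                params = {k: v for k, v in step.items() if k != "operator"}
                APPLICATIONS[operator](params)   # construct parameters, call the application

        run(run_description)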

  19. Single-chip microcomputer for image processing in the photonic measuring system

    NASA Astrophysics Data System (ADS)

    Smoleva, Olga S.; Ljul, Natalia Y.

    2002-04-01

    The non-contact measuring system has been designed for rail-track parameter control on the Moscow Metro. It detects several significant parameters: rail-track width, rail-track height, gage, rail-slums, crosslevel, pickets, and car speed. The system consists of three subsystems: a non-contact system for rail-track width, height, and gage inspection; a non-contact system for rail-slums inspection; and a subsystem for crosslevel, speed, and picket detection. Data from the subsystems are transferred to a pre-processing unit. To process the data received from the subsystems, the single-chip signal processor ADSP-2185 is used because it provides the required processing speed. After the data are processed, they are sent to a PC, which processes them further and outputs them in a readable form.

  20. Applied digital signal processing systems for vortex flowmeter with digital signal processing.

    PubMed

    Xu, Ke-Jun; Zhu, Zhi-Hai; Zhou, Yang; Wang, Xiao-Fen; Liu, San-Shan; Huang, Yun-Zhi; Chen, Zhi-Yuan

    2009-02-01

    Spectral analysis is combined with a digital filter to process the vortex sensor signal, reducing the effect of low-frequency disturbances from pipe vibrations and increasing the turndown ratio. Using a digital signal processing chip, two kinds of digital signal processing systems are developed to implement these algorithms: one is an integrative system, and the other is a separated system. A limiting amplifier is designed in the input analog conditioning circuit to accommodate large amplitude variations of the sensor signal. Several technical measures are taken to improve the accuracy of the output pulse, speed up the response time of the meter, and reduce the fluctuation of the output signal. The experimental results demonstrate the validity of the digital signal processing systems.
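
    The signal chain described above (digital filtering to suppress low-frequency vibration, then spectral analysis to locate the vortex shedding frequency) is sketched below in Python with NumPy. The sampling rate, cutoff frequency, filter order, and synthetic signal are invented for illustration and are not the parameters of the cited DSP systems.

        import numpy as np

        fs = 2000.0                                     # sampling rate, Hz
        t = np.arange(0, 1.0, 1.0 / fs)
        vortex = np.sin(2 * np.pi * 180.0 * t)          # vortex signal at 180 Hz
        vibration = 2.0 * np.sin(2 * np.pi * 12.0 * t)  # low-frequency pipe vibration
        x = vortex + vibration

        # First-order IIR high-pass filter with ~50 Hz cutoff.
        fc = 50.0
        alpha = 1.0 / (1.0 + 2 * np.pi * fc / fs)
        y = np.zeros_like(x)
        for n in range(1, len(x)):
            y[n] = alpha * (y[n - 1] + x[n] - x[n - 1])

        # Spectral analysis: the dominant peak of the filtered signal.
        spectrum = np.abs(np.fft.rfft(y))
        freqs = np.fft.rfftfreq(len(y), 1.0 / fs)
        print("estimated vortex frequency: %.1f Hz" % freqs[np.argmax(spectrum)])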

  1. Annual Symposium on Machine Processing of Remotely Sensed Data, 4th, Purdue University, West Lafayette, Ind., June 21-23, 1977, Proceedings

    NASA Technical Reports Server (NTRS)

    Morrison, D. B. (Editor); Scherer, D. J.

    1977-01-01

    Papers are presented on a variety of techniques for the machine processing of remotely sensed data. Consideration is given to preprocessing methods such as the correction of Landsat data for the effects of haze, sun angle, and reflectance and to the maximum likelihood estimation of signature transformation algorithm. Several applications of machine processing to agriculture are identified. Various types of processing systems are discussed such as ground-data processing/support systems for sensor systems and the transfer of remotely sensed data to operational systems. The application of machine processing to hydrology, geology, and land-use mapping is outlined. Data analysis is considered with reference to several types of classification methods and systems.

  2. Applying a Qualitative Modeling Shell to Process Diagnosis: The Caster System.

    DTIC Science & Technology

    1986-03-01

    Applying a Qualitative Modeling Shell to Process Diagnosis: The Caster System, by Timothy F. Thompson (Westinghouse R&D Center) and William J. Clancey (Department of Computer Science, Stanford University, Stanford, CA 94303).

  3. Spitzer Telemetry Processing System

    NASA Technical Reports Server (NTRS)

    Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.

    2013-01-01

    The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.

  4. Design of signal reception and processing system of embedded ultrasonic endoscope

    NASA Astrophysics Data System (ADS)

    Li, Ming; Yu, Feng; Zhang, Ruiqiang; Li, Yan; Chen, Xiaodong; Yu, Daoyin

    2009-11-01

    The Embedded Ultrasonic Endoscope, based on an embedded microprocessor and an embedded real-time operating system, sends a micro ultrasonic probe into the coelom through the biopsy channel of the electronic endoscope to obtain the fault histology features of digestive organs by rotary scanning, and acquires pictures of the alimentary canal mucosal surface. At the same time, ultrasonic signals are processed by the signal reception and processing system, forming images of the full histology of the digestive organs. The signal reception and processing system is an important component of the Embedded Ultrasonic Endoscope. However, the traditional design, which uses multi-level amplifiers and special digital processing circuits to implement signal reception and processing, no longer satisfies the high-performance, miniaturization, and low-power requirements of an embedded system, and as a result of the high noise that multi-level amplifiers introduce, the extraction of small signals becomes difficult. Therefore, this paper presents a method of signal reception and processing based on a double variable gain amplifier and an FPGA, increasing the flexibility and dynamic range of the signal reception and processing system, improving the system noise level, and reducing power consumption. Finally, we set up the embedded experiment system, using a transducer with a center frequency of 8 MHz to scan membrane samples, and display the image of the ultrasonic echo reflected by each layer of the membrane, with a frame rate of 5 Hz, verifying the correctness of the system.

  5. A Systems Approach to Nitrogen Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goins, Bobby

    A systems based approach will be used to evaluate the nitrogen delivery process. This approach involves principles found in Lean, Reliability, Systems Thinking, and Requirements. This unique combination of principles and thought processes yields a very in-depth look into the system to which it is applied. By applying a systems based approach to the nitrogen delivery process there should be improvements in cycle time, efficiency, and a reduction in the required number of personnel needed to sustain the delivery process. This will in turn reduce the amount of demurrage charges that the site incurs. In addition there should be less frustration associated with the delivery process.

  6. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
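
    The multi-agent idea above can be pictured as unit-operation agents, each wrapping its own local process model, cooperating by passing a material stream downstream so that whole-process behavior emerges from the chain. The Python sketch below is only an illustration under that reading; the class names, stream attributes, and kinetics are invented and are not the authors' framework.

        from dataclasses import dataclass, field
        from typing import Callable, List

        @dataclass
        class Stream:
            """Material stream handed from one process-unit agent to the next (illustrative)."""
            volume_l: float
            titre_g_per_l: float
            impurity_frac: float

        @dataclass
        class UnitAgent:
            """Wraps one unit-operation model and records what it did."""
            name: str
            model: Callable[[Stream], Stream]
            log: List[str] = field(default_factory=list)

            def run(self, feed: Stream) -> Stream:
                out = self.model(feed)
                self.log.append(f"{self.name}: {feed} -> {out}")
                return out

        def fermentation(s: Stream) -> Stream:
            # made-up kinetics: titre rises, impurities accumulate
            return Stream(s.volume_l, s.titre_g_per_l * 10.0, s.impurity_frac + 0.2)

        def chromatography(s: Stream) -> Stream:
            # made-up step yield and impurity clearance
            return Stream(s.volume_l * 0.8, s.titre_g_per_l * 0.9, s.impurity_frac * 0.1)

        def run_flowsheet(agents: List[UnitAgent], feed: Stream) -> Stream:
            """Whole-process prediction emerges from chaining the agents' local models."""
            for agent in agents:
                feed = agent.run(feed)
            return feed

        if __name__ == "__main__":
            plant = [UnitAgent("fermenter", fermentation), UnitAgent("capture", chromatography)]
            print(run_flowsheet(plant, Stream(1000.0, 1.0, 0.5)))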

  7. Real-time monitoring of clinical processes using complex event processing and transition systems.

    PubMed

    Meinecke, Sebastian

    2014-01-01

    Dependencies between tasks in clinical processes are often complex and error-prone. Our aim is to describe a new approach for the automatic derivation of clinical events identified via the behaviour of IT systems using Complex Event Processing. Furthermore we map these events on transition systems to monitor crucial clinical processes in real-time for preventing and detecting erroneous situations.
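
    The core mechanism, mapping derived events onto a transition system and flagging transitions the model forbids, can be sketched in a few lines of Python. The clinical states and event names below are hypothetical placeholders, not taken from the paper, and a real deployment would consume events from a CEP engine rather than a list.

        # Allowed transitions of a toy clinical transition system:
        # (current state, event) -> next state
        ALLOWED = {
            ("admitted", "order_lab"): "lab_ordered",
            ("lab_ordered", "lab_result"): "result_available",
            ("result_available", "discharge"): "discharged",
        }

        def monitor(events, state="admitted"):
            """Yield (event, new_state) pairs and raise on transitions the model forbids."""
            for event in events:
                key = (state, event)
                if key not in ALLOWED:
                    raise RuntimeError(f"erroneous situation: '{event}' not allowed in state '{state}'")
                state = ALLOWED[key]
                yield event, state

        if __name__ == "__main__":
            for step in monitor(["order_lab", "lab_result", "discharge"]):
                print(step)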

  8. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to paragraph (j) of...

  9. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to paragraph (j) of...

  10. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to paragraph (j) of...

  11. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to paragraph (j) of...

  12. 42 CFR 431.836 - Corrective action under the MQC claims processing assessment system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... assessment system. 431.836 Section 431.836 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.836 Corrective action under the MQC claims processing assessment system. The agency must— (a) Take action to...

  13. A Low Cost Microcomputer System for Process Dynamics and Control Simulations.

    ERIC Educational Resources Information Center

    Crowl, D. A.; Durisin, M. J.

    1983-01-01

    Discusses a video simulator microcomputer system used to provide real-time demonstrations to strengthen students' understanding of process dynamics and control. Also discusses hardware/software and simulations developed using the system. The four simulations model various configurations of a process liquid level tank system. (JN)
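
    The liquid-level tank dynamics that such simulators typically demonstrate can be reproduced with a few lines of Euler integration, as in the Python sketch below. The model dh/dt = (q_in - k*sqrt(h))/A and all parameter values are illustrative assumptions, not the article's actual simulations.

        import math

        def simulate_tank(q_in=0.02, area=1.0, k=0.05, h0=0.0, dt=1.0, t_end=600):
            """Euler integration of dh/dt = (q_in - k*sqrt(h)) / area for a gravity-drained tank."""
            h, levels = h0, []
            for _ in range(int(t_end / dt)):
                h += dt * (q_in - k * math.sqrt(max(h, 0.0))) / area
                levels.append(h)
            return levels

        if __name__ == "__main__":
            trace = simulate_tank()
            # steady state occurs where q_in = k*sqrt(h), i.e. h = (q_in/k)**2 = 0.16 m here
            print(f"level after {len(trace)} s: {trace[-1]:.3f} m")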

  14. NTP comparison process

    NASA Technical Reports Server (NTRS)

    Corban, Robert

    1993-01-01

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide comprehensive traceability and verification of top-level requirements down to detailed system specifications, and it provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams developing alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum amount of quantitative data possible will be developed and/or validated for use in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

  15. THE COOLING REQUIREMENTS AND PROCESS SYSTEMS OF THE SOUTH AFRICAN RESEARCH REACTOR, SAFARI 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colley, J.R.

    1962-12-01

    The SAFARI 1 research reactor is cooled and moderated by light water. There are three process systems, a primary water system which cools the reactor core and surroundings, a pool water system, and a secondary water system which removes the heat from the primary and pool systems. The cooling requirements for the reactor core and experimental facilities are outlined, and the cooling and purification functions of the three process systems are described. (auth)

  16. USE OF INDICATOR ORGANISMS FOR DETERMINING PROCESS EFFECTIVENESS

    EPA Science Inventory

    Wastewaters, process effluents and treatment process residuals contain a variety of microorganisms. Many factors influence their densities as they move through collection systems and process equipment. Biological treatment systems rely on the catabolic processes of such microor...

  17. Modeling of outpatient prescribing process in iran: a gateway toward electronic prescribing system.

    PubMed

    Ahmadi, Maryam; Samadbeik, Mahnaz; Sadoughi, Farahnaz

    2014-01-01

    Implementation of an electronic prescribing system can overcome many problems of the paper prescribing system and provide numerous opportunities for more effective and advantageous prescribing. Successful implementation of such a system requires a complete and deep understanding of the work content, human resources, and workflow of paper prescribing. The current study was designed to model the current business process of outpatient prescribing in Iran and to clarify the different actions during this process. In order to describe the prescribing process and the system features in Iran, the methodology of business process modeling and analysis was used in the present study. The results of the process documentation were analyzed using a conceptual model of workflow elements and the technique of modeling "As-Is" business processes. Analysis of the current (as-is) prescribing process demonstrated that Iran stood at the first level of sophistication in the graduated levels of electronic prescribing, namely electronic prescription reference, and that there were problematic areas including bottlenecks, redundant and duplicated work, concentration of decision nodes, and communication weaknesses among stakeholders of the process. Using information technology in some activities of medication prescription in Iran has not eliminated the dependence of the stakeholders on paper-based documents and prescriptions. Therefore, it is necessary to implement proper system programming in order to support change management and solve the problems in the existing prescribing process. To this end, a suitable basis should be provided for the reorganization and improvement of the prescribing process for future electronic systems.

  18. Flat-plate solar array project: Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The engineering design, fabrication, assembly, operation, economic analysis, and process support research and development for an Experimental Process System Development Unit for producing semiconductor-grade silicon using the silane-to-silicon process are reported. The design activity was completed. About 95% of the purchased equipment was received. The draft of the operations manual was about 50% complete, and the design of the free-space system continued. The system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated.

  19. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    PubMed

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm takes as input the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of each process in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
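
    The overall flow (fuzzify nice value and burst time, fire rules, defuzzify into a dynamic priority, sort the ready queue) can be sketched with ordinary fuzzy membership functions, as below. This is a simplified stand-in: it omits the non-membership degrees that make the sets intuitionistic, and the membership breakpoints, rule base, and weights are invented for illustration.

        def falling(x, a, b):
            """Membership 1 below a, 0 above b, linear in between."""
            if x <= a:
                return 1.0
            if x >= b:
                return 0.0
            return (b - x) / (b - a)

        def rising(x, a, b):
            return 1.0 - falling(x, a, b)

        def dynamic_priority(nice, burst_ms):
            """Toy rule base: favour low nice values and short bursts (weights are made up)."""
            low_nice, high_nice = falling(nice, -10, 5), rising(nice, -5, 10)
            short, long_ = falling(burst_ms, 20, 80), rising(burst_ms, 40, 150)
            rules = [(min(low_nice, short), 0.9), (min(low_nice, long_), 0.6),
                     (min(high_nice, short), 0.5), (min(high_nice, long_), 0.2)]
            fired = sum(strength for strength, _ in rules) or 1.0
            return sum(strength * out for strength, out in rules) / fired

        # (name, nice value, CPU burst in ms)
        ready_queue = [("p1", 0, 10), ("p2", -10, 120), ("p3", 15, 5)]
        ready_queue.sort(key=lambda p: dynamic_priority(p[1], p[2]), reverse=True)
        print([name for name, *_ in ready_queue])   # highest dynamic priority first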

  20. Liquid and Gaseous Waste Operations Department annual operating report CY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maddox, J.J.; Scott, C.B.

    1997-03-01

    This annual report summarizes operating activities dealing with the process waste system, the liquid low-level waste system, and the gaseous waste system. It also describes upgrade activities dealing with the process and liquid low-level waste systems, the cathodic protection system, a stack ventilation system, and configuration control. Maintenance activities are described dealing with nonradiological wastewater treatment plant, process waste treatment plant and collection system, liquid low-level waste system, and gaseous waste system. Miscellaneous activities include training, audits/reviews/tours, and environmental restoration support.

  1. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in OMB...

  2. 40 CFR 63.444 - Standards for the pulping system at sulfite processes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Standards for the pulping system at sulfite processes. (a) The owner or operator of each sulfite process... 40 Protection of Environment 9 2010-07-01 2010-07-01 false Standards for the pulping system at sulfite processes. 63.444 Section 63.444 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY...

  3. A Flexible Pilot-Scale Setup for Real-Time Studies in Process Systems Engineering

    ERIC Educational Resources Information Center

    Panjapornpon, Chanin; Fletcher, Nathan; Soroush, Masoud

    2006-01-01

    This manuscript describes a flexible, pilot-scale setup that can be used for training students and carrying out research in process systems engineering. The setup allows one to study a variety of process systems engineering concepts such as design feasibility, design flexibility, control configuration selection, parameter estimation, process and…

  4. An Information System Development Method Connecting Business Process Modeling and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao

    Business process modeling (BPM) is gaining attention as a means of analyzing and improving business processes. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value as a TO-BE model. However, techniques that seamlessly connect the business process improvement obtained by BPM to the implementation of the information system are rarely reported. If the business model obtained by BPM is converted into UML and the implementation is carried out using UML techniques, an improvement in the efficiency of information system implementation can be expected. In this paper, we describe a system development method that converts the process model obtained by BPM into UML, and the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case in which the system is implemented by the conventional UML technique without going via BPM.

  5. Understanding how replication processes can maintain systems away from equilibrium using Algorithmic Information Theory.

    PubMed

    Devine, Sean D

    2016-02-01

    Replication can be envisaged as a computational process that is able to generate and maintain order far from equilibrium. Replication processes can self-regulate, as the drive to replicate can counter degradation processes that impact a system. The capability of replicated structures to access high-quality energy and eject disorder allows Landauer's principle, in conjunction with Algorithmic Information Theory, to quantify the entropy requirements to maintain a system far from equilibrium. Using Landauer's principle, where destabilising processes operating under the second law of thermodynamics change the information content, or algorithmic entropy, of a system by ΔH bits, replication processes can access order, eject disorder, and counter the change without outside intervention. Both diversity in replicated structures and the coupling of different replicated systems increase the ability of the system (or systems) to self-regulate in a changing environment, as adaptation processes select those structures that use resources more efficiently. At the level of the structure, as selection processes minimise the information loss, the irreversibility is minimised. While each structure that emerges can be said to be more entropically efficient, as such replicating structures proliferate the dissipation of the system as a whole is higher than would be the case for inert or simpler structures. While a detailed application to most real systems would be difficult, the approach may well be useful in understanding incremental changes to real systems and providing broad descriptions of system behaviour. Copyright © 2016 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
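
    For concreteness, the Landauer bound invoked above, relating an algorithmic-entropy change of ΔH bits to a minimum energy cost, is the standard

        E_{\min} \;\geq\; k_B \, T \ln 2 \cdot \Delta H

    where k_B is Boltzmann's constant and T is the temperature of the surroundings; at T ≈ 300 K this is roughly 2.9 × 10⁻²¹ J per bit. The inequality is quoted here only as the textbook form of the principle, not as a result of the paper.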

  6. A global "imaging'' view on systems approaches in immunology.

    PubMed

    Ludewig, Burkhard; Stein, Jens V; Sharpe, James; Cervantes-Barragan, Luisa; Thiel, Volker; Bocharov, Gennady

    2012-12-01

    The immune system exhibits an enormous complexity. High throughput methods such as the "-omic'' technologies generate vast amounts of data that facilitate dissection of immunological processes at ever finer resolution. Using high-resolution data-driven systems analysis, causal relationships between complex molecular processes and particular immunological phenotypes can be constructed. However, processes in tissues, organs, and the organism itself (so-called higher level processes) also control and regulate the molecular (lower level) processes. Reverse systems engineering approaches, which focus on the examination of the structure, dynamics and control of the immune system, can help to understand the construction principles of the immune system. Such integrative mechanistic models can properly describe, explain, and predict the behavior of the immune system in health and disease by combining both higher and lower level processes. Moving from molecular and cellular levels to a multiscale systems understanding requires the development of methodologies that integrate data from different biological levels into multiscale mechanistic models. In particular, 3D imaging techniques and 4D modeling of the spatiotemporal dynamics of immune processes within lymphoid tissues are central for such integrative approaches. Both dynamic and global organ imaging technologies will be instrumental in facilitating comprehensive multiscale systems immunology analyses as discussed in this review. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Applications of massively parallel computers in telemetry processing

    NASA Technical Reports Server (NTRS)

    El-Ghazawi, Tarek A.; Pritchard, Jim; Knoble, Gordon

    1994-01-01

    Telemetry processing refers to the reconstruction of full-resolution raw instrumentation data with the artifacts of space and ground recording and transmission removed. Being the first processing phase applied to satellite data, this process is also referred to as level-zero processing. This study investigates the use of massively parallel computing technology for providing level-zero processing to spaceflights that adhere to the recommendations of the Consultative Committee for Space Data Systems (CCSDS). The workload characteristics of level-zero processing are used to identify processing requirements for high-performance computing systems. An example of level-zero functions on a SIMD MPP, such as the MasPar, is discussed. The requirements in this paper are based in part on the Earth Observing System (EOS) Data and Operations System (EDOS).
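
    A large part of level-zero processing, ordering packets and removing duplicates after merging overlapping passes, is naturally data parallel, which is what makes SIMD machines attractive for it. The NumPy sketch below illustrates that step with simplified, made-up packet fields (application ID and sequence count); it is an illustration of the data parallelism, not the EDOS or MasPar implementation.

        import numpy as np

        # Simplified packet headers: application ID and sequence count (illustrative only).
        apid = np.array([11, 11, 12, 11, 12, 11])
        seq  = np.array([ 2,  1,  7,  1,  8,  3])

        # Order by (apid, seq); np.lexsort takes keys minor-to-major.
        order = np.lexsort((seq, apid))
        apid, seq = apid[order], seq[order]

        # Drop consecutive duplicates (same apid and sequence count).
        keep = np.ones(len(seq), dtype=bool)
        keep[1:] = (apid[1:] != apid[:-1]) | (seq[1:] != seq[:-1])
        print(list(zip(apid[keep], seq[keep])))   # [(11,1), (11,2), (11,3), (12,7), (12,8)]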

  8. The Systems Engineering Process for Human Support Technology Development

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    Systems engineering is designing and optimizing systems. This paper reviews the systems engineering process and indicates how it can be applied in the development of advanced human support systems. Systems engineering develops the performance requirements, subsystem specifications, and detailed designs needed to construct a desired system. Systems design is difficult, requiring both art and science and balancing human and technical considerations. The essential systems engineering activity is trading off and compromising between competing objectives such as performance and cost, schedule and risk. Systems engineering is not a complete independent process. It usually supports a system development project. This review emphasizes the NASA project management process as described in NASA Procedural Requirement (NPR) 7120.5B. The process is a top down phased approach that includes the most fundamental activities of systems engineering - requirements definition, systems analysis, and design. NPR 7120.5B also requires projects to perform the engineering analyses needed to ensure that the system will operate correctly with regard to reliability, safety, risk, cost, and human factors. We review the system development project process, the standard systems engineering design methodology, and some of the specialized systems analysis techniques. We will discuss how they could apply to advanced human support systems development. The purpose of advanced systems development is not directly to supply human space flight hardware, but rather to provide superior candidate systems that will be selected for implementation by future missions. The most direct application of systems engineering is in guiding the development of prototype and flight experiment hardware. However, anticipatory systems engineering of possible future flight systems would be useful in identifying the most promising development projects.

  9. Test processing system (SEE)

    NASA Technical Reports Server (NTRS)

    Gaulene, P.

    1986-01-01

    The SEE data processing system, developed in 1985, manages and processes test results. General information is provided on the SEE system: objectives, characteristics, basic principles, general organization, and operation. Full documentation is accessible by computer using the HELP SEE command.

  10. Coal conversion systems design and process modeling. Volume 1: Application of MPPR and Aspen computer models

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the Advanced System for Process Engineering (ASPEN) computer program were selected from available steady-state and dynamic models. The MPPM was selected to serve as the basis for development of the system-level design model structure because it provided the capability for process block material and energy balances and high-level system sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) that could be extracted and used in the system design modeling program. While the ASPEN physical property calculation routines are capable of generating the physical properties required for process simulation, not all required physical property data are available, and the missing data must be user-entered.

  11. Applications of High-speed motion analysis system on Solid Rocket Motor (SRM)

    NASA Astrophysics Data System (ADS)

    Liu, Yang; He, Guo-qiang; Li, Jiang; Liu, Pei-jin; Chen, Jian

    2007-01-01

    The high-speed motion analysis system can record images at up to 12,000 fps, which are then analyzed with the image processing system. The system stores data and images directly in electronic memory, which is convenient for management and analysis. The high-speed motion analysis system and the X-ray radiography system were combined to establish a high-speed, real-time X-ray radiography system that can diagnose and measure dynamic, high-speed processes inside opaque structures. Image processing software was developed to improve the quality of the original images and to extract more precise information. Typical applications of the high-speed motion analysis system to solid rocket motors (SRM) are introduced in this paper. The system was used to study anomalous combustion of solid propellant grains with defects, real-time measurement of insulator erosion, the explosion incision process of a motor, the structure and wave character of the plume during ignition and flameout, end burning of solid propellant, the flame front, and the compatibility between airplane and missile during missile launch, and significant results were achieved through this research. For these applications, key problems that degraded image quality, such as motor vibration, electrical source instability, geometric aberration, and noise disturbance, were solved. The image processing software improved the capability of measuring image characteristics. The experimental results showed that the system is a powerful facility for studying instantaneous, high-speed processes in solid rocket motors, and its capability continues to grow with the development of image processing techniques.

  12. Ku-band signal design study. [space shuttle orbiter data processing network

    NASA Technical Reports Server (NTRS)

    Rubin, I.

    1978-01-01

    Analytical tools, methods, and techniques for assessing the design and performance of the Space Shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior, synchronization, and network reliability. The structure of the data processing network is described, as well as the system operating principles and the network configuration, and the characteristics of the computer systems are indicated. System reliability measures are defined and studied, and system and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.
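
    Two of the quantities such an assessment works with, queueing delay and path reliability, can be illustrated with textbook formulas; the Python sketch below uses the M/M/1 mean response time and a series-system reliability product purely as generic examples, not as the report's actual models or parameters.

        def mm1_response_time(arrival_rate, service_rate):
            """Mean time in an M/M/1 queue, T = 1 / (mu - lambda); requires lambda < mu."""
            if arrival_rate >= service_rate:
                raise ValueError("queue is unstable")
            return 1.0 / (service_rate - arrival_rate)

        def path_reliability(component_reliabilities):
            """A communication path fails if any component on it fails (series system)."""
            r = 1.0
            for ri in component_reliabilities:
                r *= ri
            return r

        print(mm1_response_time(arrival_rate=80.0, service_rate=100.0))  # 0.05 time units
        print(path_reliability([0.999, 0.995, 0.99]))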

  13. MIUS wastewater technology evaluation

    NASA Technical Reports Server (NTRS)

    Poradek, J. C.

    1976-01-01

    A modular integrated utility system wastewater-treatment process is described. Research in the field of wastewater treatment is reviewed, treatment processes are specified and evaluated, and recommendations for system use are made. The treatment processes evaluated are in the broad categories of preparatory, primary, secondary, and tertiary treatment, physical-chemical processing, dissolved-solids removal, disinfection, sludge processing, and separate systems. Capital, operating, and maintenance costs are estimated, and extensive references are given.

  14. Introduction to Radar Signal and Data Processing: The Opportunity

    DTIC Science & Technology

    2006-09-01

    Key words: radar, signal processing, data processing, adaptivity, space-time adaptive processing, knowledge-based systems, CFAR. SUMMARY: This paper introduces the lecture series dedicated to knowledge-based radar signal and data processing. Knowledge-based expert system (KBS) is in the realm of

  15. [Construction of NIRS-based process analytical system for production of salvianolic acid for injection and relative discussion].

    PubMed

    Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-10-01

    Currently, near infrared spectroscopy (NIRS) is considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection was introduced. The design of the process analytical system was described in detail, including the selection of monitored processes and testing modes, and the potential risks that should be avoided. Moreover, the development of the related technologies was also presented, including the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' experience of research and work, several issues in the application of NIRS to process monitoring and control in TCM production were then raised, and some potential solutions were also discussed. The issues include building a technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospects for the application of NIRS in the TCM industry were put forward. Copyright© by the Chinese Pharmaceutical Association.

  16. Reading comprehension and working memory in learning-disabled readers: Is the phonological loop more important than the executive system?

    PubMed

    Swanson, H L

    1999-01-01

    This investigation explores the contribution of two working memory systems (the articulatory loop and the central executive) to the performance differences between learning-disabled (LD) and skilled readers. Performances of LD, chronological age (CA) matched, and reading level (RL) matched children were compared on measures of phonological processing accuracy and speed (articulatory system), long-term memory (LTM) accuracy and speed, and executive processing. The results indicated that (a) LD readers were inferior on measures of articulatory, LTM, and executive processing; (b) LD readers were superior to RL readers on measures of executive processing, but were comparable to RL readers on measures of the articulatory and LTM systems; (c) executive processing differences remained significant between LD and CA-matched children when measures of reading comprehension, articulatory processes, and LTM processes were partialed from the analysis; and (d) executive processing contributed significant variance to reading comprehension when measures of the articulatory and LTM systems were entered into a hierarchical regression model. In summary, LD readers experience constraints in the articulatory and LTM systems, but these constraints mediate only some of the influence of executive processing on reading comprehension. Further, LD readers suffer executive processing problems that are not specific to their reading comprehension problems. Copyright 1999 Academic Press.

  17. Putting the Power of Configuration in the Hands of the Users

    NASA Technical Reports Server (NTRS)

    Al-Shihabi, Mary-Jo; Brown, Mark; Rigolini, Marianne

    2011-01-01

    The goal was to reduce the overall cost of human space flight while maintaining the most demanding standards for safety and mission success. In support of this goal, a project team was chartered to replace 18 legacy Space Shuttle nonconformance processes and systems with one fully integrated system. Problem Reporting and Corrective Action (PRACA) processes provide a closed-loop system for the identification, disposition, resolution, closure, and reporting of all Space Shuttle hardware/software problems. PRACA processes are integrated throughout the Space Shuttle organizational processes and are critical to assuring a safe and successful program. The primary project objectives were to develop a fully integrated system that provides an automated workflow with electronic signatures, to support multiple NASA programs and contracts with a single "system" architecture, and to define standard processes, implement best practices, and minimize process variations.

  18. Providing security for automated process control systems at hydropower engineering facilities

    NASA Astrophysics Data System (ADS)

    Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.

    2016-12-01

    This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy of improving cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is consideration of interrelations and cyber threats, arising when SCADA is integrated with the unified enterprise information system.

  19. The Interaction of Spacecraft Cabin Atmospheric Quality and Water Processing System Performance

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Croomes, Scott D. (Technical Monitor)

    2002-01-01

    Although designed to remove organic contaminants from a variety of waste water streams, the planned U.S.- and present Russian-provided water processing systems onboard the International Space Station (ISS) have capacity limits for some of the more common volatile cleaning solvents used for housekeeping purposes. Using large quantities of volatile cleaning solvents during the ground processing and in-flight operational phases of a crewed spacecraft such as the ISS can lead to significant challenges to the water processing systems. To understand the challenges facing the management of water processing capacity, the relationship between cabin atmospheric quality and humidity condensate loading is presented. This relationship is developed as a tool to determine the cabin atmospheric loading that may compromise water processing system performance. A comparison of cabin atmospheric loading with volatile cleaning solvents from ISS, Mir, and Shuttle are presented to predict acceptable limits to maintain optimal water processing system performance.

  20. Woods Hole Image Processing System Software implementation; using NetCDF as a software interface for image processing

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    The Branch of Atlantic Marine Geology has been involved in the collection, processing, and digital mosaicking of high-, medium-, and low-resolution side-scan sonar data during the past 6 years. In the past, processing and digital mosaicking were accomplished with a dedicated, shore-based computer system. With the need to process side-scan data in the field, and with the increased power and reduced cost of major workstations, a need was identified for an image processing package on a UNIX-based computer system that could be utilized in the field as well as be more generally available to Branch personnel. This report describes the initial development of that package, referred to as the Woods Hole Image Processing System (WHIPS). The software was developed using the Unidata NetCDF software interface to allow data to be more readily portable between different computer operating systems.
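
    The portability idea, carrying imagery in a self-describing NetCDF container that any NetCDF-aware tool can read, can be illustrated with the Unidata netCDF4 Python bindings as in the sketch below. This is only an analogy: WHIPS itself was not written this way, and the variable names, dimensions, and attributes are invented for the example.

        import numpy as np
        from netCDF4 import Dataset   # Unidata NetCDF bindings for Python (assumed installed)

        image = (np.random.rand(256, 512) * 255).astype("u1")   # stand-in side-scan mosaic

        with Dataset("mosaic.nc", "w") as ds:                    # self-describing container
            ds.createDimension("row", image.shape[0])
            ds.createDimension("col", image.shape[1])
            var = ds.createVariable("backscatter", "u1", ("row", "col"))
            var[:] = image
            var.units = "digital_number"
            ds.title = "example side-scan mosaic"

        with Dataset("mosaic.nc") as ds:                         # any NetCDF-aware tool can read it back
            print(ds.variables["backscatter"].shape, ds.title)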

  1. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media, and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
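
    The claimed sequence (suspend, save network state, save process state, recreate connections, restart) can be mimicked conceptually in a single-host Python simulation, as below. This is only a toy: real checkpoint/restart of virtualized processes involves kernel-level state capture, and the classes, file name, and fields here are all hypothetical.

        import pickle

        class WorkerState:
            """Toy stand-in for one process of a distributed application."""
            def __init__(self, name, peers):
                self.name = name
                self.peers = peers          # network state: who this process talks to
                self.progress = 0           # process state: application progress

        def checkpoint(workers, path="checkpoint.pkl"):
            """Suspend (implicit here), then save network and process state together."""
            with open(path, "wb") as f:
                pickle.dump(workers, f)

        def restart(path="checkpoint.pkl"):
            """Recreate connection bookkeeping and resume from the stored process state."""
            with open(path, "rb") as f:
                workers = pickle.load(f)
            for w in workers:
                print(f"reconnecting {w.name} to {w.peers}, resuming at step {w.progress}")
            return workers

        if __name__ == "__main__":
            app = [WorkerState("w0", ["w1"]), WorkerState("w1", ["w0"])]
            app[0].progress = 42
            checkpoint(app)
            restart()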

  2. 42 CFR 433.116 - FFP for operation of mechanized claims processing and information retrieval systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and information retrieval systems. 433.116 Section 433.116 Public Health CENTERS FOR MEDICARE... FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval Systems § 433.116 FFP for operation of mechanized claims processing and information retrieval systems. (a) Subject to 42 CFR 433.113(c...

  3. Dual Systems Competence [Image Omitted] Procedural Processing: A Relational Developmental Systems Approach to Reasoning

    ERIC Educational Resources Information Center

    Ricco, Robert B.; Overton, Willis F.

    2011-01-01

    Many current psychological models of reasoning minimize the role of deductive processes in human thought. In the present paper, we argue that deduction is an important part of ordinary cognition and we propose that a dual systems Competence [image omitted] Procedural processing model conceptualized within relational developmental systems theory…

  4. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  5. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  6. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  7. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  8. 49 CFR 232.503 - Process to introduce new brake system technology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Process to introduce new brake system technology... Technology § 232.503 Process to introduce new brake system technology. (a) Pursuant to the procedures... brake system technology, prior to implementing the plan. (b) Each railroad shall complete a pre-revenue...

  9. Automated processing of whole blood units: operational value and in vitro quality of final blood components

    PubMed Central

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    Background The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Materials and methods Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. Results The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. Discussion These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement. PMID:22044958

  10. Automated processing of whole blood units: operational value and in vitro quality of final blood components.

    PubMed

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

    The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. Yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. The Atreus 3C system enabled higher throughput while requiring less space and employee time by decreasing the amount of equipment and processing time per unit of whole blood processed. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing a high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement.

  11. Software architecture for intelligent image processing using Prolog

    NASA Astrophysics Data System (ADS)

    Jones, Andrew C.; Batchelor, Bruce G.

    1994-10-01

    We describe a prototype system for interactive image processing using Prolog, implemented by the first author on an Apple Macintosh computer. This system is inspired by Prolog+, but differs from it in two particularly important respects. The first is that whereas Prolog+ assumes the availability of dedicated image processing hardware, with which the Prolog system communicates, our present system implements image processing functions in software using the C programming language. The second difference is that although our present system supports Prolog+ commands, these are implemented in terms of lower-level Prolog predicates which provide a more flexible approach to image manipulation. We discuss the impact of the Apple Macintosh operating system upon the implementation of the image-processing functions, and the interface between these functions and the Prolog system. We also explain how the Prolog+ commands have been implemented. The system described in this paper is a fairly early prototype, and we outline how we intend to develop the system, a task which is expedited by the extensible architecture we have implemented.

  12. VerifEYE: a real-time meat inspection system for the beef processing industry

    NASA Astrophysics Data System (ADS)

    Kocak, Donna M.; Caimi, Frank M.; Flick, Rick L.; Elharti, Abdelmoula

    2003-02-01

    Described is a real-time meat inspection system developed for the beef processing industry by eMerge Interactive. Designed to detect and localize trace amounts of contamination on cattle carcasses in the packing process, the system affords the beef industry an accurate, high-speed, passive optical method of inspection. Using a method patented by the United States Department of Agriculture and Iowa State University, the system takes advantage of fluorescing chlorophyll found in the animal's diet, and therefore in the digestive tract, to allow detection and imaging of contaminated areas that may harbor potentially dangerous microbial pathogens. Featuring real-time image processing and documentation of performance, the system can be easily integrated into a processing facility's Hazard Analysis and Critical Control Point quality assurance program. This paper describes the VerifEYE carcass inspection and removal verification system. Results indicating the feasibility of the method, as well as field data collected using a prototype system during four university trials conducted in 2001, are presented. Two successful demonstrations using the prototype system were held at a major U.S. meat processing facility in early 2002.

  13. Automated process control for plasma etching

    NASA Astrophysics Data System (ADS)

    McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn

    1992-06-01

    This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is to establish a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule-based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This is achieved by building a process control system (PCS) with the following characteristics. The system monitors the performance of the process by obtaining and analyzing data relating to the appropriate process variables: process sensor/status signals are input into an SPC module, and if trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using the pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.
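
    The pattern-then-rules flow can be sketched as a regression over the last seven control points whose slope, if it indicates a trend, is handed to a small rule base for advice. The thresholds, rule texts, and data below are invented for illustration and are not PLETCHSY's actual rules.

        from statistics import mean

        def trend_slope(points):
            """Least-squares slope over the last seven control points."""
            xs = range(len(points))
            xbar, ybar = mean(xs), mean(points)
            num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, points))
            den = sum((x - xbar) ** 2 for x in xs)
            return num / den

        # Hypothetical diagnostic rules keyed on the trend pattern.
        RULES = [
            (lambda s: s > 0.5,  "etch rate drifting up: check RF power calibration"),
            (lambda s: s < -0.5, "etch rate drifting down: check gas flow / chamber seasoning"),
        ]

        def diagnose(last_seven):
            slope = trend_slope(last_seven)
            advice = [msg for cond, msg in RULES if cond(slope)]
            return slope, advice or ["process in control"]

        print(diagnose([100.0, 100.8, 101.5, 102.6, 103.1, 104.0, 104.9]))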

  14. RDD-100 and the systems engineering process

    NASA Technical Reports Server (NTRS)

    Averill, Robert D.

    1994-01-01

    An effective systems engineering approach applied through the project life cycle can help Langley produce a better product. This paper demonstrates how an enhanced systems engineering process for in-house flight projects assures that each system will achieve its goals with quality performance and within planned budgets and schedules. This paper also describes how the systems engineering process can be used in combination with available software tools.

  15. The Defense Systems Acquisition and Review Council

    DTIC Science & Technology

    1976-09-15

    THE DEFENSE SYSTEMS ACQUISITION AND REVIEW COUNCIL: A Study of Areas of Consideration Affecting the Functions and Process of Defense Major Systems Acquisition. Subject terms: DSARC -- functions and process; OSDCAIG; Army systems. The Defense Systems Acquisition Review Council (DSARC) was created to assume

  16. Overview of the production of sintered SiC optics and optical sub-assemblies

    NASA Astrophysics Data System (ADS)

    Williams, S.; Deny, P.

    2005-08-01

    The following is an overview of sintered silicon carbide (SSiC) material properties and the processing requirements for manufacturing components for advanced-technology optical systems. The overview compares SSiC material properties to those of typical materials used for optics and optical structures. In addition, it reviews in detail, step by step, the manufacturing processes required to produce optical components. The process overview illustrates the current manufacturing process and concepts for expanding the process size capability, and includes information on the substantial capital equipment employed in the manufacturing of SSiC. This paper also reviews common in-process inspection methodology and design rules. The design rules are used to improve production yield, minimize cost, and maximize the inherent benefits of SSiC for optical systems. Optimizing optical system designs for an SSiC manufacturing process will allow system designers to utilize SSiC as a low-risk, cost-competitive, and fast-cycle-time technology for next-generation optical systems.

  17. Field Artillery Ammunition Processing System (FAAPS) concept evaluation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kring, C.T.; Babcock, S.M.; Watkin, D.C.

    1992-06-01

    The Field Artillery Ammunition Processing System (FAAPS) is an initiative to introduce a palletized load system (PLS) that is transportable with an automated ammunition processing and storage system for use on the battlefield. System proponents have targeted a 20% increase in the ammunition processing rate over the current operation while simultaneously reducing the total number of assigned field artillery battalion personnel by 30. The overall objective of the FAAPS Project is the development and demonstration of an improved process to accomplish these goals. The initial phase of the FAAPS Project, and the subject of this study, is the FAAPS concept evaluation. The concept evaluation consists of (1) identifying assumptions and requirements, (2) documenting the process flow, (3) identifying and evaluating technologies available to accomplish the necessary ammunition processing and storage operations, and (4) presenting alternative concepts with associated costs, processing rates, and manpower requirements for accomplishing the operation. This study provides insight into the achievability of the desired objectives.

  18. Field Artillery Ammunition Processing System (FAAPS) concept evaluation study. Ammunition Logistics Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kring, C.T.; Babcock, S.M.; Watkin, D.C.

    1992-06-01

    The Field Artillery Ammunition Processing System (FAAPS) is an initiative to introduce a palletized load system (PLS) that is transportable with an automated ammunition processing and storage system for use on the battlefield. System proponents have targeted a 20% increase in the ammunition processing rate over the current operation while simultaneously reducing the total number of assigned field artillery battalion personnel by 30. The overall objective of the FAAPS Project is the development and demonstration of an improved process to accomplish these goals. The initial phase of the FAAPS Project, and the subject of this study, is the FAAPS concept evaluation. The concept evaluation consists of (1) identifying assumptions and requirements, (2) documenting the process flow, (3) identifying and evaluating technologies available to accomplish the necessary ammunition processing and storage operations, and (4) presenting alternative concepts with associated costs, processing rates, and manpower requirements for accomplishing the operation. This study provides insight into the achievability of the desired objectives.

  19. Graphics Processing Unit (GPU) implementation of image processing algorithms to improve system performance of the Control, Acquisition, Processing, and Image Display System (CAPIDS) of the Micro-Angiographic Fluoroscope (MAF).

    PubMed

    Vasan, S N Swetadri; Ionita, Ciprian N; Titus, A H; Cartwright, A N; Bednarek, D R; Rudin, S

    2012-02-23

    We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same and the operation on one pixel does not depend upon the result of the operation on another, allowing the entire image to be processed in parallel. GPU hardware was developed for exactly this kind of massively parallel processing, so for an algorithm with a high amount of parallelism a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat-field correction, temporal filtering, image subtraction, roadmap mask generation, and display windowing and leveling. A comparison between the previous and the upgraded versions of CAPIDS is presented to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements in timing and frame rate have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure, and automatic image windowing and leveling during each frame.
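
    The pixel independence that makes these steps GPU-friendly can be seen in a vectorized NumPy stand-in: each output pixel depends only on the corresponding input pixels, so on a GPU each pixel maps naturally to a thread. The correction formula, filter constant, and random test frames below are illustrative assumptions, not the CAPIDS code.

        import numpy as np

        def flat_field(raw, dark, flat):
            """Per-pixel gain/offset correction; every pixel is independent of its neighbours."""
            gain = np.mean(flat - dark) / np.clip(flat - dark, 1e-6, None)
            return (raw - dark) * gain

        def temporal_filter(prev, new, alpha=0.25):
            """Recursive (exponential) temporal filter, again purely per-pixel."""
            return (1 - alpha) * prev + alpha * new

        rng = np.random.default_rng(0)
        dark = rng.normal(10, 1, (512, 512))
        flat = rng.normal(200, 5, (512, 512))
        frame = rng.normal(120, 10, (512, 512))

        corrected = flat_field(frame, dark, flat)
        filtered = temporal_filter(prev=corrected, new=flat_field(frame, dark, flat))
        subtracted = corrected - filtered            # e.g. mask subtraction as in DSA
        print(corrected.shape, float(subtracted.mean()))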

  20. Welding process modelling and control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting welding process modeling and control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, this system is not intended to be used for welding process control.

  1. 42 CFR 433.112 - FFP for design, development, installation or enhancement of mechanized claims processing and...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... enhancement of mechanized claims processing and information retrieval systems. 433.112 Section 433.112 Public... processing and information retrieval systems. (a) Subject to paragraph (c) of this section, FFP is available... enhancement of a mechanized claims processing and information retrieval system only if the APD is approved by...

  2. 42 CFR 433.112 - FFP for design, development, installation or enhancement of mechanized claims processing and...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... enhancement of mechanized claims processing and information retrieval systems. 433.112 Section 433.112 Public... processing and information retrieval systems. (a) Subject to paragraph (c) of this section, FFP is available... enhancement of a mechanized claims processing and information retrieval system only if the APD is approved by...

  3. 42 CFR 433.112 - FFP for design, development, installation or enhancement of mechanized claims processing and...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... enhancement of mechanized claims processing and information retrieval systems. 433.112 Section 433.112 Public... processing and information retrieval systems. (a) Subject to paragraph (c) of this section, FFP is available... enhancement of a mechanized claims processing and information retrieval system only if the APD is approved by...

  4. 42 CFR 433.112 - FFP for design, development, installation or enhancement of mechanized claims processing and...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... enhancement of mechanized claims processing and information retrieval systems. 433.112 Section 433.112 Public... processing and information retrieval systems. (a) Subject to paragraph (c) of this section, FFP is available... enhancement of a mechanized claims processing and information retrieval system only if the APD is approved by...

  5. Autonomous Agents for Dynamic Process Planning in the Flexible Manufacturing System

    NASA Astrophysics Data System (ADS)

    Nik Nejad, Hossein Tehrani; Sugimura, Nobuhiro; Iwamura, Koji; Tanimizu, Yoshitaka

    Rapid changes of market demands and pressures of competition require manufacturers to maintain highly flexible manufacturing systems to cope with a complex manufacturing environment. This paper deals with the development of an agent-based architecture of dynamic systems for incremental process planning in manufacturing systems. In consideration of alternative manufacturing processes and machine tools, the process plans and the schedules of the manufacturing resources are generated incrementally and dynamically. A negotiation protocol is discussed for generating suitable process plans for the target products dynamically and in real time, based on the alternative manufacturing processes. The alternative manufacturing processes are represented by the process plan networks discussed in a previous paper, and suitable process plans are searched for and generated to cope with both dynamic changes of the product specifications and disturbances of the manufacturing resources. We combine heuristic search algorithms over the process plan networks with the negotiation protocols in order to generate suitable process plans in the dynamic manufacturing environment.
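    As a rough illustration of searching a process plan network for a suitable plan, the sketch below runs a Dijkstra-style lowest-cost search over a small hypothetical network whose edges are alternative manufacturing processes, and skips resources marked unavailable to mimic a disturbance; the network, costs, and function names are invented for the example and are not the authors' algorithm.

```python
import heapq

# Hypothetical process plan network: each edge is an alternative manufacturing
# process (next state, machine, cost) from one intermediate product state.
network = {
    "blank": [("rough_milled", "mill_A", 5.0), ("rough_milled", "mill_B", 6.5)],
    "rough_milled": [("drilled", "drill_A", 2.0), ("finished", "machining_center", 7.0)],
    "drilled": [("finished", "grinder", 3.5)],
    "finished": [],
}

def cheapest_plan(network, start, goal, unavailable=frozenset()):
    """Dijkstra search for the lowest-cost process plan, skipping machines that
    are currently unavailable (a stand-in for resource disturbances)."""
    frontier = [(0.0, start, [])]
    best = {}
    while frontier:
        cost, state, plan = heapq.heappop(frontier)
        if state == goal:
            return cost, plan
        if best.get(state, float("inf")) <= cost:
            continue
        best[state] = cost
        for nxt, machine, c in network[state]:
            if machine not in unavailable:
                heapq.heappush(frontier, (cost + c, nxt, plan + [(machine, nxt)]))
    return None

print(cheapest_plan(network, "blank", "finished"))
print(cheapest_plan(network, "blank", "finished", unavailable={"drill_A"}))
```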

  6. Feasibility study of an Integrated Program for Aerospace vehicle Design (IPAD). Volume 1A: Summary

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.; Redhed, D. D.; Kawaguchi, A. S.; Hansen, S. D.; Southall, J. W.

    1973-01-01

    IPAD was defined as a total system oriented to the product design process. This total system was designed to recognize the product design process, individuals and their design process tasks, and the computer-based IPAD System to aid product design. Principal elements of the IPAD System include the host computer and its interactive system software, new executive and data management software, and an open-ended IPAD library of technical programs to match the intended product design process. The basic goal of the IPAD total system is to increase the productivity of the product design organization. Increases in individual productivity were feasible through automation and computer support of routine information handling. Such proven automation can directly decrease cost and flowtime in the product design process.

  7. Spot restoration for GPR image post-processing

    DOEpatents

    Paglieroni, David W; Beer, N. Reginald

    2014-05-20

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.
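    A minimal sketch of the final peak-identification step: flag local maxima above a threshold in a 2D array of post-processed energy values. The array contents, threshold, and neighborhood size are illustrative assumptions, not the patented processing chain.

```python
import numpy as np

def detect_peaks(energy, threshold):
    """Flag pixels that exceed the threshold and are local maxima within their
    3x3 neighborhood; such peaks suggest the presence of a subsurface object."""
    padded = np.pad(energy, 1, mode="constant", constant_values=-np.inf)
    neighborhood_max = np.max(
        [padded[r:r + energy.shape[0], c:c + energy.shape[1]]
         for r in range(3) for c in range(3)], axis=0)
    peaks = (energy >= neighborhood_max) & (energy > threshold)
    return np.argwhere(peaks)

# Illustrative energy frame with one injected bright spot.
rng = np.random.default_rng(1)
frame = rng.normal(0.0, 1.0, (64, 64))
frame[40, 25] += 12.0
print(detect_peaks(frame, threshold=6.0))   # expected: [[40 25]]
```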

  8. Thermal performance of a photographic laboratory process: Solar Hot Water System

    NASA Technical Reports Server (NTRS)

    Walker, J. A.; Jensen, R. N.

    1982-01-01

    The thermal performance of a solar process hot water system is described. The system was designed to supply 22,000 liters (5,500 gallons) per day of 66 C (150 F) process water for photographic processing. The 328 sq m (3,528 sq. ft.) solar field has supplied 58% of the thermal energy for the system. Techniques used for analyzing various thermal values are given. Load and performance factors and the resulting solar contribution are discussed.

  9. The process of development of a prioritization tool for a clinical decision support build within a computerized provider order entry system: Experiences from St Luke's Health System.

    PubMed

    Wolf, Matthew; Miller, Suzanne; DeJong, Doug; House, John A; Dirks, Carl; Beasley, Brent

    2016-09-01

    To establish a process for the development of a prioritization tool for a clinical decision support build within a computerized provider order entry system and concurrently to prioritize alerts for Saint Luke's Health System. The process of prioritizing clinical decision support alerts included (a) consensus sessions to establish a prioritization process and identify clinical decision support alerts through a modified Delphi process and (b) a clinical decision support survey to validate the results. All members of our health system's physician quality organization, Saint Luke's Care as well as clinicians, administrators, and pharmacy staff throughout Saint Luke's Health System, were invited to participate in this confidential survey. The consensus sessions yielded a prioritization process through alert contextualization and associated Likert-type scales. Utilizing this process, the clinical decision support survey polled the opinions of 850 clinicians with a 64.7 percent response rate. Three of the top rated alerts were approved for the pre-implementation build at Saint Luke's Health System: Acute Myocardial Infarction Core Measure Sets, Deep Vein Thrombosis Prophylaxis within 4 h, and Criteria for Sepsis. This study establishes a process for developing a prioritization tool for a clinical decision support build within a computerized provider order entry system that may be applicable to similar institutions. © The Author(s) 2015.

  10. The embodied embedded character of system 1 processing.

    PubMed

    Bellini-Leite, Samuel de Castro

    2013-01-01

    In the last thirty years, a relatively large group of cognitive scientists have begun characterising the mind in terms of two distinct, relatively autonomous systems. To account for paradoxes in empirical results of studies mainly on reasoning, Dual Process Theories were developed. Such Dual Process Theories generally agree that System 1 is rapid, automatic, parallel, and heuristic-based and System 2 is slow, capacity-demanding, sequential, and related to consciousness. While System 2 can still be decently understood from a traditional cognitivist approach, I will argue that it is essential for System 1 processing to be comprehended in an Embodied Embedded approach to Cognition.

  11. The Embodied Embedded Character of System 1 Processing

    PubMed Central

    Bellini-Leite, Samuel de Castro

    2013-01-01

    In the last thirty years, a relatively large group of cognitive scientists have begun characterising the mind in terms of two distinct, relatively autonomous systems. To account for paradoxes in empirical results of studies mainly on reasoning, Dual Process Theories were developed. Such Dual Process Theories generally agree that System 1 is rapid, automatic, parallel, and heuristic-based and System 2 is slow, capacity-demanding, sequential, and related to consciousness. While System 2 can still be decently understood from a traditional cognitivist approach, I will argue that it is essential for System 1 processing to be comprehended in an Embodied Embedded approach to Cognition. PMID:23678245

  12. ISO 9000 and/or Systems Engineering Capability Maturity Model?

    NASA Technical Reports Server (NTRS)

    Gholston, Sampson E.

    2002-01-01

    For businesses and organizations to remain competitive today, they must have processes and systems in place that will allow them to first identify customer needs and then develop products/processes that will meet or exceed the customer's needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products/processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between the functions listed above? ISO 9000 is an international standard for quality assurance of goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document their processes and to follow these documented processes. ISO 9000 gives customers assurance that the suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationship. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, will lead to higher quality products, lower cost products, and shorter development cycles. The System Engineering Capability Maturity Model (SE-CMM) will allow companies to measure their system engineering capability and continuously improve those capabilities. ISO 9000 and SE-CMM seem to have a similar objective, which is to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and SE-CMM and make recommendations for implementation.

  13. Event (error and near-miss) reporting and learning system for process improvement in radiation oncology.

    PubMed

    Mutic, Sasa; Brame, R Scott; Oddiraju, Swetha; Parikh, Parag; Westfall, Melisa A; Hopkins, Merilee L; Medina, Angel D; Danieley, Jonathan C; Michalski, Jeff M; El Naqa, Issam M; Low, Daniel A; Wu, Bin

    2010-09-01

    The value of near-miss and error reporting processes in many industries is well appreciated and typically can be supported with data that have been collected over time. While it is generally accepted that such processes are important in the radiation therapy (RT) setting, studies analyzing the effects of organized reporting and process improvement systems on operation and patient safety in individual clinics remain scarce. The purpose of this work is to report on the design and long-term use of an electronic reporting system in a RT department and compare it to the paper-based reporting system it replaced. A specifically designed web-based system was designed for reporting of individual events in RT and clinically implemented in 2007. An event was defined as any occurrence that could have, or had, resulted in a deviation in the delivery of patient care. The aim of the system was to support process improvement in patient care and safety. The reporting tool was designed so individual events could be quickly and easily reported without disrupting clinical work. This was very important because the system use was voluntary. The spectrum of reported deviations extended from minor workflow issues (e.g., scheduling) to errors in treatment delivery. Reports were categorized based on functional area, type, and severity of an event. The events were processed and analyzed by a formal process improvement group that used the data and the statistics collected through the web-based tool for guidance in reengineering clinical processes. The reporting trends for the first 24 months with the electronic system were compared to the events that were reported in the same clinic with a paper-based system over a seven-year period. The reporting system and the process improvement structure resulted in increased event reporting, improved event communication, and improved identification of clinical areas which needed process and safety improvements. The reported data were also useful for the evaluation of corrective measures and recognition of ineffective measures and efforts. The electronic system was relatively well accepted by personnel and resulted in minimal disruption of clinical work. Event reporting in the quarters with the fewest number of reported events, though voluntary, was almost four times greater than the most events reported in any one quarter with the paper-based system and remained consistent from the inception of the process through the date of this report. However, the acceptance was not universal, validating the need for improved education regarding reporting processes and systematic approaches to reporting culture development. Specially designed electronic event reporting systems in a radiotherapy setting can provide valuable data for process and patient safety improvement and are more effective reporting mechanisms than paper-based systems. Additional work is needed to develop methods that can more effectively utilize reported data for process improvement, including the development of standardized event taxonomy and a classification system for RT.

  14. Health-care process improvement decisions: a systems perspective.

    PubMed

    Walley, Paul; Silvester, Kate; Mountford, Shaun

    2006-01-01

    The paper seeks to investigate decision-making processes within hospital improvement activity, to understand how performance measurement systems influence decisions and potentially lead to unsuccessful or unsustainable process changes. A longitudinal study over a 33-month period investigates key events, decisions and outcomes at one medium-sized hospital in the UK. Process improvement events are monitored using process control methods and by direct observation. The authors took a systems perspective of the health-care processes, ensuring that the impacts of decisions across the health-care supply chain were appropriately interpreted. The research uncovers the ways in which measurement systems disguise failed decisions and encourage managers to take a low-risk approach of "symptomatic relief" when trying to improve performance metrics. This prevents many managers from trying higher risk, sustainable process improvement changes. The behaviour of the health-care system is not understood by many managers and this leads to poor analysis of problem situations. Measurement using time-series methodologies, such as statistical process control are vital for a better understanding of the systems impact of changes. Senior managers must also be aware of the behavioural influence of similar performance measurement systems that discourage sustainable improvement. There is a risk that such experiences will tarnish the reputation of performance management as a discipline. Recommends process control measures as a way of creating an organization memory of how decisions affect performance--something that is currently lacking.
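    The kind of time-series measurement the paper recommends can be illustrated with an individuals (XmR) control chart; the sketch below computes the usual centre line and control limits from a baseline period and flags later points that fall outside them. The waiting-time values are invented for the example, and the 2.66 moving-range factor is the standard XmR constant.

```python
import statistics

def xmr_limits(values):
    """Individuals (XmR) control chart limits: the centre line is the mean and
    the limits are +/- 2.66 times the average moving range."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    centre = statistics.mean(values)
    avg_mr = statistics.mean(moving_ranges)
    return centre, centre - 2.66 * avg_mr, centre + 2.66 * avg_mr

# Illustrative weekly average waiting times (hours) before and after a change.
waits = [8.2, 7.9, 8.5, 8.1, 8.4, 7.8, 8.3, 6.1, 6.0, 6.3, 5.9, 6.2]
centre, lcl, ucl = xmr_limits(waits[:7])          # baseline period only
signals = [w for w in waits[7:] if w < lcl or w > ucl]
print(f"centre={centre:.2f}, limits=({lcl:.2f}, {ucl:.2f}), signals={signals}")
```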

  15. Comparison of Centralized-Manual, Centralized-Computerized, and Decentralized-Computerized Order and Management Information Models for the Turkish Air Force Logistics System.

    DTIC Science & Technology

    1986-09-01

    differentiation between the systems. This study will investigate an appropriate Order Processing and Management Information System (OP&MIS) to link base-level... methodology: 1. Reviewed the current order processing and information model of the TUAF Logistics System (centralized-manual model). 2. Described the... RDS program's order processing and information system (centralized-computerized model). 3. Described the order processing and information system of

  16. [Discussion on research thinking of traditional Chinese medicine standardization system based on whole process quality control].

    PubMed

    Dong, Ling; Sun, Yu; Pei, Wen-Xuan; Dai, Jun-Dong; Wang, Zi-Yu; Pan, Meng; Chen, Jiang-Peng; Wang, Yun

    2017-12-01

    The concept of "Quality by design" indicates that good design for the whole life cycle of pharmaceutical production enables the drug to meet the expected quality requirements. Aiming at the existing problems of the traditional Chinese medicine (TCM) industry, the TCM standardization system is put forward in this paper at the national strategic level, guided by the quality control ideas of the international manufacturing industry and with consideration of the TCM industry's own characteristics and development status. The core of this strategy is to establish five interrelated systems: a multi-indicator system based on the tri-indicator system, a quality standard and specification system for TCM herbal materials and decoction pieces, a quality traceability system, a data monitoring system based on whole-process quality control, and a whole-process quality management system for TCM, and to achieve systematic and scientific study of the whole process in the TCM industry through a "top-level design, stepwise implementation, system integration" workflow. This article analyzes the correlations between the quality standards of all links, establishes standard operating procedures for each link and for the whole process, and constructs a high-standard overall quality management system for TCM industry chains, in order to provide a demonstration for the establishment of a TCM whole-process quality control system and a systematic reference and basis for the standardization strategy in the TCM industry. Copyright© by the Chinese Pharmaceutical Association.

  17. Natural Resource Information System. Volume 1: Overall description

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A prototype computer-based Natural Resource Information System was designed which could store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, the use of remote sensing as a data source, and it is useful at multiple management levels. A survey established current decision making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and processing requirements were established. Processing software was constructed and a data base established using high-altitude imagery and map coverage of selected areas of SE Arizona. Finally a demonstration of system processing functions was conducted utilizing material from the data base.

  18. EOS MLS Science Data Processing System: A Description of Architecture and Capabilities

    NASA Technical Reports Server (NTRS)

    Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.

    2006-01-01

    This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.

  19. Efficient volatile metal removal from low rank coal in gasification, combustion, and processing systems and methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bland, Alan E.; Sellakumar, Kumar Muthusami; Newcomer, Jesse D.

    Efficient coal pre-processing systems (69) integrated with gasification, oxy-combustion, and power plant systems include a drying chamber (28), a volatile metal removal chamber (30), recirculated gases, including recycled carbon dioxide (21), nitrogen (6), and gaseous exhaust (60) for increasing the efficiencies and lowering emissions in various coal processing systems.

  20. Process for predicting structural performance of mechanical systems

    DOEpatents

    Gardner, David R.; Hendrickson, Bruce A.; Plimpton, Steven J.; Attaway, Stephen W.; Heinstein, Martin W.; Vaughan, Courtenay T.

    1998-01-01

    A process for predicting the structural performance of a mechanical system represents the mechanical system by a plurality of surface elements. The surface elements are grouped according to their location in the volume occupied by the mechanical system so that contacts between surface elements can be efficiently located. The process is well suited for efficient practice on multiprocessor computers.
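    A rough sketch of the grouping idea: place surface elements into spatial cells so that contact candidates are only sought among elements in the same or adjacent cells. The element representation (points with ids) and the cell size are simplifying assumptions, not the patented process.

```python
from collections import defaultdict
from itertools import product

def build_grid(elements, cell_size):
    """Group surface elements (here, points with an id) by the spatial cell
    containing them, so contact searches only look at nearby cells."""
    grid = defaultdict(list)
    for eid, (x, y, z) in elements.items():
        cell = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        grid[cell].append(eid)
    return grid

def contact_candidates(grid):
    """Yield pairs of elements located in the same or adjacent cells."""
    seen = set()
    for cell, ids in grid.items():
        for offset in product((-1, 0, 1), repeat=3):
            other = tuple(c + o for c, o in zip(cell, offset))
            for a in ids:
                for b in grid.get(other, []):
                    if a < b and (a, b) not in seen:
                        seen.add((a, b))
                        yield a, b

elements = {1: (0.1, 0.2, 0.0), 2: (0.3, 0.1, 0.1), 3: (5.0, 5.0, 5.0)}
grid = build_grid(elements, cell_size=1.0)
print(list(contact_candidates(grid)))   # only (1, 2) -- element 3 is far away
```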

  1. Application of advanced on-board processing concepts to future satellite communications systems

    NASA Technical Reports Server (NTRS)

    Katz, J. L.; Hoffman, M.; Kota, S. L.; Ruddy, J. M.; White, B. F.

    1979-01-01

    An initial definition of on-board processing requirements for an advanced satellite communications system to service domestic markets in the 1990's is presented. An exemplar system architecture with both RF on-board switching and demodulation/remodulation baseband processing was used to identify important issues related to system implementation, cost, and technology development.

  2. Information technology - a tool for development of the teaching process at the faculty of medicine, university of sarajevo.

    PubMed

    Masic, Izet; Begic, Edin

    2015-04-01

    Information technologies have gradually found their application in the teaching process of the Faculty of Medicine, University of Sarajevo. Online availability of the teaching content is mainly intended for users of the Bologna process. The aim was to present the level of use of information technologies at the Faculty of Medicine, University of Sarajevo, comparing two systems, the old system and the Bologna process, and to present new ways of improving the teaching process using information technology. The study covered the period from 2012 to 2014 and included 365 students from the old system and the Bologna process; it had a prospective character. Students of the old system are older than students of the Bologna process. In both systems a significantly higher number of female students is present. All students have their own computers, usually using the Office software package and web browsers. Visiting social networks was the most common reason for which they used computers. When asked whether they know how to work with databases, 14.6% of students of the old system responded positively, as did 26.2% of students of the Bologna process. Students feel that working with databases is necessary for work in primary health care. On the question of the degree of computerization at the university, there were significant differences between the two systems (p < 0.05). When asked about the possibility of using computers at the school, there were no significant differences between the two systems, and that opportunity has improved from year to year. Students of the Bologna process were more interested in the introduction of information technology than students of the old system. 68.7% of students of the Bologna process of the 2013-2014 generation, and 71.3% of the 2014-2015 generation, believed that the subject of Medical Informatics, under the same or a similar name, should be included in the newly reformed teaching process of the Faculty of Medicine, University of Sarajevo. Information technologies can help the development of the teaching process and represent an attractive and accessible tool in the process of modernization and progress.

  3. The automated system for technological process of spacecraft's waveguide paths soldering

    NASA Astrophysics Data System (ADS)

    Tynchenko, V. S.; Murygin, A. V.; Emilova, O. A.; Bocharov, A. N.; Laptenok, V. D.

    2016-11-01

    The paper addresses automated process control of spacecraft waveguide path soldering by means of induction heating. The peculiarities of the induction soldering process are analyzed and the need to automate the information-control system is identified. The developed automated system controls the product heating process by varying the power supplied to the inductor on the basis of the soldering zone temperature, stabilizing the temperature in a narrow range above the melting point of the solder but below the melting point of the waveguide. Automating the soldering process in this way improves the quality of the waveguides and eliminates burn-throughs. The article shows a block diagram of a software system consisting of five modules and describes its main algorithm. The operation of the automated waveguide path soldering system is also described, explaining its basic functions and limitations. The developed software allows configuration of the measurement equipment, setting and changing of the soldering process parameters, as well as viewing graphs of the temperatures recorded by the system. Results of experimental studies are presented that demonstrate high-quality control of the soldering process and the applicability of the system to automation tasks.
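    A toy sketch of the control idea described above: adjust the inductor power from the measured soldering-zone temperature so that the zone sits in a narrow band above the solder melting point and far below the waveguide melting point. The plant model, material constants, and gains are illustrative assumptions, not the authors' system.

```python
def inductor_power(temperature, t_solder_melt=220.0, t_waveguide_melt=660.0,
                   band=10.0, k_p=50.0, p_max=1000.0):
    """Proportional power command targeting a setpoint slightly above the solder
    melting point while staying far below the waveguide melting point.
    All material constants and gains here are illustrative assumptions."""
    setpoint = t_solder_melt + band                  # e.g. 230 degrees
    power = max(0.0, min(k_p * (setpoint - temperature), p_max))
    if temperature >= t_waveguide_melt - band:       # safety cut-off
        power = 0.0
    return power

# Toy first-order thermal plant: heating proportional to power, loss to ambient.
temperature = 20.0
for _ in range(300):
    power = inductor_power(temperature)
    temperature += 0.01 * power - 0.02 * (temperature - 20.0)
print(round(temperature, 1))   # settles in the band just above the solder melting point
```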

  4. Paperless Procurement: The Impact of Advanced Automation

    DTIC Science & Technology

    1992-09-01

    System. POPS = Paperless Order Processing System; RADMIS = Research and Development Management Information System; SAACONS = Standard Army Automated... order processing system, which then updates the contractor's production (or delivery) scheduling and contract accounting applications. In return, the... used by the DLA's POPS... into an EDI delivery order and pass it directly to the distributor's or manufacturer's order processing system. That

  5. Partitioning in parallel processing of production systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oflazer, K.

    1987-01-01

    This thesis presents research on certain issues related to parallel processing of production systems. It first presents a parallel production system interpreter that has been implemented on a four-processor multiprocessor. This parallel interpreter is based on Forgy's OPS5 interpreter and exploits production-level parallelism in production systems. Runs on the multiprocessor system indicate that it is possible to obtain speed-up of around 1.7 in the match computation for certain production systems when productions are split into three sets that are processed in parallel. The next issue addressed is that of partitioning a set of rules to processors in a parallel interpreter with production-level parallelism, and the extent of additional improvement in performance. The partitioning problem is formulated and an algorithm for approximate solutions is presented. The thesis next presents a parallel processing scheme for OPS5 production systems that allows some redundancy in the match computation. This redundancy enables the processing of a production to be divided into units of medium granularity each of which can be processed in parallel. Subsequently, a parallel processor architecture for implementing the parallel processing algorithm is presented.
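    The partitioning problem can be illustrated with a simple greedy heuristic: assign each production to the currently least-loaded processor, heaviest (by estimated match cost) first. The costs below are invented, and this longest-processing-time heuristic is a generic stand-in rather than the approximation algorithm developed in the thesis.

```python
import heapq

def partition_rules(match_costs, n_processors):
    """Greedy longest-processing-time assignment: give each production (rule)
    to the currently least-loaded processor, heaviest rules first."""
    loads = [(0.0, p, []) for p in range(n_processors)]
    heapq.heapify(loads)
    for rule, cost in sorted(match_costs.items(), key=lambda kv: -kv[1]):
        load, p, rules = heapq.heappop(loads)
        heapq.heappush(loads, (load + cost, p, rules + [rule]))
    return sorted(loads, key=lambda entry: entry[1])

# Illustrative per-production match costs (e.g. measured condition-test counts).
costs = {"r1": 9.0, "r2": 7.0, "r3": 6.0, "r4": 4.0, "r5": 3.0, "r6": 1.0}
for load, proc, rules in partition_rules(costs, 3):
    print(f"processor {proc}: {rules} (load {load})")
```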

  6. Transplant Image Processing Technology under Windows into the Platform Based on MiniGUI

    NASA Astrophysics Data System (ADS)

    Gan, Lan; Zhang, Xu; Lv, Wenya; Yu, Jia

    MFC provides a large number of digital image processing API functions and object-oriented class mechanisms, giving image processing technology strong support in Windows. In embedded systems, however, image processing does not have the MFC environment of Windows because of hardware and software restrictions. Therefore, this paper draws on the experience of image processing technology under Windows and transplants it to the MiniGUI embedded platform. The results show that MiniGUI/Embedded graphical user interface applications for image processing, used in an embedded image processing system, achieve good results.

  7. ProcessGene-Connect: SOA Integration between Business Process Models and Enactment Transactions of Enterprise Software Systems

    NASA Astrophysics Data System (ADS)

    Wasser, Avi; Lincoln, Maya

    In recent years, both practitioners and applied researchers have become increasingly interested in methods for integrating business process models and enterprise software systems through the deployment of enabling middleware. Integrative BPM research has been mainly focusing on the conversion of workflow notations into enacted application procedures, and less effort has been invested in enhancing the connectivity between design level, non-workflow business process models and related enactment systems such as: ERP, SCM and CRM. This type of integration is useful at several stages of an IT system lifecycle, from design and implementation through change management, upgrades and rollout. The paper presents an integration method that utilizes SOA for connecting business process models with corresponding enterprise software systems. The method is then demonstrated through an Oracle E-Business Suite procurement process and its ERP transactions.

  8. System and process for production of magnesium metal and magnesium hydride from magnesium-containing salts and brines

    DOEpatents

    McGrail, Peter B.; Nune, Satish K.; Motkuri, Radha K.; Glezakou, Vassiliki-Alexandra; Koech, Phillip K.; Adint, Tyler T.; Fifield, Leonard S.; Fernandez, Carlos A.; Liu, Jian

    2016-11-22

    A system and process are disclosed for production of consolidated magnesium metal products and alloys with selected densities from magnesium-containing salts and feedstocks. The system and process employ a dialkyl magnesium compound that decomposes to produce the Mg metal product. Energy requirements and production costs are lower than for conventional processing.

  9. An interfaces approach to TES ground data system processing design with the Science Investigator-led Processing System (SIPS)

    NASA Technical Reports Server (NTRS)

    Kurian, R.; Grifin, A.

    2002-01-01

    Developing production-quality software to process the large volumes of scientific data is the responsibility of the TES Ground Data System, which is being developed at the Jet Propulsion Laboratory together with support contractor Raytheon/ITSS. The large data volume and processing requirements of the TES pose significant challenges to the design.

  10. Grinding Wheel System

    DOEpatents

    Malkin, Stephen; Gao, Robert; Guo, Changsheng; Varghese, Biju; Pathare, Sumukh

    2003-08-05

    A grinding wheel system includes a grinding wheel with at least one embedded sensor. The system also includes an adapter disk containing electronics that process signals produced by each embedded sensor and that transmits sensor information to a data processing platform for further processing of the transmitted information.

  11. Grinding Wheel System

    DOEpatents

    Malkin, Stephen; Gao, Robert; Guo, Changsheng; Varghese, Biju; Pathare, Sumukh

    2006-01-10

    A grinding wheel system includes a grinding wheel with at least one embedded sensor. The system also includes an adapter disk containing electronics that process signals produced by each embedded sensor and that transmits sensor information to a data processing platform for further processing of the transmitted information.

  12. 76 FR 16277 - System Restoration Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ... system restoration process. The Commission also approves the NERC's proposal to retire four existing EOP... prepare personnel to enable effective coordination of the system restoration process. The Commission also..., through the Reliability Standard development process, a modification to EOP-005-1 that identifies time...

  13. Reconceptualizing Learning as a Dynamical System.

    ERIC Educational Resources Information Center

    Ennis, Catherine D.

    1992-01-01

    Dynamical systems theory can increase our understanding of the constantly evolving learning process. Current research using experimental and interpretive paradigms focuses on describing the attractors and constraints stabilizing the educational process. Dynamical systems theory focuses attention on critical junctures in the learning process as…

  14. Information Processing Capacity of Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-07-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
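    A small numerical sketch of the capacity idea under simplifying assumptions: drive a tiny echo-state-style network with an i.i.d. input, train linear readouts to reconstruct delayed inputs, and sum the squared correlations. The reservoir, its size, and the delay range are illustrative, and only the linear memory part of the full capacity measure is computed.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, washout = 50, 4000, 200

# Tiny echo-state-style reservoir driven by an i.i.d. input signal.
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius below 1
w_in = rng.normal(0, 1, N)
u = rng.uniform(-1, 1, T)

x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

def delay_capacity(states, u, delay):
    """Squared correlation between the best linear readout of the state and the
    input delayed by `delay` steps -- one term of the summed capacity."""
    X = states[washout + delay:]
    y = u[washout:len(u) - delay] if delay else u[washout:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.corrcoef(X @ coef, y)[0, 1] ** 2

total = sum(delay_capacity(states, u, k) for k in range(30))
print(f"summed linear memory capacity: {total:.1f} (bounded by N = {N})")
```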

  15. Information Processing Capacity of Dynamical Systems

    PubMed Central

    Dambre, Joni; Verstraeten, David; Schrauwen, Benjamin; Massar, Serge

    2012-01-01

    Many dynamical systems, both natural and artificial, are stimulated by time dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This is bounded by the number of linearly independent state variables of the dynamical system, equaling it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory. PMID:22816038

  16. Advanced Manufacturing Systems in Food Processing and Packaging Industry

    NASA Astrophysics Data System (ADS)

    Shafie Sani, Mohd; Aziz, Faieza Abdul

    2013-06-01

    In this paper, several advanced manufacturing systems in the food processing and packaging industry are reviewed, including biodegradable smart packaging and nanocomposites, and advanced automation control systems consisting of fieldbus technology, distributed control systems, and food safety inspection features. The main purposes of current technology in the food processing and packaging industry are discussed, with the major concerns being plant process efficiency, productivity, quality, and safety. These applications were chosen because they are robust, flexible, reconfigurable, and efficient, and they preserve the quality of the food.

  17. The application of digital techniques to the analysis of metallurgical experiments

    NASA Technical Reports Server (NTRS)

    Rathz, T. J.

    1977-01-01

    The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.

  18. Architecture Of High Speed Image Processing System

    NASA Astrophysics Data System (ADS)

    Konishi, Toshio; Hayashi, Hiroshi; Ohki, Tohru

    1988-01-01

    An architecture for a high-speed image processing system corresponding to a new algorithm for shape understanding is proposed, and a hardware system based on the architecture was developed. The main considerations of the architecture are that the processors used should match the processing sequence of the target image and that the developed system should be practical for industrial use. As a result, it was possible to perform each processing step at a speed of 80 nanoseconds per pixel.

  19. A Process Management System for Networked Manufacturing

    NASA Astrophysics Data System (ADS)

    Liu, Tingting; Wang, Huifen; Liu, Linyan

    With the development of computer, communication and network, networked manufacturing has become one of the main manufacturing paradigms in the 21st century. Under the networked manufacturing environment, there exist a large number of cooperative tasks susceptible to alterations, conflicts caused by resources and problems of cost and quality. This increases the complexity of administration. Process management is a technology used to design, enact, control, and analyze networked manufacturing processes. It supports efficient execution, effective management, conflict resolution, cost containment and quality control. In this paper we propose an integrated process management system for networked manufacturing. Requirements of process management are analyzed and architecture of the system is presented. And a process model considering process cost and quality is developed. Finally a case study is provided to explain how the system runs efficiently.

  20. IMAGES: An interactive image processing system

    NASA Technical Reports Server (NTRS)

    Jensen, J. R.

    1981-01-01

    The IMAGES interactive image processing system was created specifically for undergraduate remote sensing education in geography. The system is interactive, relatively inexpensive to operate, almost hardware independent, and responsive to numerous users at one time in a time-sharing mode. Most important, it provides a medium whereby theoretical remote sensing principles discussed in lecture may be reinforced in laboratory as students perform computer-assisted image processing. In addition to its use in academic and short course environments, the system has also been used extensively to conduct basic image processing research. The flow of information through the system is discussed including an overview of the programs.

  1. Lateral position detection and control for friction stir systems

    DOEpatents

    Fleming, Paul; Lammlein, David; Cook, George E.; Wilkes, Don Mitchell; Strauss, Alvin M.; Delapp, David; Hartman, Daniel A.

    2010-12-14

    A friction stir system for processing at least a first workpiece includes a spindle actuator coupled to a rotary tool comprising a rotating member for contacting and processing the first workpiece. A detection system is provided for obtaining information related to a lateral alignment of the rotating member. The detection system comprises at least one sensor for measuring a force experienced by the rotary tool or a parameter related to the force experienced by the rotary tool during processing, wherein the sensor provides sensor signals. A signal processing system is coupled to receive and analyze the sensor signals and determine a lateral alignment of the rotating member relative to a selected lateral position, a selected path, or a direction to decrease a lateral distance relative to the selected lateral position or selected path. In one embodiment, the friction stir system can be embodied as a closed loop tracking system, such as a robot-based tracked friction stir welding (FSW) or friction stir processing (FSP) system.
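    A toy sketch of the closed-loop idea: estimate the lateral offset from a force-derived feature and apply a bounded correction back toward the joint line. The linear force-to-offset model and the gains are invented for illustration and are not the patented detection method.

```python
def lateral_correction(side_force, gain=0.05, max_step=0.5):
    """Estimate a lateral offset from the measured side force (assumed here, for
    illustration, to grow roughly linearly with offset) and return a bounded
    correction step back toward the joint line, in mm."""
    estimated_offset = gain * side_force          # illustrative linear model
    return -max(-max_step, min(estimated_offset, max_step))

# Simulated tracking: the tool starts off the seam and is nudged back each step.
true_offset = 1.2                                 # mm off the weld seam
for _ in range(20):
    side_force = true_offset / 0.05               # inverse of the assumed model
    true_offset += lateral_correction(side_force)
print(round(true_offset, 3))                      # driven toward 0 (on the seam)
```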

  2. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Optical signal-processing systems based on anisotropic media

    NASA Astrophysics Data System (ADS)

    Kiyashko, B. V.

    1995-10-01

    Partially coherent optical systems for signal processing are considered. The transfer functions are formed in these systems by interference of polarised light transmitted by an anisotropic medium. It is shown that such systems can perform various integral transformations of both optical and electric signals, in particular, two-dimensional Fourier and Fresnel transformations, as well as spectral analysis of weak light sources. It is demonstrated that such systems have the highest luminosity and vibration immunity among the systems with interference formation of transfer functions. An experimental investigation is reported of the application of these systems in the processing of signals from a linear hydroacoustic antenna array, and in measurements of the optical spectrum and of the intrinsic noise.

  3. Integrated Main Propulsion System Performance Reconstruction Process/Models

    NASA Technical Reports Server (NTRS)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  4. Development of process control capability through the Browns Ferry Integrated Computer System using Reactor Water Clanup System as an example. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, J.; Mowrey, J.

    1995-12-01

    This report describes the design, development and testing of process controls for selected system operations in the Browns Ferry Nuclear Plant (BFNP) Reactor Water Cleanup System (RWCU) using a Computer Simulation Platform which simulates the RWCU System and the BFNP Integrated Computer System (ICS). This system was designed to demonstrate the feasibility of the soft control (video touch screen) of nuclear plant systems through an operator console. The BFNP Integrated Computer System, which has recently been installed at BFNP Unit 2, was simulated to allow for operator control functions of the modeled RWCU system. The BFNP Unit 2 RWCU system was simulated using the RELAP5 Thermal/Hydraulic Simulation Model, which provided the steady-state and transient RWCU process variables and simulated the response of the system to control system inputs. Descriptions of the hardware and software developed are also included in this report. The testing and acceptance program and results are also detailed in this report. A discussion of potential installation of an actual RWCU process control system in BFNP Unit 2 is included. Finally, this report contains a section on industry issues associated with installation of process control systems in nuclear power plants.

  5. Workflow management systems in radiology

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim

    1998-07-01

    In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. In this paper, we discuss the need for and the benefits of such an approach. The separation of workflow management system and application systems is emphasized, as are the consequences that arise for the architecture of workflow oriented information systems. This includes an appropriate workflow terminology, and the definition of standard interfaces for workflow aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited for a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
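    A minimal sketch of the separation argued for above: workflow-aware client applications talk to an autonomous workflow enactment service only through a small standardized interface. The interface methods, the in-memory service, and the four-step radiology process are invented for illustration.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

@dataclass
class WorkItem:
    """A single activity offered to a client application by the enactment service."""
    case_id: str
    activity: str
    data: dict = field(default_factory=dict)

class WorkflowEnactmentService(ABC):
    """Standardized interface separating process logic from application systems."""
    @abstractmethod
    def start_case(self, process_definition: str, data: dict) -> str: ...
    @abstractmethod
    def worklist(self, participant: str) -> list: ...
    @abstractmethod
    def complete(self, item: WorkItem, result: dict) -> None: ...

class InMemoryEnactmentService(WorkflowEnactmentService):
    """Trivial in-memory stand-in with a fixed, linear radiology process."""
    STEPS = ["register_exam", "acquire_images", "report", "distribute_report"]

    def __init__(self):
        self.cases = {}                          # case id -> index of next step

    def start_case(self, process_definition, data):
        case_id = f"case-{len(self.cases) + 1}"
        self.cases[case_id] = 0
        return case_id

    def worklist(self, participant):
        return [WorkItem(cid, self.STEPS[i]) for cid, i in self.cases.items()
                if i < len(self.STEPS)]

    def complete(self, item, result):
        self.cases[item.case_id] += 1            # advance to the next activity

service = InMemoryEnactmentService()
cid = service.start_case("ct_abdomen", {"patient": "anonymous"})
while service.worklist("radiology"):
    service.complete(service.worklist("radiology")[0], {})
print(cid, "finished")
```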

  6. An architecture for heuristic control of real-time processes

    NASA Technical Reports Server (NTRS)

    Raulefs, P.; Thorndyke, P. W.

    1987-01-01

    Process management combines complementary approaches of heuristic reasoning and analytical process control. Management of a continuous process requires monitoring the environment and the controlled system, assessing the ongoing situation, developing and revising planned actions, and controlling the execution of the actions. For knowledge-intensive domains, process management entails the potentially time-stressed cooperation among a variety of expert systems. By redesigning a blackboard control architecture in an object-oriented framework, researchers obtain an approach to process management that considerably extends blackboard control mechanisms and overcomes limitations of blackboard systems.

  7. Two improved coherent optical feedback systems for optical information processing

    NASA Technical Reports Server (NTRS)

    Lee, S. H.; Bartholomew, B.; Cederquist, J.

    1976-01-01

    Coherent optical feedback systems are Fabry-Perot interferometers modified to perform optical information processing. Two new systems based on plane parallel and confocal Fabry-Perot interferometers are introduced. The plane parallel system can be used for contrast control, intensity level selection, and image thresholding. The confocal system can be used for image restoration and solving partial differential equations. These devices are simpler and less expensive than previous systems. Experimental results are presented to demonstrate their potential for optical information processing.

  8. Low-cost Landsat digital processing system for state and local information systems

    NASA Technical Reports Server (NTRS)

    Hooper, N. J.; Spann, G. W.; Faust, N. L.; Paludan, C. T. N.

    1979-01-01

    The paper details a minicomputer-based system which is well within the budget of many state, regional, and local agencies that previously could not afford digital processing capability. In order to achieve this goal a workable small-scale Landsat system is examined to provide low-cost automated processing. It is anticipated that the alternative systems will be based on a single minicomputer, but that the peripherals will vary depending on the capability emphasized in a particular system.

  9. Graphical Language for Data Processing

    NASA Technical Reports Server (NTRS)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
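    A minimal sketch of the interpreted execution model described above: nodes hold processing functions, the inputs lists act as the virtual wires, and the executor traverses the graph in dependency order. The Node class, the topological-sort executor, and the toy lidar-like pipeline are illustrative assumptions, not the .NET implementation.

```python
from graphlib import TopologicalSorter

class Node:
    def __init__(self, name, func, inputs=()):
        self.name, self.func, self.inputs = name, func, list(inputs)

def execute(nodes):
    """Interpret the process graph: run each node after its inputs, passing
    upstream results along the 'wires' (the inputs lists)."""
    by_name = {n.name: n for n in nodes}
    order = TopologicalSorter({n.name: set(n.inputs) for n in nodes}).static_order()
    results = {}
    for name in order:
        node = by_name[name]
        results[name] = node.func(*(results[i] for i in node.inputs))
    return results

# Toy lidar-like pipeline: load -> filter -> grid, with a report consuming both.
graph = [
    Node("load", lambda: [1.0, 2.0, 50.0, 3.0]),
    Node("filter", lambda pts: [p for p in pts if p < 10.0], ["load"]),
    Node("grid", lambda pts: {int(p): p for p in pts}, ["filter"]),
    Node("report", lambda raw, g: f"{len(raw)} points, {len(g)} cells", ["load", "grid"]),
]
print(execute(graph)["report"])   # "4 points, 3 cells"
```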

  10. Radar signal pre-processing to suppress surface bounce and multipath

    DOEpatents

    Paglieroni, David W; Mast, Jeffrey E; Beer, N. Reginald

    2013-12-31

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes that return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.

  11. Analysis of exergy efficiency of a super-critical compressed carbon dioxide energy-storage system based on the orthogonal method.

    PubMed

    He, Qing; Hao, Yinping; Liu, Hui; Liu, Wenyi

    2018-01-01

    Super-critical carbon dioxide energy-storage (SC-CCES) technology is a new type of gas energy-storage technology. This paper used the orthogonal method and variance analysis to assess the significance of the factors affecting the thermodynamic characteristics of the SC-CCES system and identified the significant factors and interactions in the energy-storage process, the energy-release process and the whole energy-storage system. Results show that the interactions among the components have little influence on the energy-storage process, the energy-release process and the whole energy-storage process of the SC-CCES system; the significant factors relate mainly to the characteristics of the system components themselves, which provides a reference for optimizing the thermal properties of the energy-storage system.
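    A toy sketch of the orthogonal-design-plus-variance-analysis idea: evaluate a response over an L4(2^3) orthogonal array and apportion the total variation to each factor through its sum of squares. The response function and factor effects are invented stand-ins, not the SC-CCES thermodynamic model.

```python
import numpy as np

# L4(2^3) orthogonal array: three two-level factors in four runs.
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

def response(levels):
    """Toy efficiency stand-in: factor A matters most, factor C barely at all."""
    a, b, c = levels
    return 60.0 + 8.0 * a + 3.0 * b + 0.5 * c

y = np.array([response(run) for run in L4])
grand_mean = y.mean()

# Sum of squares for each factor: compare runs at level 1 against level 0.
for j, name in enumerate("ABC"):
    level_means = [y[L4[:, j] == lvl].mean() for lvl in (0, 1)]
    ss = sum(2 * (m - grand_mean) ** 2 for m in level_means)   # 2 runs per level
    print(f"factor {name}: sum of squares = {ss:.2f}")
```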

  12. Spatially assisted down-track median filter for GPR image post-processing

    DOEpatents

    Paglieroni, David W; Beer, N Reginald

    2014-10-07

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.
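    A minimal sketch of a down-track median filter on a 2D B-scan (depth by down-track position): each sample is replaced by the median over a short down-track window, which suppresses isolated spikes while keeping reflections that extend along the track. The window length and synthetic data are illustrative, not the patented spatially assisted filter.

```python
import numpy as np

def down_track_median(image, window=5):
    """Median filter applied along the down-track axis (columns) only."""
    half = window // 2
    padded = np.pad(image, ((0, 0), (half, half)), mode="edge")
    stacked = np.stack([padded[:, k:k + image.shape[1]] for k in range(window)])
    return np.median(stacked, axis=0)

# Illustrative B-scan: a horizontal reflection plus one isolated noise spike.
rng = np.random.default_rng(2)
bscan = rng.normal(0.0, 0.1, (32, 64))
bscan[10, :] += 1.0       # extended reflection along the track (should be kept)
bscan[20, 30] += 5.0      # isolated spike (should be removed by the median)
filtered = down_track_median(bscan)
print(round(filtered[10, 32], 2), round(filtered[20, 30], 2))   # ~1 and ~0
```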

  13. A model system for targeted drug release triggered by biomolecular signals logically processed through enzyme logic networks.

    PubMed

    Mailloux, Shay; Halámek, Jan; Katz, Evgeny

    2014-03-07

    A new Sense-and-Act system was realized by the integration of a biocomputing system, performing analytical processes, with a signal-responsive electrode. A drug-mimicking release process was triggered by biomolecular signals processed by different logic networks, including three concatenated AND logic gates or a 3-input OR logic gate. Biocatalytically produced NADH, controlled by various combinations of input signals, was used to activate the electrochemical system. A biocatalytic electrode associated with signal-processing "biocomputing" systems was electrically connected to another electrode coated with a polymer film, which was dissolved upon the formation of negative potential releasing entrapped drug-mimicking species, an enzyme-antibody conjugate, operating as a model for targeted immune-delivery and consequent "prodrug" activation. The system offers great versatility for future applications in controlled drug release and personalized medicine.
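    A sketch of the logic layer only, under the assumption that biochemical signals have already been digitized against a threshold: three concatenated AND gates or a 3-input OR gate decide whether the release step is triggered. Gate names, thresholds, and signal values are illustrative; the enzyme chemistry and electrochemistry are not modeled.

```python
def concatenated_and(inputs):
    """Three concatenated AND gates over four inputs: ((i1 AND i2) AND i3) AND i4."""
    i1, i2, i3, i4 = inputs
    return ((i1 and i2) and i3) and i4

def or_gate_3(inputs):
    """3-input OR gate."""
    return any(inputs)

def release_decision(signal_levels, network, threshold=0.5):
    """Digitize biochemical signal levels against a threshold, run them through
    the chosen logic network, and decide whether to trigger release."""
    bits = [level > threshold for level in signal_levels]
    return bool(network(bits))

print(release_decision([0.9, 0.8, 0.7, 0.6], concatenated_and))   # True -> release
print(release_decision([0.1, 0.0, 0.9], or_gate_3))               # True -> release
print(release_decision([0.1, 0.2, 0.3, 0.9], concatenated_and))   # False -> no release
```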

  14. DMD: a digital light processing application to projection displays

    NASA Astrophysics Data System (ADS)

    Feather, Gary A.

    1989-01-01

    Revolutionary technologies achieve rapid product and subsequent business diffusion only when the inventors focus on technology application, maturation, and proliferation. A revolutionary technology is emerging with micro-electromechanical systems (MEMS). MEMS are being developed by leveraging mature semiconductor processing coupled with mechanical systems into complete, integrated, useful systems. The digital micromirror device (DMD), a Texas Instruments-invented MEMS, has focused on its application to projection displays. The DMD has demonstrated its application as a digital light processor, processing and producing compelling computer and video projection displays. This tutorial discusses requirements in the projection display market and the potential solutions offered by this digital light processing system. The seminar includes an evaluation of the market, system needs, design, fabrication, application, and performance results of a system using digital light processing solutions.

  15. Analysis of exergy efficiency of a super-critical compressed carbon dioxide energy-storage system based on the orthogonal method

    PubMed Central

    He, Qing; Liu, Hui; Liu, Wenyi

    2018-01-01

    Super-critical carbon dioxide energy-storage (SC-CCES) technology is a new type of gas energy-storage technology. This paper used the orthogonal method and variance analysis to assess the significance of the factors affecting the thermodynamic characteristics of the SC-CCES system, and identified the significant factors and interactions in the energy-storage process, the energy-release process and the whole energy-storage system. The results show that interactions among the components have little influence on the energy-storage process, the energy-release process and the whole energy-storage system; the significant factors lie mainly in the characteristics of the individual system components. These findings provide a reference for optimizing the thermal properties of the energy-storage system. PMID:29634742

  16. Fabrication Process for Large Size Mold and Alignment Method for Nanoimprint System

    NASA Astrophysics Data System (ADS)

    Ishibashi, Kentaro; Kokubo, Mitsunori; Goto, Hiroshi; Mizuno, Jun; Shoji, Shuichi

    Nanoimprint technology is considered one of the mass-production methods for displays in cellular phones and notebook computers, for example with anti-reflection structure (ARS) patterns. In this case, a large-size mold with a nanometer-order pattern is very important. We therefore describe the fabrication process for a large-size mold and the alignment method for a UV nanoimprint system. We developed an original mold fabrication process using nanoimprint and etching techniques. In a 66 × 45 mm2 area, 200 nm period seamless patterns were formed using this process. We also constructed an original alignment system consisting of a CCD-camera system, an X-Y-θ table, a moiré-fringe method, and an image processing system, because the accuracy of pattern connection depends on the alignment method. The accuracy of this alignment system was within 20 nm.

  17. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work-process-related data and to make the necessary decisions to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system are discussed, including technologies which would need to be designed, prototyped and evaluated.

  18. WASTE TREATMENT BUILDING SYSTEM DESCRIPTION DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    F. Habashi

    2000-06-22

    The Waste Treatment Building System provides the space, layout, structures, and embedded subsystems that support the processing of low-level liquid and solid radioactive waste generated within the Monitored Geologic Repository (MGR). The activities conducted in the Waste Treatment Building include sorting, volume reduction, and packaging of dry waste, and collecting, processing, solidification, and packaging of liquid waste. The Waste Treatment Building System is located on the surface within the protected area of the MGR. The Waste Treatment Building System helps maintain a suitable environment for the waste processing and protects the systems within the Waste Treatment Building (WTB) from most of the natural and induced environments. The WTB also confines contaminants and provides radiological protection to personnel. In addition to the waste processing operations, the Waste Treatment Building System provides space and layout for staging of packaged waste for shipment, industrial and radiological safety systems, control and monitoring of operations, safeguards and security systems, and fire protection, ventilation and utilities systems. The Waste Treatment Building System also provides the required space and layout for maintenance activities, tool storage, and administrative facilities. The Waste Treatment Building System integrates waste processing systems within its protective structure to support the throughput rates established for the MGR. The Waste Treatment Building System also provides shielding, layout, and other design features to help limit personnel radiation exposures to levels which are as low as is reasonably achievable (ALARA). The Waste Treatment Building System interfaces with the Site Generated Radiological Waste Handling System, and with other MGR systems that support the waste processing operations. The Waste Treatment Building System interfaces with the General Site Transportation System, Site Communications System, Site Water System, MGR Site Layout, Safeguards and Security System, Site Radiological Monitoring System, Site Electrical Power System, Site Compressed Air System, and Waste Treatment Building Ventilation System.

  19. Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems

    NASA Technical Reports Server (NTRS)

    Allada, Rama Kumar; Lange, Kevin; Anderson, Molly

    2011-01-01

    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic chemical process simulations for primary processor technologies including the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, and the Wiped-Film Rotating Disk (WFRD), as well as post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA), developed using the Aspen Custom Modeler and Aspen Plus process simulation tools. The results expand upon previous work on water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  20. In-situ materials processing systems and bioregenerative life support systems interrelationships

    NASA Technical Reports Server (NTRS)

    Mignon, George V.; Frye, Robert J.

    1992-01-01

    The synergy and linkages between bioregenerative life support systems and the materials produced by in-situ materials processing systems were investigated. Such systems produce a broad spectrum of byproducts such as oxygen, hydrogen, processed soil material, ceramics, refractories, and other materials. Some of these materials may be utilized by bioregenerative systems either directly or with minor modifications. The main focus of this project was to investigate how these materials can be utilized to assist a bioregenerative life support system. Clearly, the need to provide a sustainable bioregenerative life support system for long-term human habitation of space is significant.

  1. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component, so that bottlenecks in one phase of the process do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
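    A quick back-of-envelope check of the quoted figures, under the assumptions that MB means 10^6 bytes, each 24-bit sample occupies 3 bytes, and the bandwidth is shared evenly across channels:

      # Implied per-channel sample rate for 275 MB/s spread over 672 channels.
      # All conversion assumptions are noted above; none come from the paper.
      aggregate_bytes_per_s = 275e6
      channels = 672
      bytes_per_sample = 3          # one 24-bit ADC sample

      per_channel_bytes = aggregate_bytes_per_s / channels        # ~409 kB/s
      per_channel_samples = per_channel_bytes / bytes_per_sample  # ~136 kS/s
      print(f"{per_channel_bytes/1e3:.0f} kB/s -> {per_channel_samples/1e3:.0f} kS/s per channel")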

  2. Implementation of Integrated System Fault Management Capability

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Schmalzel, John; Morris, Jon; Smith, Harvey; Turowski, Mark

    2008-01-01

    Fault management supports the rocket engine test mission with highly reliable and accurate measurements while improving availability and lifecycle costs. Core elements: an architecture, taxonomy, and ontology (ATO) for DIaK management; intelligent sensor processes; intelligent element processes; intelligent controllers; intelligent subsystem processes; intelligent system processes; and intelligent component processes.

  3. Modeling and Advanced Control for Sustainable Process Systems

    EPA Science Inventory

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-insp...

  4. Process for predicting structural performance of mechanical systems

    DOEpatents

    Gardner, D.R.; Hendrickson, B.A.; Plimpton, S.J.; Attaway, S.W.; Heinstein, M.W.; Vaughan, C.T.

    1998-05-19

    A process for predicting the structural performance of a mechanical system represents the mechanical system by a plurality of surface elements. The surface elements are grouped according to their location in the volume occupied by the mechanical system so that contacts between surface elements can be efficiently located. The process is well suited for efficient practice on multiprocessor computers. 12 figs.
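    The grouping-by-location idea in this abstract is, in spirit, a uniform-grid spatial binning. The sketch below shows that generic technique, not the patented process; the cell size and element coordinates are arbitrary examples.

      # Hedged sketch of spatial binning for contact search: only elements in
      # the same or neighboring grid cells are tested as contact candidates.
      from collections import defaultdict
      from itertools import product

      def build_bins(centers, cell_size):
          bins = defaultdict(list)
          for idx, (x, y, z) in enumerate(centers):
              key = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
              bins[key].append(idx)
          return bins

      def candidate_pairs(bins):
          pairs = set()
          for (i, j, k), members in bins.items():
              # gather every element in this cell and the 26 surrounding cells
              neighborhood = []
              for di, dj, dk in product((-1, 0, 1), repeat=3):
                  neighborhood.extend(bins.get((i + di, j + dj, k + dk), []))
              for a in members:
                  for b in neighborhood:
                      if a < b:
                          pairs.add((a, b))
          return pairs

      elements = [(0.1, 0.2, 0.0), (0.15, 0.22, 0.05), (5.0, 5.0, 5.0)]
      print(candidate_pairs(build_bins(elements, cell_size=0.5)))   # {(0, 1)}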

  5. A Systems Engineering Framework for Implementing a Security and Critical Patch Management Process in Diverse Environments (Academic Departments' Workstations)

    ERIC Educational Resources Information Center

    Mohammadi, Hadi

    2014-01-01

    Use of the Patch Vulnerability Management (PVM) process should be seriously considered for any networked computing system. The PVM process prevents the operating system (OS) and software applications from being attacked due to security vulnerabilities, which lead to system failures and critical data leakage. The purpose of this research is to…

  6. Self-conscious robotic system design process--from analysis to implementation.

    PubMed

    Chella, Antonio; Cossentino, Massimo; Seidita, Valeria

    2011-01-01

    Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that need ad hoc software engineering techniques for their modelling, analysis and implementation. In this chapter the whole process (from analysis to implementation) used to model the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.

  7. Space vehicle electrical power processing distribution and control study. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Krausz, A.

    1972-01-01

    A concept for the processing, distribution, and control of electric power for manned space vehicles and future aircraft is presented. Emphasis is placed on the requirements of the space station and space shuttle configurations. The systems involved are referred to as the processing distribution and control system (PDCS), electrical power system (EPS), and electric power generation system (EPGS).

  8. Business Process Aware IS Change Management in SMEs

    NASA Astrophysics Data System (ADS)

    Makna, Janis

    Changes in the business process usually require changes in the computer-supported information system and, vice versa, changes in the information system almost always cause at least some changes in the business process. In many situations it is not even possible to detect which of those changes are causes and which of them are effects. Nevertheless, it is possible to identify a set of changes that usually happen when one of the elements of the set changes its state. These sets of changes may be used as patterns for situation analysis to anticipate the full range of activities to be performed to get the business process and/or information system back to a stable state after it is lost because of changes in one of the elements. Knowledge about the change pattern gives an opportunity to manage changes of information systems even if business process models and information systems architecture are not neatly documented, as is the case in many SMEs. Using change patterns it is possible to know whether changes in information systems are to be expected and how changes in information system activities, data and users will impact different aspects of the business process supported by the information system.

  9. A web-based computer aided system for liver surgery planning: initial implementation on RayPlus

    NASA Astrophysics Data System (ADS)

    Luo, Ming; Yuan, Rong; Sun, Zhi; Li, Tianhong; Xie, Qingguo

    2016-03-01

    At present, computer aided systems for liver surgery design and risk evaluation are widely used in clinical practice all over the world. However, most systems are local applications that run on high-performance workstations, and the images have to be processed offline. Compared with local applications, a web-based system is accessible anywhere and on a range of devices, regardless of their processing power or operating system. RayPlus (http://rayplus.life.hust.edu.cn), a B/S platform for medical image processing, was developed to give a jump start on web-based medical image processing. In this paper, we implement a computer aided system for liver surgery planning on the architecture of RayPlus. The system consists of a series of processing steps applied to CT images, including filtering, segmentation, visualization and analysis. Each processing step is packaged into an executable program and runs on the server side. CT images in DICOM format are processed step by step to support interactive modeling in the browser, with zero installation and server-side computing. The system allows users to semi-automatically segment the liver, intrahepatic vessels and tumor from the pre-processed images. Then, surface and volume models are built to analyze the vessel structure and the relative position between adjacent organs. The results show that the initial implementation meets its first-order objectives satisfactorily and provides an accurate 3D delineation of the liver anatomy. Vessel labeling and resection simulation are planned as future additions. The system is available on the Internet at the link mentioned above, and an open username for testing is offered.

  10. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
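    To make the propagation idea concrete, here is a minimal sketch of a functional fault model represented as a directed graph, with failure effects propagated by breadth-first search to nodes that can observe them. The node names and graph structure are hypothetical and do not come from the NASA tooling described above.

      # Minimal sketch (assumed representation, not NASA's tools): failure
      # effects propagate downstream from a failure-mode node; diagnosis asks
      # which observation points can see the effect.
      from collections import deque

      # edges: effect propagates from key node to each listed node
      effects = {
          "valve_stuck_closed": ["low_line_pressure"],
          "low_line_pressure": ["pressure_sensor_P1", "low_engine_thrust"],
          "low_engine_thrust": ["thrust_sensor_T3"],
      }

      def reachable_observations(start, graph, observers):
          seen, queue = {start}, deque([start])
          while queue:
              node = queue.popleft()
              for nxt in graph.get(node, []):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return sorted(seen & observers)

      sensors = {"pressure_sensor_P1", "thrust_sensor_T3"}
      print(reachable_observations("valve_stuck_closed", effects, sensors))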

  11. Monitoring Satellite Data Ingest and Processing for the Atmosphere Science Investigator-led Processing Systems (SIPS)

    NASA Astrophysics Data System (ADS)

    Witt, J.; Gumley, L.; Braun, J.; Dutcher, S.; Flynn, B.

    2017-12-01

    The Atmosphere SIPS (Science Investigator-led Processing Systems) team at the Space Science and Engineering Center (SSEC), which is funded through a NASA contract, creates Level 2 cloud and aerosol products from the VIIRS instrument aboard the S-NPP satellite. In order to monitor the ingest and processing of files, we have developed an extensive monitoring system to observe every step in the process. The status grid is used for real-time monitoring and shows the current state of the system, including what files we have and whether or not we are meeting our latency requirements. Our snapshot tool displays the state of the system in the past. It displays which files were available at a given hour and is used for historical and backtracking purposes. In addition to these grid-like tools we have created histograms and other statistical graphs for tracking processing and ingest metrics, such as total processing time, job queue time, and latency statistics.
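    The latency statistics mentioned above reduce to simple bookkeeping over per-granule timestamps. The sketch below illustrates that kind of calculation; the field names, timestamps, and the three-hour requirement are assumptions for illustration, not values from the Atmosphere SIPS.

      # Sketch of latency bookkeeping from observation time to product-ready
      # time; all data and the requirement threshold are hypothetical.
      from datetime import datetime, timedelta

      granules = [
          {"observed": "2017-07-01T00:00", "product_ready": "2017-07-01T02:10"},
          {"observed": "2017-07-01T00:06", "product_ready": "2017-07-01T02:45"},
          {"observed": "2017-07-01T00:12", "product_ready": "2017-07-01T01:55"},
      ]
      fmt = "%Y-%m-%dT%H:%M"
      latencies = [
          datetime.strptime(g["product_ready"], fmt) - datetime.strptime(g["observed"], fmt)
          for g in granules
      ]
      requirement = timedelta(hours=3)
      mean_latency = sum(latencies, timedelta()) / len(latencies)
      worst = max(latencies)
      met = sum(lat <= requirement for lat in latencies)
      print(f"mean {mean_latency}, worst {worst}, {met}/{len(latencies)} within requirement")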

  12. Manufacturing process and material selection in concurrent collaborative design of MEMS devices

    NASA Astrophysics Data System (ADS)

    Zha, Xuan F.; Du, H.

    2003-09-01

    In this paper we present a knowledge-intensive approach and system for selecting suitable manufacturing processes and materials for microelectromechanical systems (MEMS) devices in a concurrent collaborative design environment. Fundamental issues in MEMS manufacturing process and material selection, such as the concurrent design framework, manufacturing process and material hierarchies, and the selection strategy, are first addressed. Then, a fuzzy decision support scheme for a multi-criteria decision-making problem is proposed for estimating, ranking and selecting possible manufacturing processes, materials and their combinations. A Web-based prototype advisory system for MEMS manufacturing process and material selection, WebMEMS-MASS, is developed based on the client-knowledge server architecture and framework to help the designer find good processes and materials for MEMS devices. The system, as one of the important parts of an advanced simulation and modeling tool for MEMS design, is a concept-level process and material selection tool, which can be used as a standalone application or a Java applet via the Web. The running sessions of the system are inter-linked with webpages of tutorials and reference pages to explain the facets, fabrication processes and material choices, and the calculations and reasoning in selection are performed using process capability and material property data from a remote Web-based database and an interactive knowledge base that can be maintained and updated via the Internet. The use of the developed system, including an operation scenario, user support, and integration with a MEMS collaborative design system, is presented. Finally, an illustrative example is provided.
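    As a simplified stand-in for the fuzzy multi-criteria scheme described above, the sketch below ranks hypothetical (process, material) combinations by a crisp weighted score; the criteria, weights, candidates, and scores are invented for illustration and do not come from WebMEMS-MASS.

      # Simplified multi-criteria ranking (crisp weighted scores rather than a
      # fuzzy scheme); all names and numbers are illustrative assumptions.
      criteria = ["feature size", "cost", "material compatibility"]
      weights  = [0.5, 0.2, 0.3]                     # must sum to 1

      # candidate (process, material) combinations scored 0..1 per criterion
      candidates = {
          ("DRIE", "silicon"):                        [0.9, 0.4, 0.8],
          ("LIGA", "nickel"):                         [0.8, 0.3, 0.6],
          ("surface micromachining", "polysilicon"):  [0.7, 0.7, 0.9],
      }

      ranked = sorted(
          candidates.items(),
          key=lambda kv: sum(w * s for w, s in zip(weights, kv[1])),
          reverse=True,
      )
      for (process, material), scores in ranked:
          total = sum(w * s for w, s in zip(weights, scores))
          print(f"{process:>24s} / {material:<12s} score = {total:.2f}")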

  13. Animal models and conserved processes

    PubMed Central

    2012-01-01

    Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation when the trait or response being studied is located at higher levels of organization, is in a different module, or is influenced by other modules. However, when the examination of the conserved process occurs at the same level of organization or in the same module, and hence is subject to study solely by reductionism, then extrapolation is possible. PMID:22963674

  14. An array processing system for lunar geochemical and geophysical data

    NASA Technical Reports Server (NTRS)

    Eliason, E. M.; Soderblom, L. A.

    1977-01-01

    A computerized array processing system has been developed to reduce, analyze, display, and correlate a large number of orbital and earth-based geochemical, geophysical, and geological measurements of the moon on a global scale. The system supports the activities of a consortium of about 30 lunar scientists involved in data synthesis studies. The system was modeled after standard digital image-processing techniques but differs in that processing is performed with floating point precision rather than integer precision. Because of flexibility in floating-point image processing, a series of techniques that are impossible or cumbersome in conventional integer processing were developed to perform optimum interpolation and smoothing of data. Recently color maps of about 25 lunar geophysical and geochemical variables have been generated.

  15. Evaluation and comparison of alternative designs for water/solid-waste processing systems for spacecraft

    NASA Technical Reports Server (NTRS)

    Spurlock, J. M.

    1975-01-01

    Promising candidate designs currently being considered for the management of spacecraft solid waste and waste-water materials were assessed. The candidate processes were: (1) the radioisotope thermal energy evaporation/incinerator process; (2) the dry incineration process; and (3) the wet oxidation process. The types of spacecraft waste materials that were included in the base-line computational input to the candidate systems were feces, urine residues, trash and waste-water concentrates. The performance characteristics and system requirements for each candidate process to handle this input and produce the specified acceptable output (i.e., potable water, a storable dry ash, and vapor phase products that can be handled by a spacecraft atmosphere control system) were estimated and compared. Recommendations are presented.

  16. Application of Business Process Management to drive the deployment of a speech recognition system in a healthcare organization.

    PubMed

    González Sánchez, María José; Framiñán Torres, José Manuel; Parra Calderón, Carlos Luis; Del Río Ortega, Juan Antonio; Vigil Martín, Eduardo; Nieto Cervera, Jaime

    2008-01-01

    We present a methodology based on Business Process Management to guide the development of a speech recognition system in a hospital in Spain. The methodology eases the deployment of the system by 1) involving the clinical staff in the process, 2) providing the IT professionals with a description of the process and its requirements, 3) assessing the advantages and disadvantages of the speech recognition system, as well as its impact on the organisation, and 4) helping to reorganise the healthcare process before implementing the new technology, in order to identify how it can better contribute to the overall objective of the organisation.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goins, Bobby

    A systems based approach will be used to evaluate the nitrogen delivery process. This approach involves principles found in Lean, Reliability, Systems Thinking, and Requirements. This unique combination of principles and thought process yields a very in depth look into the system to which it is applied. By applying a systems based approach to the nitrogen delivery process there should be improvements in cycle time, efficiency, and a reduction in the required number of personnel needed to sustain the delivery process. This will in turn reduce the amount of demurrage charges that the site incurs. In addition there should be less frustration associated with the delivery process.

  18. [Development of whole process quality control and management system of traditional Chinese medicine decoction pieces based on traditional Chinese medicine quality tree].

    PubMed

    Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun

    2017-12-01

    The whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems engineering problem, involving the base environment, seeds and seedlings, harvesting, processing and other multiple steps, so accurate identification of the factors in the TCM production process that may induce quality risk, as well as reasonable quality control measures, is very important. At present, the concept of quality risk is mainly discussed in terms of management and regulations, and there is no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. A whole-process quality control and management system for TCM decoction pieces based on the TCM quality tree is proposed in this study. This system effectively combines the process analysis method of the TCM quality tree with quality risk management, and can help managers make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and makes it convenient for users to predict, evaluate and control the quality of TCM. In application, the whole-process quality control and management system for TCM decoction pieces can identify related quality factors such as the base environment, cultivation and pieces processing, extend and modify the existing scientific workflow according to a producer's own conditions, and provide different enterprises with their own quality systems to achieve personalized service. As a new quality management model, this work can provide a reference for improving the quality of Chinese medicine production and quality standardization. Copyright© by the Chinese Pharmaceutical Association.

  19. Energy-efficient hierarchical processing in the network of wireless intelligent sensors (WISE)

    NASA Astrophysics Data System (ADS)

    Raskovic, Dejan

    Sensor network nodes have benefited from technological advances in the field of wireless communication, processing, and power sources. However, the processing power of microcontrollers is often not sufficient to perform sophisticated processing, while the power requirements of digital signal processing boards or handheld computers are usually too demanding for prolonged system use. We are matching the intrinsic hierarchical nature of many digital signal-processing applications with the natural hierarchy in distributed wireless networks, and building the hierarchical system of wireless intelligent sensors. Our goal is to build a system that will exploit the hierarchical organization to optimize the power consumption and extend battery life for the given time and memory constraints, while providing real-time processing of sensor signals. In addition, we are designing our system to be able to adapt to the current state of the environment, by dynamically changing the algorithm through procedure replacement. This dissertation presents the analysis of hierarchical environment and methods for energy profiling used to evaluate different system design strategies, and to optimize time-effective and energy-efficient processing.

  20. 42 CFR 433.138 - Identifying liable third parties.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... processing and information retrieval system. Basic requirement—Development of an action plan. (1) If a State has a mechanized claims processing and information retrieval system approved by CMS under subpart C of... plan must be integrated with the mechanized claims processing and information retrieval system. (2) The...

  1. 42 CFR 433.138 - Identifying liable third parties.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... processing and information retrieval system. Basic requirement—Development of an action plan. (1) If a State has a mechanized claims processing and information retrieval system approved by CMS under subpart C of... plan must be integrated with the mechanized claims processing and information retrieval system. (2) The...

  2. 42 CFR 433.138 - Identifying liable third parties.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... processing and information retrieval system. Basic requirement—Development of an action plan. (1) If a State has a mechanized claims processing and information retrieval system approved by CMS under subpart C of... plan must be integrated with the mechanized claims processing and information retrieval system. (2) The...

  3. 40 CFR 420.15 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., shall be provided for process wastewaters from wet coke oven gas desulfurization systems, but only to... process wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems...

  4. 40 CFR 420.15 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., shall be provided for process wastewaters from wet coke oven gas desulfurization systems, but only to... process wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems...

  5. 40 CFR 420.15 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., shall be provided for process wastewaters from wet coke oven gas desulfurization systems, but only to... process wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems...

  6. 40 CFR 420.15 - Pretreatment standards for existing sources (PSES).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., shall be provided for process wastewaters from wet coke oven gas desulfurization systems, but only to... process wastewaters from other wet air pollution control systems (except those from coal charging and coke pushing emission controls), coal tar processing operations and coke plant groundwater remediation systems...

  7. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    NASA Astrophysics Data System (ADS)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions to develop their own energy supply sources for RH processing facilities are provided.

  8. Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules

    NASA Astrophysics Data System (ADS)

    Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix

    2009-02-01

    Laser diodes and diode laser bars are key components in high power semiconductor lasers and solid state laser systems. During manufacture, the assembly of the fast axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high volume production. In this paper the results of an intermediate milestone will be reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components as well as their interaction with each other were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far field and the near field intensity distributions. Feedback from these cameras processed by a powerful and efficient image processing algorithm control a five axis precision motion system to optimize the fast axis collimation of the laser beam. Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the algorithm of the adjustment as well as experimental results. A critical discussion of the results will close the talk.
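    The closed-loop adjustment described above can be pictured as an optimization over a measured beam-quality metric. The sketch below is a deliberately simplified one-axis hill-descent loop; the camera and motion-stage calls are placeholders (the real system uses two cameras and a five-axis stage), and the divergence function and its optimum are invented.

      # Conceptual sketch only: iteratively move one axis to minimize a measured
      # far-field divergence. The measurement function is a hypothetical stand-in
      # for feedback from the far-field camera.
      def measure_far_field_divergence(z_offset_um):
          # placeholder metric with a hypothetical optimum at +3.2 um
          return (z_offset_um - 3.2) ** 2 + 0.05

      def align_fac(step_um=2.0, tol_um=0.05):
          z = 0.0
          while step_um > tol_um:
              here = measure_far_field_divergence(z)
              if measure_far_field_divergence(z + step_um) < here:
                  z += step_um
              elif measure_far_field_divergence(z - step_um) < here:
                  z -= step_um
              else:
                  step_um /= 2.0        # optimum bracketed: refine the step
          return z

      print(f"best fast-axis offset ≈ {align_fac():.2f} um")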

  9. Caltrans WeatherShare Phase II System: An Application of Systems and Software Engineering Process to Project Development

    DOT National Transportation Integrated Search

    2009-08-25

    In cooperation with the California Department of Transportation, Montana State University's Western Transportation Institute has developed the WeatherShare Phase II system by applying Systems Engineering and Software Engineering processes. The system...

  10. Quality Function Deployment for Large Systems

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1992-01-01

    Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large scale systems. It links QFD to the system engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality which flushes out issues early to provide a high quality, low cost, and, hence, competitive product. A pre-QFD matrix linking customers to customer desires is described.

  11. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment. The experiment includes three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as image processing input are produced by this imaging system with those same parameters. The gathered optical sampled images with the tested imaging parameters are processed by three digital image processes: calibration pre-processing, lossy compression with different compression ratios, and image post-processing with different cores. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be found. The six sets of JND subjective assessment experimental data can be validated against each other. The main conclusions are: image post-processing can improve image quality; it can improve image quality even with lossy compression, although image quality at higher compression ratios improves less than at lower ratios; and with our image post-processing method, image quality is better when the camera MTF is within a small range.

  12. Customer-centered careflow modeling based on guidelines.

    PubMed

    Huang, Biqing; Zhu, Peng; Wu, Cheng

    2012-10-01

    In contemporary society, customer-centered health care, which stresses customer participation and long-term tailored care, is inevitably becoming a trend. Compared with the hospital or physician-centered healthcare process, the customer-centered healthcare process requires more knowledge and modeling such a process is extremely complex. Thus, building a care process model for a special customer is cost prohibitive. In addition, during the execution of a care process model, the information system should have flexibility to modify the model so that it adapts to changes in the healthcare process. Therefore, supporting the process in a flexible, cost-effective way is a key challenge for information technology. To meet this challenge, first, we analyze various kinds of knowledge used in process modeling, illustrate their characteristics, and detail their roles and effects in careflow modeling. Secondly, we propose a methodology to manage a lifecycle of the healthcare process modeling, with which models could be built gradually with convenience and efficiency. In this lifecycle, different levels of process models are established based on the kinds of knowledge involved, and the diffusion strategy of these process models is designed. Thirdly, architecture and prototype of the system supporting the process modeling and its lifecycle are given. This careflow system also considers the compatibility of legacy systems and authority problems. Finally, an example is provided to demonstrate implementation of the careflow system.

  13. An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems

    NASA Astrophysics Data System (ADS)

    Hieb, Jeffrey; Graham, James; Guan, Jian

    This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.

  14. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  15. Resource Management Scheme Based on Ubiquitous Data Analysis

    PubMed Central

    Lee, Heung Ki; Jung, Jaehee

    2014-01-01

    Resource management of the main memory and process handler is critical to enhancing the system performance of a web server. Owing to the transaction delay time that affects incoming requests from web clients, web server systems utilize several web processes to anticipate future requests. This procedure is able to decrease the web generation time because there are enough processes to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. Proper pregenerated process mechanisms are required for dealing with the clients' requests. Unfortunately, it is difficult to predict how many requests a web server system is going to receive. If a web server system builds too many web processes, it wastes a considerable amount of memory space, and thus performance is reduced. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, and accordingly, the web process management scheme consumes the least possible web transaction resources. In experiments, real web trace data were used to prove the improved performance of the proposed scheme. PMID:25197692
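    The adaptive idea in this abstract, predicting the incoming request rate and sizing the pre-generated process pool accordingly, can be sketched very simply. The smoothing rule, per-worker capacity, and limits below are assumptions for illustration, not the paper's scheme.

      # Toy sketch: size a pre-forked worker pool from a predicted request rate.
      # The EWMA predictor and capacity numbers are illustrative assumptions.
      def predict_next(rates, alpha=0.5):
          # exponentially weighted moving average over recent requests/second
          estimate = rates[0]
          for r in rates[1:]:
              estimate = alpha * r + (1 - alpha) * estimate
          return estimate

      def workers_needed(predicted_rps, per_worker_rps=25, spare=2, max_workers=256):
          need = int(predicted_rps / per_worker_rps) + 1 + spare
          return min(need, max_workers)

      recent = [120, 180, 240, 300, 280]        # requests/second from the access log
      rps = predict_next(recent)
      print(f"predicted {rps:.0f} req/s -> keep {workers_needed(rps)} worker processes")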

  16. A framework of knowledge creation processes in participatory simulation of hospital work systems.

    PubMed

    Andersen, Simone Nyholm; Broberg, Ole

    2017-04-01

    Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.

  17. Application of a digital high-speed camera and image processing system for investigations of short-term hypersonic fluids

    NASA Astrophysics Data System (ADS)

    Renken, Hartmut; Oelze, Holger W.; Rath, Hans J.

    1998-04-01

    The design and application of a digital high-speed image data capturing system with a subsequent image processing system, applied to the Bremer Hochschul-Hyperschallkanal (BHHK), is the content of this presentation. It is also the result of the cooperation between the aerodynamics and image processing departments at the ZARM institute at the Drop Tower of Bremen. Similar systems are used by the combustion working group at ZARM and other external project partners. The BHHK, the camera and image storage system, and the personal-computer-based image processing software are described next. Some examples of images taken at the BHHK are shown to illustrate the application. The new and very user-friendly 32-bit Windows system is capable of capturing all camera data with a maximum pixel clock of 43 MHz and of processing complete sequences of images in one step using only one comfortable program.

  18. Data systems trade studies for a next generation sensor

    NASA Astrophysics Data System (ADS)

    Masuoka, Edward J.; Fleig, Albert J.

    1997-01-01

    Processing system designers must make substantial changes to accommodate current and anticipated improvements in remote sensing instruments. Increases in spectral, radiometric and geometric resolution lead to data rates, processing loads and storage volumes which far exceed the ability of most current computer systems. To meet user expectations, the data must be processed and made available quickly in a convenient and easy-to-use form. This paper describes design trade-offs made in developing the processing system for the Moderate Resolution Imaging Spectroradiometer, MODIS, which will fly on the Earth Observing System's AM-1 spacecraft, to be launched in 1998. MODIS will have an average continuous data rate of 6.2 Mbps and require processing at 6.5 GFLOPS to produce 600 GB of output products per day. Specific trade-offs occur in the areas of science software portability and usability of science products versus overall system performance and throughput.
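    For scale, a quick arithmetic check of the input side (assuming Mbps means 10^6 bits per second and GB means 10^9 bytes): the 6.2 Mbps average rate corresponds to roughly 67 GB of raw instrument data per day, which the system expands into the roughly 600 GB/day of output products quoted above.

      # Daily input volume implied by a 6.2 Mbps average instrument rate.
      # Unit conventions (decimal Mbps and GB) are assumptions noted above.
      rate_bps = 6.2e6
      seconds_per_day = 86_400
      input_gb_per_day = rate_bps * seconds_per_day / 8 / 1e9
      print(f"~{input_gb_per_day:.0f} GB/day of raw instrument data")   # ~67 GB/day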

  19. Measurement-based reliability/performability models

    NASA Technical Reports Server (NTRS)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
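    The modeling distinction drawn above, semi-Markov rather than Markov, amounts to letting each state carry its own, not necessarily exponential, holding-time distribution. The sketch below simulates a two-state semi-Markov alternation with illustrative Weibull and lognormal holding times; the states, distributions, and parameters are assumptions, not the measured IBM 3081 data.

      # Illustrative sketch only: a two-state semi-Markov chain whose holding
      # times are drawn from non-exponential distributions (the distributions
      # and parameters are invented for this example).
      import random

      def sample_holding(state):
          if state == "normal":
              return random.weibullvariate(100.0, 0.7)   # heavy-tailed up times
          return random.lognormvariate(1.0, 0.8)         # recovery durations

      def simulate(total_time=10_000.0):
          t, state = 0.0, "normal"
          time_in = {"normal": 0.0, "error": 0.0}
          while t < total_time:
              dwell = sample_holding(state)
              time_in[state] += dwell
              t += dwell
              state = "error" if state == "normal" else "normal"
          return time_in

      random.seed(1)
      usage = simulate()
      total = sum(usage.values())
      print({s: round(v / total, 3) for s, v in usage.items()})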

  20. Status of the Body of Knowledge and Curriculum to Advance Systems Engineering (BKCASE (trademark)) Project

    DTIC Science & Technology

    2011-10-01

    Systems engineering knowledge has also been documented through the standards bodies, most notably: • ISO/IEC/IEEE 15288, Systems Engineering...System Life Cycle Processes, 2008 (see [10]). • ANSI/EIA 632, Processes for Engineering a System (1998) • IEEE 1220, ISO/IEC 26702 Application...tion • United States Defense Acquisition Guidebook, Chapter 4, June 27, 2011 • IEEE/EIA 12207, Software Life Cycle Processes, 2008 • United

  1. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

    Design Assistant Workstation (DAWN) software is being developed to facilitate simulation of qualitative and quantitative aspects of the behavior of a life-support system in a spacecraft, a chemical-processing plant, the heating and cooling system of a large building, or any of a variety of systems including interacting process streams and processes. It is used to analyze alternative design scenarios or specific designs of such systems. The expert system will automate part of the design analysis: it will reason independently by simulating design scenarios and return to the designer with overall evaluations and recommendations.

  2. A Study of Novice Systems Analysis Problem Solving Behaviors Using Protocol Analysis

    DTIC Science & Technology

    1992-09-01

    conducted. Each subject was given the same task to perform. The task involved a case study (Appendix B) of a utility company’s customer order processing system...behavior (Ramesh, 1989). The task was to design a customer order processing system that utilized a centralized telephone answering service center...of the utility company’s customer order processing system that was developed based on information obtained by a large systems consulting firm during

  3. Information Processing in Cognition Process and New Artificial Intelligent Systems

    NASA Astrophysics Data System (ADS)

    Zheng, Nanning; Xue, Jianru

    In this chapter, we discuss, in depth, visual information processing and a new artificial intelligent (AI) system that is based upon cognitive mechanisms. The relationship between a general model of intelligent systems and cognitive mechanisms is described, and in particular we explore visual information processing with selective attention. We also discuss a methodology for studying the new AI system and propose some important basic research issues that have emerged in the intersecting fields of cognitive science and information science. To this end, a new scheme for associative memory and a new architecture for an AI system with attractors of chaos are addressed.

  4. System Constellations as a Tool Supporting Organisational Learning and Change Processes

    ERIC Educational Resources Information Center

    Birkenkrahe, Marcus

    2008-01-01

    Originally developed in the context of family therapy, system constellations are introduced using an organisational learning and system theoretical framework. Constellations are systemic group interventions using a spatial representation of the system elements. They correspond to deutero-learning processes and use higher-order systemic thinking.…

  5. 21 CFR 111.110 - What quality control operations are required for laboratory operations associated with the...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... laboratory operations associated with the production and process control system? 111.110 Section 111.110 Food... OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control... production and process control system? Quality control operations for laboratory operations associated with...

  6. State Estimation for Linear Systems Driven Simultaneously by Wiener and Poisson Processes.

    DTIC Science & Technology

    1978-12-01

    The state estimation problem of linear stochastic systems driven simultaneously by Wiener and Poisson processes is considered, especially the case...where the incident intensities of the Poisson processes are low and the system is observed in an additive white Gaussian noise. The minimum mean squared

  7. 42 CFR 431.834 - Access to records: Claims processing assessment systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Access to records: Claims processing assessment systems. 431.834 Section 431.834 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.834...

  8. 42 CFR 431.834 - Access to records: Claims processing assessment systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Access to records: Claims processing assessment systems. 431.834 Section 431.834 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.834...

  9. Process for Managing and Customizing HPC Operating Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, David ML

    2014-04-02

    A process for maintaining a custom HPC operating system was developed at the Environmental Molecular Sciences Laboratory (EMSL) over the past ten years. This process is generic and flexible enough to manage continuous change and keep systems updated, while managing communication through well-defined pieces of software.

  10. 76 FR 70833 - National Emission Standards for Hazardous Air Pollutant Emissions for Primary Lead Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... Classification System. \\2\\ Maximum Achievable Control Technology. Table 2 is not intended to be exhaustive, but..., methods, systems, or techniques that reduce the volume of or eliminate HAP emissions through process changes, substitution of materials, or other modifications; enclose systems or processes to eliminate...

  11. 76 FR 16263 - Revision to Electric Reliability Organization Definition of Bulk Electric System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ...'s Reliability Standards Development Process, to revise its definition of the term ``bulk electric... definition of ``bulk electric system'' through the NERC Standards Development Process to address the... undertake the process of revising the bulk electric system definition to address the Commission's concerns...

  12. 76 FR 55944 - In the Matter of Certain Electronic Devices With Image Processing Systems, Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-09

    ... With Image Processing Systems, Components Thereof, and Associated Software; Notice of Commission... importation of certain electronic devices with image processing systems, components thereof, and associated... direct infringement is asserted and the accused article does not meet every limitation of the asserted...

  13. Lessons Learned in Process Reengineering at a Community College.

    ERIC Educational Resources Information Center

    Jaacks, Gayle E.; Kurtz, Michael

    1999-01-01

    Summarizes the successful reengineering of business processes to take full advantage of new functionality in a vendor system upgrade at Western Iowa Tech Community College. Suggests that to truly benefit from implementing new systems or major system upgrades, an institution must streamline processes, eliminate duplication of effort, and examine…

  14. Flat-plate solar-array project. Experimental process system development unit for producing semiconductor-grade silicon using the silane-to-silicon process

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The engineering design, fabrication, assembly, operation, economic analysis, and process support R and D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of the purchased equipment has been received and will be reshipped to the West Coast location. The Data Collection System is complete. In the area of melting/consolidation, melting and shotting on a pseudocontinuous basis is demonstrated for the system using silicon powder transfer. It is proposed to continue the very promising fluid bed work.

  15. Materials requirements for optical processing and computing devices

    NASA Technical Reports Server (NTRS)

    Tanguay, A. R., Jr.

    1985-01-01

    Devices for optical processing and computing systems are discussed, with emphasis on the materials requirements imposed by functional constraints. Generalized optical processing and computing systems are described in order to identify principal categories of requisite components for complete system implementation. Three principal device categories are selected for analysis in some detail: spatial light modulators, volume holographic optical elements, and bistable optical devices. The implications for optical processing and computing systems of the materials requirements identified for these device categories are described, and directions for future research are proposed.

  16. Landsat 7 Science Data Processing: An Overview

    NASA Technical Reports Server (NTRS)

    Schweiss, Robert J.; Daniel, Nathaniel E.; Derrick, Deborah K.

    2000-01-01

    The Landsat 7 Science Data Processing System, developed by NASA for the Landsat 7 Project, provides the science data handling infrastructure used at the Earth Resources Observation Systems (EROS) Data Center (EDC) Landsat Data Handling Facility (DHF) of the United States Department of Interior, United States Geological Survey (USGS) located in Sioux Falls, South Dakota. This paper presents an overview of the Landsat 7 Science Data Processing System and details of the design, architecture, concept of operation, and management aspects of systems used in the processing of the Landsat 7 Science Data.

  17. Data acquisition and processing system for the HT-6M tokamak fusion experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Y.T.; Liu, G.C.; Pang, J.Q.

    1987-08-01

    This paper describes a high-speed data acquisition and processing system which has been successfully operated on the HT-6M tokamak fusion experimental device. The system collects, archives and analyzes up to 512 kilobytes of data from each shot of the experiment. A shot lasts 50-150 milliseconds and occurs every 5-10 minutes. The system consists of two PDP-11/24 computer systems. One PDP-11/24 is used for real-time data taking and on-line data analysis. It is based upon five CAMAC crates organized into a parallel branch. Another PDP-11/24 is used for off-line data processing. Both data acquisition software RSX-DAS and data processing software RSX-DAP have modular, multi-tasking and concurrent processing features.

  18. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  19. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  20. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  1. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases with a geographical reference system that can be used to geolocate all database information. (e...

  2. Improvement of the Performance of an Electrocoagulation Process System Using Fuzzy Control of pH.

    PubMed

    Demirci, Yavuz; Pekel, Lutfiye Canan; Altinten, Ayla; Alpbaz, Mustafa

    2015-12-01

    The removal efficiencies of electrocoagulation (EC) systems are highly dependent on the initial value of pH. If an EC system has an acidic influent, the pH of the effluent increases during the treatment process; conversely, if such a system has an alkaline influent, the pH of the effluent decreases during the treatment process. Thus, changes in the pH of the wastewater affect the efficiency of the EC process. In this study, we investigated the dynamic effects of pH. To evaluate approaches for preventing increases in the pH of the system, the MATLAB/Simulink program was used to develop and evaluate an on-line computer-based system for pH control. The aim of this work was to study Proportional-Integral-Derivative (PID) control and fuzzy control of the pH of a real textile wastewater purification process using EC. The performances and dynamic behaviors of these two control systems were evaluated based on determinations of COD, colour, and turbidity removal efficiencies.
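
    The controllers in the record were implemented in MATLAB/Simulink; the sketch below is only a generic discrete PID loop for pH regulation in Python (the setpoint, gains, and the one-line stand-in for the EC cell dynamics are hypothetical), not the authors' controller or the fuzzy scheme.

      # Generic discrete PID loop for pH regulation (illustrative sketch only).
      def pid_step(error, state, kp=1.2, ki=0.05, kd=0.3, dt=1.0):
          """One PID update; 'state' carries the integral and previous error."""
          integral = state["integral"] + error * dt
          derivative = (error - state["prev_error"]) / dt
          state.update(integral=integral, prev_error=error)
          return kp * error + ki * integral + kd * derivative

      setpoint = 7.0                      # hypothetical target pH
      ph = 9.2                            # hypothetical alkaline influent
      state = {"integral": 0.0, "prev_error": 0.0}
      for _ in range(30):
          u = pid_step(setpoint - ph, state)
          ph += 0.05 * u                  # crude stand-in for EC cell pH response
      print(round(ph, 2))

    A fuzzy controller would replace pid_step with rule-based mappings from the pH error (and its rate of change) to the manipulated variable, which is the comparison the study reports.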

  3. Hydrothermal Gasification for Waste to Energy

    NASA Astrophysics Data System (ADS)

    Epps, Brenden; Laser, Mark; Choo, Yeunun

    2014-11-01

    Hydrothermal gasification is a promising technology for harvesting energy from waste streams. Applications range from straightforward waste-to-energy conversion (e.g. municipal waste processing, industrial waste processing), to water purification (e.g. oil spill cleanup, wastewater treatment), to biofuel energy systems (e.g. using algae as feedstock). Products of the gasification process are electricity, bottled syngas (H2 + CO), sequestered CO2, clean water, and inorganic solids; further chemical reactions can be used to create biofuels such as ethanol and biodiesel. We present a comparison of gasification system architectures, focusing on efficiency and economic performance metrics. Various system architectures are modeled computationally, using a model developed by the coauthors. The physical model tracks the mass of each chemical species, as well as energy conversions and transfers throughout the gasification process. The generic system model includes the feedstock, gasification reactor, heat recovery system, pressure reducing mechanical expanders, and electricity generation system. Sensitivity analysis of system performance to various process parameters is presented. A discussion of the key technological barriers and necessary innovations is also presented.

  4. Automated inspection of hot steel slabs

    DOEpatents

    Martin, R.J.

    1985-12-24

    The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes. 5 figs.
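
    The patent abstract describes running edge detection and intensity thresholding in parallel on the same image and keeping only the segmentation produced by both; a minimal NumPy sketch of that cross-validation idea follows (the gradient operator and both thresholds are hypothetical choices, not taken from the patent).

      # Cross-validate an edge map against an intensity threshold map and keep
      # only pixels flagged by both processes (sketch of the idea only).
      import numpy as np

      def segment(image, edge_thresh=0.4, intensity_thresh=0.7):
          gy, gx = np.gradient(image.astype(float))
          edges = np.hypot(gx, gy) > edge_thresh      # edge-detection process
          hot = image > intensity_thresh              # intensity-threshold process
          return edges & hot                          # retain only the agreement

      img = np.random.rand(64, 64)
      img[20:30, 20:30] += 1.0        # hypothetical bright "imperfection"
      mask = segment(img)
      print(mask.sum(), "pixels flagged by both processes")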

  5. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes were developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. The SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  6. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes have been developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  7. Automated inspection of hot steel slabs

    DOEpatents

    Martin, Ronald J.

    1985-01-01

    The disclosure relates to a real time digital image enhancement system for performing the image enhancement segmentation processing required for a real time automated system for detecting and classifying surface imperfections in hot steel slabs. The system provides for simultaneous execution of edge detection processing and intensity threshold processing in parallel on the same image data produced by a sensor device such as a scanning camera. The results of each process are utilized to validate the results of the other process and a resulting image is generated that contains only corresponding segmentation that is produced by both processes.

  8. Architectures Toward Reusable Science Data Systems

    NASA Technical Reports Server (NTRS)

    Moses, John Firor

    2014-01-01

    Science Data Systems (SDS) comprise an important class of data processing systems that support product generation from remote sensors and in-situ observations. These systems enable research into new science data products, replication of experiments and verification of results. NASA has been building systems for satellite data processing since the first Earth observing satellites launched and is continuing development of systems to support NASA science research and NOAA's Earth observing satellite operations. The basic data processing workflows and scenarios continue to be valid for remote sensor observations research as well as for the complex multi-instrument operational satellite data systems being built today.

  9. Supporting Reflective Activities in Information Seeking on the Web

    NASA Astrophysics Data System (ADS)

    Saito, Hitomi; Miwa, Kazuhisa

    Recently, many opportunities have emerged to use the Internet in daily life and classrooms. However, with the growth of the World Wide Web (Web), it is becoming increasingly difficult to find target information on the Internet. In this study, we explore a method for developing the ability of users in information seeking on the Web and construct a search process feedback system supporting reflective activities of information seeking on the Web. Reflection is defined as a cognitive activity for monitoring, evaluating, and modifying one's thinking and process. In the field of learning science, many researchers have investigated reflective activities that facilitate learners' problem solving and deep understanding. The characteristics of this system are: (1) to show learners' search processes on the Web as described, based on a cognitive schema, and (2) to prompt learners to reflect on their search processes. We expect that users of this system can reflect on their search processes by receiving information on their own search processes provided by the system, and that these types of reflective activity help them to deepen their understanding of information seeking activities. We have conducted an experiment to investigate the effects of our system. The experimental results confirmed that (1) the system actually facilitated the learners' reflective activities by providing process visualization and prompts, and (2) the learners who reflected on their search processes more actively understood their own search processes more deeply.

  10. Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data

    DTIC Science & Technology

    2017-01-01

    files, organized by location. The data were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis... (Report ERDC/CHL TR-17-2, Coastal Inlets Research Program, January 2017: Tidal Analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data, Brandan M. Scully, Coastal and...)
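
    The report states only that the data were processed with Python and Pandas; a minimal, hypothetical sketch of deriving vessel inter-arrival times from AIS position reports might look like the following (the column names and sample values are assumptions, not taken from the report).

      # Hypothetical sketch: estimate vessel inter-arrival times from AIS
      # position reports using pandas (columns and values are invented).
      import pandas as pd

      ais = pd.DataFrame({
          "mmsi": [111, 111, 222, 333, 222],
          "timestamp": pd.to_datetime([
              "2016-05-01 03:10", "2016-05-01 07:45", "2016-05-01 05:20",
              "2016-05-02 01:05", "2016-05-01 09:30"]),
      })
      # The first report per vessel approximates its arrival; successive
      # differences of the sorted arrivals give the inter-arrival times.
      first_seen = ais.sort_values("timestamp").groupby("mmsi")["timestamp"].first()
      interarrival = first_seen.sort_values().diff().dropna()
      print(interarrival.describe())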

  11. Theoretical test of Jarzynski's equality for reversible volume-switching processes of an ideal gas system.

    PubMed

    Sung, Jaeyoung

    2007-07-01

    We present an exact theoretical test of Jarzynski's equality (JE) for reversible volume-switching processes of an ideal gas system. The exact analysis shows that the prediction of JE for the free energy difference is the same as the work done on the gas system during the reversible process that is dependent on the shape of path of the reversible volume-switching process.
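
    For reference, the two standard results being compared in the abstract can be written as follows (textbook relations, not reproduced from the paper): Jarzynski's equality relates the exponential work average to the free-energy difference, and for an isothermal, reversible volume switch of an ideal gas from V_i to V_f the work done on the gas equals that free-energy difference,

      \left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},
      \qquad
      W_{\mathrm{rev}} = \Delta F = -N k_B T \ln\frac{V_f}{V_i},

    which is the consistency the exact analysis in the paper verifies.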

  12. Evaluation of microbial stability, bioactive compounds, physicochemical properties, and consumer acceptance of pomegranate juice processed in a commercial scale pulsed electric field system

    USDA-ARS?s Scientific Manuscript database

    This paper investigated the feasibility for pasteurizing raw pomegranate juice in a commercial scale pulsed electric field (PEF) system. The juice was processed in a commercial scale PEF processing system at 35 and 38 kV/cm for 281 µs at 55 degree C with a flow rate of 100 L/h. Effect of PEF process...

  13. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
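
    In the abstract's notation, the degenerate case can be restated compactly: for multinomial, ergodic processes the three functionals coincide with the Shannon-Boltzmann-Gibbs form,

      S_{\mathrm{EXT}} = S_{\mathrm{IT}} = S_{\mathrm{MEP}} = H(p) = -\sum_i p_i \log p_i ,

    whereas for the three concrete examples listed above the paper computes each functional explicitly and finds that they separate.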

  14. Process of activation of a palladium catalyst system

    DOEpatents

    Sobolevskiy, Anatoly [Orlando, FL; Rossin, Joseph A [Columbus, OH; Knapke, Michael J [Columbus, OH

    2011-08-02

    Improved processes for activating a catalyst system used for the reduction of nitrogen oxides are provided. In one embodiment, the catalyst system is activated by passing an activation gas stream having an amount of each of oxygen, water vapor, nitrogen oxides, and hydrogen over the catalyst system and increasing a temperature of the catalyst system to a temperature of at least 180 °C at a heating rate of from 1 to 20 °C/min. Use of activation processes described herein leads to a catalyst system with superior NOx reduction capabilities.

  15. System approach to modeling of industrial technologies

    NASA Astrophysics Data System (ADS)

    Toropov, V. S.; Toropov, E. S.

    2018-03-01

    The authors presented a system of methods for modeling and improving industrial technologies. The system consists of information and software. The information part is structured information about industrial technologies. The structure has its template. The template has several essential categories used to improve the technological process and eliminate weaknesses in the process chain. The base category is the physical effect that takes place when the technical process proceeds. The programming part of the system can apply various methods of creative search to the content stored in the information part of the system. These methods pay particular attention to energy transformations in the technological process. The system application will allow us to systematize the approach to improving technologies and obtaining new technical solutions.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washiya, Tadahiro; Komaki, Jun; Funasaka, Hideyuki

    Japan Atomic Energy Agency (JAEA) has been developing a new aqueous reprocessing system named 'NEXT' (New Extraction system for TRU recovery), which provides many advantages such as waste volume reduction, cost savings through advanced components, and simplification of process operation. Advanced head-end systems in the 'NEXT' process consist of a fuel disassembly system, a fuel shearing system, and a continuous dissolver system. We developed a reliable fuel disassembly system with an innovative procedure, and the short-length shearing system and continuous dissolver system can provide the highly concentrated dissolution needed to adapt to the uranium crystallization process. We have carried out experimental studies and fabricated engineering-scale test devices to confirm system performance. In this paper, research and development of the advanced head-end systems are described. (authors)

  17. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    PubMed

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system that is a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and showed usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces.
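
    The abstract describes control-flow specifications based on process algebra (ACP) stored in a relational repository; the toy Python sketch below only illustrates the general idea of declaring a workflow with sequential and alternative composition and checking an execution trace against it (the operators, class names, and test names are invented, not the CEGH implementation).

      # Toy rendering of ACP-style control flow: sequential and alternative
      # composition used to validate an execution trace (illustrative only).
      class Act:
          def __init__(self, name): self.name = name
          def traces(self): return [[self.name]]

      class Seq:
          def __init__(self, *steps): self.steps = steps
          def traces(self):
              out = [[]]
              for step in self.steps:
                  out = [t + u for t in out for u in step.traces()]
              return out

      class Alt:
          def __init__(self, *options): self.options = options
          def traces(self):
              return [t for o in self.options for t in o.traces()]

      workflow = Seq(Act("extract_dna"), Alt(Act("sequencing"), Act("mlpa")), Act("report"))
      print(["extract_dna", "mlpa", "report"] in workflow.traces())   # True: a valid trace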

  18. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems

    PubMed Central

    2011-01-01

    Background Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability achieved through relational database implementation, and 2) correctness of processes using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. Conclusions This paper presents the CEGH information system that is a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and showed usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces. PMID:22369688

  19. IEC 61511 and the capital project process--a protective management system approach.

    PubMed

    Summers, Angela E

    2006-03-17

    This year, the process industry has reached an important milestone in process safety-the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.

  20. Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE

    NASA Astrophysics Data System (ADS)

    Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.

    2016-08-01

    Introduction - Business processes are gradually becoming a tool for engaging employees at a new level and for making document management systems more efficient, and most of the published work concentrates on these directions. However, business processes are still poorly implemented in public institutions, where it is very difficult to formalize the main existing processes. We attempt to build a system of business processes for a state agency, the Russian Social Insurance Fund (SIF, hereafter FSS), where virtually all processes take different inputs to the same output: a public service. The parameters of the state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the FSS, formulates requirements for its business processes, justifies the choice of software for business process modelling, creates working models in the Runa WFE system, and optimizes a model of one of the main business processes of the FSS. The result of the work in Runa WFE is an optimized model of this FSS business process.

  1. The Effectiveness of Full Day School System for Students’ Character Building

    NASA Astrophysics Data System (ADS)

    Benawa, A.; Peter, R.; Makmun, S.

    2018-01-01

    The study aims to show that the full day school programme delivered at Marsudirini Elementary School in Bogor is effective for students’ character building. The study focused on the implementation of the full day school system. The qualitative research method applied in the study is characteristic evaluation involving non-participant observation, interviews, and documentation analysis. The study concludes that the full day school system is significantly effective for elementary students’ character building. The full day school system embraced all relevant processes based on the character building standard. The synergy of comprehensive components in the instructional process at the full day school influenced the building of the students’ character effectively and efficiently. A relationship emerged between the instructional development process in the full day school system and the character building of the students. By developing the instructional process through a systemic and systematic process in the full day school system, the support of stakeholders (leaders, human resources, students, parents) and other components (learning resources, facilities, budget) ultimately provides a potent contribution to character building among the students.

  2. The Hyperspectral Imager for the Coastal Ocean (HICO): Sensor and Data Processing Overview

    DTIC Science & Technology

    2010-01-20

    backscattering coefficients, and others. Several of these software modules will be developed within the Automated Processing System (APS), a data... NRL developed APS, which processes satellite data into ocean color data products. APS is a collection of methods...used for ocean color processing which provide the tools for the automated processing of satellite imagery [1]. These tools are in the process of

  3. Laser metrology in food-related systems

    NASA Astrophysics Data System (ADS)

    Mendoza-Sanchez, Patricia; Lopez, Daniel; Kongraksawech, Teepakorn; Vazquez, Pedro; Torres, J. Antonio; Ramirez, Jose A.; Huerta-Ruelas, Jorge

    2005-02-01

    An optical system was developed using a low-cost semiconductor laser and commercial optical and electronic components to monitor food processes by measuring changes in the optical rotation (OR) of chiral compounds. The OR signal as a function of processing time and sample temperature was collected and recorded using a computer data acquisition system. The system has been tested on two different processes: sugar-protein interaction and the beer fermentation process. To study sugar-protein interaction, the sugars sorbitol, trehalose and sucrose were used, with Bovine Serum Albumin (BSA, A-7906 Sigma-Aldrich) as the protein. In some food processes, sugars are added to protect proteins from damage during processing, storage and/or distribution. Different sugar/protein solutions were prepared and heated above the critical temperature of protein denaturation; OR measurements were performed during heating and the effect of the different sugars on protein denaturation was measured. These measurements showed higher sensitivity than Differential Scanning Calorimetry, which needs higher protein concentrations to study such interactions. The brewing fermentation process was monitored in situ using the OR system and validated by correlation with specific density measurements and gas chromatography. The instrument can be implemented to monitor fermentation on-line, thereby determining the end of the process and optimizing process conditions in an industrial setting. The OR system has no moving parts and is more flexible than commercial polarimeters, allowing implementation in harsh environments; its high sensitivity signifies the potential of this method as an in-line technique for quality control in food processing and for experimentation with optically active solutions.

  4. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  5. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  6. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  7. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases with a common or coordinated reference system, that can be used to geolocate all database information...

  8. The DFVLR main department for central data processing, 1976 - 1983

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Data processing, equipment and systems operation, operative and user systems, user services, computer networks and communications, text processing, computer graphics, and high power computers are discussed.

  9. Automated information system for analysis and prediction of production situations in blast furnace plant

    NASA Astrophysics Data System (ADS)

    Lavrov, V. V.; Spirin, N. A.

    2016-09-01

    Advances in modern science and technology are inherently connected with the development, implementation, and widespread use of computer systems based on mathematical modeling. Algorithms and computer systems are gaining practical significance solving a range of process tasks in metallurgy of MES-level (Manufacturing Execution Systems - systems controlling industrial process) of modern automated information systems at the largest iron and steel enterprises in Russia. This fact determines the necessity to develop information-modeling systems based on mathematical models that will take into account the physics of the process, the basics of heat and mass exchange, the laws of energy conservation, and also the peculiarities of the impact of technological and standard characteristics of raw materials on the manufacturing process data. Special attention in this set of operations for metallurgic production is devoted to blast-furnace production, as it consumes the greatest amount of energy, up to 50% of the fuel used in ferrous metallurgy. The paper deals with the requirements, structure and architecture of BF Process Engineer's Automated Workstation (AWS), a computer decision support system of MES Level implemented in the ICS of the Blast Furnace Plant at Magnitogorsk Iron and Steel Works. It presents a brief description of main model subsystems as well as assumptions made in the process of mathematical modelling. Application of the developed system allows the engineering and process staff to analyze online production situations in the blast furnace plant, to solve a number of process tasks related to control of heat, gas dynamics and slag conditions of blast-furnace smelting as well as to calculate the optimal composition of blast-furnace slag, which eventually results in increasing technical and economic performance of blast-furnace production.

  10. Learner Performance Accounting: A Tri-Cycle Process

    ERIC Educational Resources Information Center

    Brown, Thomas C.; McCleary, Lloyd E.

    1973-01-01

    The Tri-Cycle Process described in the model permits for the first time an integrated system for designing an individualized instructional system that would permit a rational, diagnosis-prescription-evaluation system keyed to an accounting system. (Author)

  11. High Available COTS Based Computer for Space

    NASA Astrophysics Data System (ADS)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

    The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application's boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures that satisfy the availability and reliability targets as well as the increase in required data processing power. At the same time, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of the customer requirements, and reuse of available computer systems has not always been possible because of obsolescence of EEE parts, insufficient IO capabilities, or the fact that available data processing systems did not provide the required scalability and performance.
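
    As a point of reference for the availability factor discussed in the record, the standard steady-state availability relation (a textbook formula, not taken from the record) is

      A = \frac{\mathrm{MTBF}}{\mathrm{MTBF} + \mathrm{MTTR}} ,

    so, for example, a hypothetical unit with MTBF = 10,000 h and MTTR = 1 h has A = 10000/10001 ≈ 0.9999.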

  12. The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies

    NASA Technical Reports Server (NTRS)

    Mulqueen, Jack; Jones, David; Hopkins, Randy

    2011-01-01

    This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies including human space exploration missions, space transportation system studies and in-space science missions. The paper will describe the design team structure and specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis and subsystem analyses leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives from technology definition and requirements definition to preliminary design studies will be addressed. The paper will also describe the applicability of the collaborative engineering process to include an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.

  13. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process well set up. The critical dimension of components decreases following Moore's law, but at the same time the intra-wafer dispersion linked to the spatial non-uniformity of tool processes does not decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the waferfab to automatically adjust and tune wafer processing based on a large amount of process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to put spatial variability under control in real time through our SPC (Statistical Process Control) system. This paper will outline the architecture of an integrated process control system for 3D shape monitoring, implemented in the waferfab.
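
    The paper does not give its SPC implementation; a generic sketch of the underlying idea, flagging wafers whose intra-wafer dispersion drifts outside 3-sigma limits estimated from a baseline, is shown below (all measurement values, site counts, and limits are hypothetical).

      # Generic SPC sketch: flag wafers whose intra-wafer dispersion exceeds
      # 3-sigma control limits estimated from baseline wafers (values invented).
      import numpy as np

      baseline = np.random.normal(50.0, 0.4, size=(30, 9))   # 30 wafers x 9 CD sites
      spread = baseline.std(axis=1)                           # intra-wafer dispersion
      center, sigma = spread.mean(), spread.std()
      ucl, lcl = center + 3 * sigma, max(center - 3 * sigma, 0.0)

      new_wafer = np.random.normal(50.0, 0.9, size=9)         # hypothetical drifted wafer
      s = new_wafer.std()
      print("out of control" if not (lcl <= s <= ucl) else "in control", round(s, 3))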

  14. Method and system for environmentally adaptive fault tolerant computing

    NASA Technical Reports Server (NTRS)

    Copenhaver, Jason L. (Inventor); Jeremy, Ramos (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.
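
    The patent abstract only lists the steps (measure an environmental condition, assess the processor's sensitivity to it, and decide whether to reconfigure fault tolerance); the schematic sketch below renders that decision logic with hypothetical thresholds and redundancy modes that are not taken from the patent.

      # Schematic of the described decision: measure the environment, weigh the
      # system's sensitivity, and pick a fault-tolerance mode (values invented).
      def select_fault_tolerance(radiation_flux, sensitivity, threshold=100.0):
          """Return a redundancy mode based on measured flux scaled by sensitivity."""
          exposure = radiation_flux * sensitivity
          if exposure > 10 * threshold:
              return "triple-modular-redundancy"
          if exposure > threshold:
              return "duplex-with-compare"
          return "simplex"

      print(select_fault_tolerance(radiation_flux=250.0, sensitivity=0.8))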

  15. Process control systems at Homer City coal preparation plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shell, W.P.

    1983-03-01

    An important part of process control engineering is the implementation of the basic control system design through commissioning to routine operation. This is a period when basic concepts can be reviewed and improvements either implemented or recorded for application in future systems. The experience of commissioning the process control systems in the Homer City coal cleaning plant are described and discussed. The current level of operating control performance in individual sections and the overall system are also reported and discussed.

  16. Investigation of charge coupled device correlation techniques

    NASA Technical Reports Server (NTRS)

    Lampe, D. R.; Lin, H. C.; Shutt, T. J.

    1978-01-01

    Analog Charge Transfer Devices (CTD's) offer unique advantages to signal processing systems, which often have large development costs, making it desirable to define those devices which can be developed for general systems use. Such devices are best identified and developed early to give systems designers some interchangeable subsystem blocks, not requiring additional individual development for each new signal processing system. The objective of this work is to describe a discrete analog signal processing device with a reasonably broad system use and to implement its design, fabrication, and testing.

  17. A radar data processing and enhancement system

    NASA Technical Reports Server (NTRS)

    Anderson, K. F.; Wrin, J. W.; James, R.

    1986-01-01

    This report describes the space position data processing system of the NASA Western Aeronautical Test Range. The system is installed at the Dryden Flight Research Facility of NASA Ames Research Center. This operational radar data system (RADATS) provides simultaneous data processing for multiple data inputs and tracking and antenna pointing outputs while performing real-time monitoring, control, and data enhancement functions. Experience in support of the space shuttle and aeronautical flight research missions is described, as well as the automated calibration and configuration functions of the system.

  18. The COMPTEL Processing and Analysis Software system (COMPASS)

    NASA Astrophysics Data System (ADS)

    de Vries, C. P.; COMPTEL Collaboration

    The data analysis system of the gamma-ray Compton Telescope (COMPTEL) onboard the Compton-GRO spacecraft is described. A continuous stream of data of the order of 1 kbyte per second is generated by the instrument. The data processing and analysis software is built around a relational database management system (RDBMS) in order to be able to trace the heritage and processing status of all data in the processing pipeline. Four institutes cooperate in this effort, requiring procedures to keep local RDBMS contents identical between the sites and swift exchange of data using network facilities. Lately, there has been a gradual move of the system from central processing facilities towards clusters of workstations.
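
    The abstract says the heritage and processing status of all data are traced in an RDBMS; the toy sketch below only illustrates that kind of bookkeeping (the schema and values are invented, and SQLite merely stands in for whatever RDBMS COMPASS actually used).

      # Toy heritage/status bookkeeping in a relational database (illustrative).
      import sqlite3

      db = sqlite3.connect(":memory:")
      db.execute("""CREATE TABLE products (
          product_id TEXT PRIMARY KEY,
          parent_id  TEXT,      -- heritage link to the input product
          status     TEXT)""")
      db.execute("INSERT INTO products VALUES ('raw_000123', NULL, 'archived')")
      db.execute("INSERT INTO products VALUES ('lvl1_000123', 'raw_000123', 'processing')")
      for row in db.execute("SELECT product_id, parent_id, status FROM products"):
          print(row)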

  19. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    NASA Technical Reports Server (NTRS)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes in other applications that originated from the HRMS module without changing applications code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.

  20. System-wide hybrid MPC-PID control of a continuous pharmaceutical tablet manufacturing process via direct compaction.

    PubMed

    Singh, Ravendra; Ierapetritou, Marianthi; Ramachandran, Rohit

    2013-11-01

    The next generation of QbD based pharmaceutical products will be manufactured through continuous processing. This will allow the integration of online/inline monitoring tools, coupled with an efficient advanced model-based feedback control systems, to achieve precise control of process variables, so that the predefined product quality can be achieved consistently. The direct compaction process considered in this study is highly interactive and involves time delays for a number of process variables due to sensor placements, process equipment dimensions, and the flow characteristics of the solid material. A simple feedback regulatory control system (e.g., PI(D)) by itself may not be sufficient to achieve the tight process control that is mandated by regulatory authorities. The process presented herein comprises of coupled dynamics involving slow and fast responses, indicating the requirement of a hybrid control scheme such as a combined MPC-PID control scheme. In this manuscript, an efficient system-wide hybrid control strategy for an integrated continuous pharmaceutical tablet manufacturing process via direct compaction has been designed. The designed control system is a hybrid scheme of MPC-PID control. An effective controller parameter tuning strategy involving an ITAE method coupled with an optimization strategy has been used for tuning of both MPC and PID parameters. The designed hybrid control system has been implemented in a first-principles model-based flowsheet that was simulated in gPROMS (Process System Enterprise). Results demonstrate enhanced performance of critical quality attributes (CQAs) under the hybrid control scheme compared to only PID or MPC control schemes, illustrating the potential of a hybrid control scheme in improving pharmaceutical manufacturing operations. Copyright © 2013 Elsevier B.V. All rights reserved.
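
    The record tunes both the MPC and PID parameters with an ITAE criterion coupled to an optimizer; the ITAE cost itself has a standard definition, sketched below with a placeholder error trajectory (the decaying error signal is hypothetical and stands in for the gPROMS closed-loop simulation).

      # ITAE (integral of time-weighted absolute error) cost used for tuning;
      # the error trajectory here is a crude placeholder, not the plant model.
      import numpy as np

      def itae(times, errors):
          """Trapezoidal approximation of the integral of t * |e(t)| dt."""
          w = times * np.abs(errors)
          return float(np.sum(0.5 * (w[:-1] + w[1:]) * np.diff(times)))

      t = np.linspace(0.0, 60.0, 601)
      e = np.exp(-t / 10.0)              # hypothetical decaying setpoint error
      print(round(itae(t, e), 2))        # lower ITAE = faster, better-damped response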

  1. Containerless processing of undercooled melts

    NASA Technical Reports Server (NTRS)

    Perepezko, J. H.

    1993-01-01

    The investigation focused on the control of microstructural evolution in Mn-Al, Fe-Ni, Ni-V, and Au-Pb-Sb alloys through the high undercooling levels provided by containerless processing, and provided fundamental new information on the control of nucleation. Solidification analysis was conducted by means of thermal analysis, x-ray diffraction, and metallographic characterization on samples processed in a laboratory scale drop tube system. The Mn-Al alloy system offers a useful model system with the capability of phase separation on an individual particle basis, thus permitting a more complete understanding of the operative kinetics and the key containerless processing variables. This system provided the opportunity of analyzing the nucleation rate as a function of processing conditions and allowed for the quantitative assessment of the relevant processing parameters. These factors are essential in the development of a containerless processing model which has a predictive capability. Similarly, Ni-V is a model system that was used to study duplex partitionless solidification, which is a structure possible only in high under cooling solidification processes. Nucleation kinetics for the competing bcc and fcc phases were studied to determine how this structure can develop and the conditions under which it may occur. The Fe-Ni alloy system was studied to identify microstructural transitions with controlled variations in sample size and composition during containerless solidification. This work was forwarded to develop a microstructure map which delineates regimes of structural evolution and provides a unified analysis of experimental observations. The Au-Pb-Sb system was investigated to characterize the thermodynamic properties of the undercooled liquid phase and to characterize the glass transition under a variety of processing conditions. By analyzing key containerless processing parameters in a ground based drop tube study, a carefully designed flight experiment may be planned to utilize the extended duration microgravity conditions of orbiting spacecraft.

  2. A synthesis of drug reimbursement decision-making processes in organisation for economic co-operation and development countries.

    PubMed

    Barnieh, Lianne; Manns, Braden; Harris, Anthony; Blom, Marja; Donaldson, Cam; Klarenbach, Scott; Husereau, Don; Lorenzetti, Diane; Clement, Fiona

    2014-01-01

    The use of a restrictive formulary, with placement determined through a drug-reimbursement decision-making process, is one approach to managing drug expenditures. The objective was to describe the processes in drug reimbursement decision-making systems currently used in national publicly funded outpatient prescription drug insurance plans. By using the Organisation for Economic Co-operation and Development (OECD) nations as the sampling frame, a search was done in the published literature, followed by the gray literature. Collected data were verified by a system expert within the prescription drug insurance plan in each country to ensure the accuracy of key data elements across countries. All but one country provided at least one publicly funded prescription drug formulary. Many systems have adopted similar processes of drug reimbursement decision making. All but three systems required additional consideration of clinical evidence within the decision-making process. Transparency of recommendations varied between systems, from having no information publicly available (three systems) to all information available and accessible to the public (16 systems). Only four countries did not consider cost within the drug reimbursement decision-making process. There were similarities in the decision-making process for drug reimbursement across the systems; however, only five countries met the highest standard of transparency, requirement of evidence, and ability to appeal. Future work should focus on examining how these processes may affect formulary listing decisions for drugs between countries. © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by International Society for Pharmacoeconomics and Outcomes Research (ISPOR). All rights reserved.

  3. An integral design strategy combining optical system and image processing to obtain high resolution images

    NASA Astrophysics Data System (ADS)

    Wang, Jiaoyang; Wang, Lin; Yang, Ying; Gong, Rui; Shao, Xiaopeng; Liang, Chao; Xu, Jun

    2016-05-01

    In this paper, an integral design that combines the optical system with image processing is introduced to obtain high resolution images, and its performance is evaluated and demonstrated. Traditional imaging methods often separate the two technical procedures of optical system design and image processing, resulting in failures of efficient cooperation between the optical and digital elements. Therefore, an innovative approach is presented to combine the merit function during optical design with the constraint conditions of the image processing algorithms. Specifically, an optical imaging system with low resolution is designed to collect the image signals that are indispensable for image processing, while the ultimate goal is to obtain high resolution images from the final system. In order to optimize the global performance, the optimization function of the ZEMAX software is utilized and the number of optimization cycles is controlled. Then the Wiener filter algorithm is adopted to process the simulated images, and the mean squared error (MSE) is taken as the evaluation criterion. The results show that, although the optical figures of merit for the optical imaging system are not the best, it can provide image signals that are more suitable for image processing. In conclusion, the integral design of the optical system and image processing can find the overall optimal solution that is missed by traditional design methods. Especially when designing a complex optical system, this integral design strategy has obvious advantages: it simplifies the structure and reduces cost while obtaining high resolution images, giving it a promising perspective for industrial application.
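
    The paper's image-processing stage is a Wiener filter evaluated by MSE; a minimal frequency-domain Wiener deconvolution sketch is given below (the blur kernel, noise level, and test image are hypothetical, and the optical simulation itself is not reproduced).

      # Minimal frequency-domain Wiener deconvolution with MSE evaluation
      # (blur kernel, noise level, and test image are invented).
      import numpy as np

      def wiener_deconvolve(blurred, psf, nsr=0.01):
          """Wiener filter: conj(H) / (|H|^2 + NSR), applied in the Fourier domain."""
          H = np.fft.fft2(psf, s=blurred.shape)
          G = np.fft.fft2(blurred)
          return np.real(np.fft.ifft2(np.conj(H) / (np.abs(H) ** 2 + nsr) * G))

      rng = np.random.default_rng(0)
      truth = rng.random((64, 64))
      psf = np.ones((5, 5)) / 25.0                      # hypothetical box blur
      H = np.fft.fft2(psf, s=truth.shape)
      blurred = np.real(np.fft.ifft2(H * np.fft.fft2(truth)))
      blurred += rng.normal(0.0, 0.01, truth.shape)     # additive sensor noise
      restored = wiener_deconvolve(blurred, psf)
      print("MSE:", round(float(np.mean((restored - truth) ** 2)), 5))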

  4. A dynamic dual process model of risky decision making.

    PubMed

    Diederich, Adele; Trueblood, Jennifer S

    2018-03-01

    Many phenomena in judgment and decision making are often attributed to the interaction of 2 systems of reasoning. Although these so-called dual process theories can explain many types of behavior, they are rarely formalized as mathematical or computational models. Rather, dual process models are typically verbal theories, which are difficult to conclusively evaluate or test. In the cases in which formal (i.e., mathematical) dual process models have been proposed, they have not been quantitatively fit to experimental data and are often silent when it comes to the timing of the 2 systems. In the current article, we present a dynamic dual process model framework of risky decision making that provides an account of the timing and interaction of the 2 systems and can explain both choice and response-time data. We outline several predictions of the model, including how changes in the timing of the 2 systems as well as time pressure can influence behavior. The framework also allows us to explore different assumptions about how preferences are constructed by the 2 systems as well as the dynamic interaction of the 2 systems. In particular, we examine 3 different possible functional forms of the 2 systems and 2 possible ways the systems can interact (simultaneously or serially). We compare these dual process models with 2 single process models using risky decision making data from Guo, Trueblood, and Diederich (2017). Using this data, we find that 1 of the dual process models significantly outperforms the other models in accounting for both choices and response times. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. Towards a multi-level approach to the emergence of meaning processes in living systems.

    PubMed

    Queiroz, João; El-Hani, Charbel Niño

    2006-09-01

    Any description of the emergence and evolution of different types of meaning processes (semiosis, sensu C.S.Peirce) in living systems must be supported by a theoretical framework which makes it possible to understand the nature and dynamics of such processes. Here we propose that the emergence of semiosis of different kinds can be understood as resulting from fundamental interactions in a triadically-organized hierarchical process. To grasp these interactions, we develop a model grounded on Stanley Salthe's hierarchical structuralism. This model can be applied to establish, in a general sense, a set of theoretical constraints for explaining the instantiation of different kinds of meaning processes (iconic, indexical, symbolic) in semiotic systems. We use it to model a semiotic process in the immune system, namely, B-cell activation, in order to offer insights into the heuristic role it can play in the development of explanations for specific semiotic processes.

  6. Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems

    NASA Technical Reports Server (NTRS)

    Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.

    2012-01-01

    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help compare the interactions of various operating parameters and hardware designs, which becomes extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies including: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, the Wiped-Film Rotating Disk (WFRD), and post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA). These dynamic models were developed using the Aspen Custom Modeler (registered trademark) and Aspen Plus (registered trademark) process simulation tools. The results expand upon previous work on water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  7. Lessons Learned From Developing Three Generations of Remote Sensing Science Data Processing Systems

    NASA Technical Reports Server (NTRS)

    Tilmes, Curt; Fleig, Albert J.

    2005-01-01

    The Biospheric Information Systems Branch at NASA's Goddard Space Flight Center has developed three generations of Science Investigator-led Processing Systems for use with various remote sensing instruments. The first system is used for data from the MODIS instruments flown on NASA's Earth Observing System (EOS) Terra and Aqua spacecraft, launched in 1999 and 2002 respectively. The second generation is for the Ozone Monitoring Instrument (OMI) flying on the EOS Aura spacecraft launched in 2004. We are now developing a third generation of the system for evaluation science data processing for the Ozone Mapping and Profiler Suite (OMPS) to be flown by the NPOESS Preparatory Project (NPP) in 2006. The initial system was based on large-scale proprietary hardware, operating and database systems. The current OMI system and the OMPS system being developed are based on commodity hardware, the Linux operating system, and PostgreSQL, an open source RDBMS. The new system distributes its data archive across multiple server hosts and processes jobs on multiple processor boxes. We have created several instances of this system, including one for operational processing, one for testing and reprocessing, and one for applications development and scientific analysis. Prior to receiving the first data from OMI we applied the system to reprocessing information from the Solar Backscatter Ultraviolet (SBUV) and Total Ozone Mapping Spectrometer (TOMS) instruments flown from 1978 until now. The system was able to process 25 years (108,000 orbits) of data and produce 800,000 files (400 GiB) of level 2 and level 3 products in less than a week. We will describe the lessons we have learned and the tradeoffs between system design, hardware, operating systems, operational staffing, user support, and operational procedures. During each generational phase, the system has become more generic and reusable. While the system is not currently shrink-wrapped, we believe it is to the point where it could be readily adopted, with substantial cost savings, for other similar tasks.
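
    For a feel of the reprocessing rate those figures imply, a quick back-of-the-envelope calculation, assuming the run took the full week reported as its upper bound:

      # Throughput implied by the SBUV/TOMS reprocessing figures quoted above,
      # assuming the run took a full 7 days ("less than a week" is the reported bound).
      orbits, files, gib, days = 108_000, 800_000, 400, 7
      print(f"{orbits / days:,.0f} orbits/day")
      print(f"{files / days:,.0f} files/day")
      print(f"{gib / days:.1f} GiB/day of Level 2 and Level 3 products")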

  8. High speed real-time wavefront processing system for a solid-state laser system

    NASA Astrophysics Data System (ADS)

    Liu, Yuan; Yang, Ping; Chen, Shanqiu; Ma, Lifang; Xu, Bing

    2008-03-01

    A high-speed real-time wavefront processing system for a solid-state laser beam cleanup system has been built. The system consists of a Core 2 industrial PC (IPC) running Linux and real-time Linux (RT-Linux) operating systems, a PCI image grabber, and a D/A card. More often than not, the phase aberrations of the output beam from solid-state lasers vary rapidly with intracavity thermal effects and environmental influences. To compensate for the phase aberrations of solid-state lasers successfully, a high-speed real-time wavefront processing system is presented. Compared with earlier systems, this system improves the processing speed considerably. In the new system, the acquisition of image data, the output of control voltage data, and the implementation of the reconstructor control algorithm are treated as real-time tasks in kernel space, while the display of wavefront information and the man-machine interface are treated as non-real-time tasks in user space. Parallel processing of the real-time tasks in Symmetric Multi-Processor (SMP) mode is the main strategy for improving the speed. In this paper, the performance and efficiency of this wavefront processing system are analyzed. Open-loop experimental results show that the sampling frequency of this system is up to 3300 Hz, and that the system can deal well with phase aberrations from solid-state lasers.

  9. Measuring information processing in a client with extreme agitation following traumatic brain injury using the Perceive, Recall, Plan and Perform System of Task Analysis.

    PubMed

    Nott, Melissa T; Chapparo, Christine

    2008-09-01

    Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance is assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships between task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation. The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance over short intervals and can provide direct guidance to occupational therapy intervention to improve task embedded information processing by categorising errors under four stages of an information processing model: Perceive, Recall, Plan and Perform.

  10. The Gemini Recipe System: A Dynamic Workflow for Automated Data Reduction

    NASA Astrophysics Data System (ADS)

    Labrie, K.; Hirst, P.; Allen, C.

    2011-07-01

    Gemini's next generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. Building on concepts that originated in ORAC-DR, a data reduction process is defined in a Recipe written in a science (as opposed to computer) oriented language, and consists of a sequence of data reduction steps called Primitives. The Primitives are written in Python and can be launched from the PyRAF user interface by users wishing for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run within both the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control allowing for decisions regarding processing and calibration to be made automatically, based on the pixel and the metadata properties of the dataset at the stage in processing where the decision is being made, and the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines. All observatory or instrument specific definitions are isolated from the core of the AstroData system and distributed in external configuration packages that define a lexicon including classifications, uniform metadata elements, and transformations.
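
    The recipe-of-primitives pattern described above can be illustrated with a minimal sketch; the primitive names, the recipe table, and the skip rule below are illustrative stand-ins, not the Gemini Recipe System API.

      # Illustrative sketch of a recipe composed of primitives (not the Gemini API).
      from typing import Callable, Dict, List

      Primitive = Callable[[dict], dict]   # a primitive transforms a dataset "context"

      def prepare(ctx: dict) -> dict:
          ctx["history"].append("prepare")
          return ctx

      def subtract_bias(ctx: dict) -> dict:
          ctx["history"].append("subtract_bias")
          return ctx

      def flat_correct(ctx: dict) -> dict:
          # Dynamic flow control: skip the step if no flat is available in the context.
          step = "flat_correct" if ctx.get("flat") else "flat_correct(skipped)"
          ctx["history"].append(step)
          return ctx

      RECIPES: Dict[str, List[Primitive]] = {
          "quality_assessment": [prepare, subtract_bias],
          "science_reduction": [prepare, subtract_bias, flat_correct],
      }

      def run_recipe(name: str, ctx: dict) -> dict:
          for primitive in RECIPES[name]:
              ctx = primitive(ctx)
          return ctx

      result = run_recipe("science_reduction", {"history": [], "flat": None})
      print(result["history"])   # ['prepare', 'subtract_bias', 'flat_correct(skipped)']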

  11. General overview of an integrated lunar oxygen production/brickmaking system

    NASA Technical Reports Server (NTRS)

    Altemir, D. A.

    1993-01-01

    On the moon, various processing systems would compete for the same resources, most notably power, raw materials, and perhaps human attention. Therefore, it may be advantageous for two or more processes to be combined such that the integrated system would require fewer resources than separate systems working independently. The synergistic marriage of two such processes--lunar oxygen production and the manufacture of bricks from sintered lunar regolith--is considered.

  12. Systems Engineering Processes Applied to Ground Vehicle Integration at US Army Tank Automotive Research, Development, and Engineering Center (TARDEC)

    DTIC Science & Technology

    2010-08-19

    Report documentation page fragments (UNCLASSIFIED): distribution unlimited; presented at NDIA's Ground Vehicle Systems Engineering and Technology Symposium (GVSETS), 17-22.

  13. Quality Assurance By Laser Scanning And Imaging Techniques

    NASA Astrophysics Data System (ADS)

    Schmalfuß, Harald J.; Schinner, Karl Ludwig

    1989-03-01

    Laser scanning systems are well established in the world of fast industrial in-process quality inspection. The materials inspected by laser scanning systems are, for example, "endless" sheets of steel, paper, textile, film, or foil. The web width varies from 50 mm up to 5000 mm or more. The web speed depends strongly on the production process and can reach several hundred meters per minute. The continuous data flow in any one of the channels of the optical receiving system exceeds ten megapixels per second. It is therefore clear that the electronic evaluation system has to process these data streams in real time and that no image storage is possible. However, in some situations (e.g., first installation of the system or a change of the defect classification) it would be very helpful to be able to view the original, i.e., unprocessed, sensor data. We first show the basic setup of a standard laser scanning system. We then introduce a large image memory especially designed for the needs of high-speed inspection sensors. This image memory cooperates with the standard on-line evaluation electronics and therefore allows an easy comparison between processed and non-processed data. We discuss the basic system structure and show the first industrial results.

  14. EARSEC SAR processing system

    NASA Astrophysics Data System (ADS)

    Protheroe, Mark; Sloggett, David R.; Sieber, Alois J.

    1994-12-01

    Traditionally, the production of high-quality Synthetic Aperture Radar imagery has been an area where a potential user had to spend large amounts of money either on the bespoke development of a processing chain dedicated to his requirements or on the purchase of a dedicated hardware platform adapted with accelerator boards and enhanced memory management. Whichever option the user adopted, there were limitations imposed by the desire for realistic throughput in data load and time. The user had to make a choice, early in the purchase, between a system that adopted innovative algorithmic manipulation to limit the processing time and the purchase of expensive hardware. The former limits the quality of the product, while the latter excludes the user from any visibility into the processing chain. Clearly there was a need for a SAR processing architecture that gave the user a choice of methodology for a particular processing sequence, allowing him to decide on either a quick (lower-quality) product or a slower, more detailed (higher-quality) product, without having to change the algorithmic base of his processor or the hardware platform. The European Commission, through the Advanced Techniques unit of the Joint Research Centre (JRC) Institute for Remote Sensing at Ispra in Italy, recognizing the limitations of current processing capabilities, initiated its own program to build airborne SAR and Electro-Optical (EO) sensor systems. This program is called the European Airborne Remote Sensing Capabilities (EARSEC) program. This paper describes the processing system developed for the airborne SAR sensor. It considers the requirements for the system and the design of the EARSEC Airborne SAR Processing System. It highlights the development of an open SAR processing architecture in which users have full access to the intermediate products that arise from each of the major processing stages. It also describes the main processing stages in the overall architecture and illustrates the results of each of the key stages in the processor.

  15. Machine vision process monitoring on a poultry processing kill line: results from an implementation

    NASA Astrophysics Data System (ADS)

    Usher, Colin; Britton, Dougl; Daley, Wayne; Stewart, John

    2005-11-01

    Researchers at the Georgia Tech Research Institute designed a vision inspection system for poultry kill line sorting with the potential for process control at various points throughout a processing facility. This system has been successfully operating in a plant for over two and a half years and has been shown to provide multiple benefits. With the introduction of HACCP-Based Inspection Models (HIMP), the opportunity for automated inspection systems to emerge as viable alternatives to human screening is promising. As more plants move to HIMP, these systems have great potential for augmenting a processing facility's visual inspection process. This will help to maintain a more consistent and potentially higher throughput while helping the plant remain within the HIMP performance standards. In recent years, several vision systems have been designed to analyze the exterior of a chicken and are capable of identifying Food Safety 1 (FS1) type defects under HIMP regulatory specifications. This means that a reliable vision system can be used in a processing facility as a carcass sorter to automatically detect and divert product that is not suitable for further processing. This improves the evisceration line efficiency by creating a smaller set of features that human screeners are required to identify, which can reduce the required number of screeners or allow for faster processing line speeds. In addition to identifying FS1 category defects, the Georgia Tech vision system can also identify multiple "Other Consumer Protection" (OCP) category defects such as skin tears, bruises, broken wings, and cadavers. Monitoring this data in an almost real-time system allows the processing facility to address anomalies as soon as they occur. The Georgia Tech vision system can record minute-by-minute averages of the following defects: septicemia/toxemia, cadaver, over-scald, bruises, skin tears, and broken wings. In addition to these defects, the system also records length and width information for the entire chicken and for different parts such as the breast, the legs, the wings, and the neck. The system also records average color and miss-hung birds, which can cause problems in further processing. Other relevant production information is recorded as well, including truck arrival and offloading times, catching crew and flock serviceman data, the grower, the breed of chicken, and the number of dead-on-arrival (DOA) birds per truck. Several interesting observations from the Georgia Tech vision system, which has been installed in a poultry processing plant for several years, are presented. Trend analysis has been performed on the performance of the catching crews and flock servicemen, and on the results for the processed chickens as they relate to bird dimensions and equipment settings in the plant. The results have allowed researchers and plant personnel to identify potential areas for improvement in the processing operation, which should result in improved efficiency and yield.

  16. System Analysis in Instructional Programming: The Initial Phases of the Program Construction Process.

    ERIC Educational Resources Information Center

    Bjerstedt, Ake

    A three-volume series describes the construction of a self-instructional system as a work process with three main phases: system analysis, system synthesis, and system modification and evaluation. After an introductory discussion of some basic principles of instructional programing, this first volume focuses on the system analysis phase,…

  17. Bioregenerative technologies for waste processing and resource recovery in advanced space life support system

    NASA Technical Reports Server (NTRS)

    Chamberland, Dennis

    1991-01-01

    The Controlled Ecological Life Support System (CELSS) for producing oxygen, water, and food in space will require an interactive facility to process wastes and return them to the system as resources. This paper examines the bioregenerative technologies for waste processing and resource recovery considered for a CELSS Resource Recovery system. The components of this system consist of a series of biological reactors to treat the liquid and solid material fractions, in which the aerobic and anaerobic reactors are combined in a block called the Combined Reactor Equipment (CORE) block. The CORE block accepts the human wastes, kitchen wastes, inedible refractory plant materials, grey waters from the CELSS system, and aquaculture solids, and processes these materials in either aerobic or anaerobic reactors depending on the desired product and the rates required by the integrated system.

  18. Some technical considerations on the evolution of the IBIS system. [Image Based Information System

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Zobrist, A. L.

    1982-01-01

    In connection with work related to the use of earth-resources images, it became apparent by 1974 that certain system improvements were necessary for the efficient processing of digital data. To resolve this dilemma, Billingsley and Bryant (1975) proposed the use of image processing technology. Bryant and Zobrist (1976) reported the development of the Image Based Information System (IBIS) as a subset of an overall Video Image Communication and Retrieval (VICAR) image processing system. A description of IBIS is presented, and its employment in connection with advanced applications is discussed. It is concluded that several important lessons have been learned from the development of IBIS. The development of a flexible system such as IBIS is found to rest upon the prior development of a general-purpose image processing system, such as VICAR.

  19. Real-time system for imaging and object detection with a multistatic GPR array

    DOEpatents

    Paglieroni, David W; Beer, N Reginald; Bond, Steven W; Top, Philip L; Chambers, David H; Mast, Jeffrey E; Donetti, John G; Mason, Blake C; Jones, Steven M

    2014-10-07

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.
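
    The final step named in the abstract, picking peaks out of the post-processed energy map, can be sketched minimally as a thresholded local-maximum search; the threshold and the 8-neighbour rule here are illustrative choices, not the patented method.

      import numpy as np

      def detect_peaks(energy: np.ndarray, threshold: float):
          """Return (row, col) of cells that exceed `threshold` and all 8 neighbours."""
          padded = np.pad(energy, 1, mode="constant", constant_values=-np.inf)
          centre = padded[1:-1, 1:-1]
          neighbours = [padded[r:r + energy.shape[0], c:c + energy.shape[1]]
                        for r in range(3) for c in range(3) if (r, c) != (1, 1)]
          is_peak = (centre > threshold) & np.all([centre > n for n in neighbours], axis=0)
          return np.argwhere(is_peak)

      # Illustrative energy frame with one buried "object" response.
      frame = np.random.default_rng(2).random((64, 64)) * 0.2
      frame[40, 25] = 1.0
      print(detect_peaks(frame, threshold=0.8))   # -> [[40 25]]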

  20. 40 CFR 63.149 - Control requirements for certain liquid streams in open systems within a chemical manufacturing...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... streams in open systems within a chemical manufacturing process unit. 63.149 Section 63.149 Protection of... open systems within a chemical manufacturing process unit. (a) The owner or operator shall comply with... Air Pollutants From the Synthetic Organic Chemical Manufacturing Industry for Process Vents, Storage...

  1. The Policy-Making Process of the State University System of Florida.

    ERIC Educational Resources Information Center

    Sullivan, Sandra M.

    The policy-making process of the State University System of Florida is described using David Easton's model of a political system as the conceptual framework. Two models describing the policy-making process were developed from personal interviews with the primary participants in the governance structure and from three case studies of policy…

  2. 75 FR 38118 - In the Matter of Certain Electronic Devices With Image Processing Systems, Components Thereof...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-01

    ... With Image Processing Systems, Components Thereof, and Associated Software; Notice of Investigation..., and associated software by reason of infringement of certain claims of U.S. Patent Nos. 7,043,087... processing systems, components thereof, and associated software that infringe one or more of claims 1, 6, and...

  3. 76 FR 27114 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-10

    ... CRD Processing Fee, the NASD Annual System Processing Fee, and the NYSE Arca Transfer/Re-license... Fees, the NASD Annual System Processing Fee, and the NYSE Arca Transfer/Re-license Individual Fee. Fees... Options Regulatory Surveillance Authority (``ORSA'') national market system plan and in doing so shares...

  4. Prototyping an automated lumber processing system

    Treesearch

    Powsiri Klinkhachorn; Ravi Kothari; Henry A. Huber; Charles W. McMillin; K. Mukherjee; V. Barnekov

    1993-01-01

    The Automated Lumber Processing System (ALPS) is a multi-disciplinary continuing effort directed toward increasing the yield obtained from hardwood lumber boards during their remanufacture into secondary products (furniture, etc.). ALPS proposes a nondestructive vision system to scan a board for its dimensions and the location and extent of surface defects on...

  5. Sleep, Off-Line Processing, and Vocal Learning

    ERIC Educational Resources Information Center

    Margoliash, Daniel; Schmidt, Marc F.

    2010-01-01

    The study of song learning and the neural song system has provided an important comparative model system for the study of speech and language acquisition. We describe some recent advances in the bird song system, focusing on the role of off-line processing including sleep in processing sensory information and in guiding developmental song…

  6. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  7. Dynamic control and information processing in chemical reaction systems by tuning self-organization behavior

    NASA Astrophysics Data System (ADS)

    Lebiedz, Dirk; Brandt-Pollmann, Ulrich

    2004-09-01

    Specific external control of chemical reaction systems, and both dynamic control and signal processing as central functions in biochemical reaction systems, are important issues in modern nonlinear science. For example, nonlinear input-output behavior and its regulation are crucial for maintaining the life process, which requires extensive communication between cells and their environment. An important question is how the dynamical behavior of biochemical systems is controlled and how they process information transmitted by incoming signals. From a more general point of view, external forcing of complex chemical reaction processes is also important in many application areas, ranging from chemical engineering to biomedicine. In order to study such control issues numerically, we choose a well-characterized chemical system, the CO oxidation on Pt(110), which is interesting in its own right as an externally forced chemical oscillator model. We show numerically that tuning of temporal self-organization by input signals in this simple nonlinear chemical reaction exhibiting oscillatory behavior can in principle be exploited both for specific external control of the dynamical system behavior and for processing of complex information.
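
    The surface-reaction model itself is not given in the abstract; purely as a stand-in for an externally forced chemical oscillator, the sketch below drives a van der Pol oscillator with a sinusoidal input and compares the dominant period with and without forcing. All parameter values are illustrative, not from the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      def forced_van_der_pol(t, state, mu, amp, freq):
          """Stand-in forced oscillator: dx/dt = y,
          dy/dt = mu*(1 - x^2)*y - x + amp*sin(2*pi*freq*t)."""
          x, y = state
          return [y, mu * (1.0 - x**2) * y - x + amp * np.sin(2 * np.pi * freq * t)]

      def dominant_period(x, t):
          """Crude period estimate from upward zero crossings of the oscillation."""
          crossings = t[1:][(x[:-1] < 0) & (x[1:] >= 0)]
          return np.diff(crossings).mean()

      t_eval = np.linspace(0, 200, 20000)
      for amp in (0.0, 0.8):   # unforced vs. forced input signal
          sol = solve_ivp(forced_van_der_pol, (0, 200), [1.0, 0.0],
                          args=(2.0, amp, 0.2), t_eval=t_eval, rtol=1e-8)
          print(f"forcing amplitude {amp}: period ~ {dominant_period(sol.y[0], sol.t):.2f}")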

  8. Robust fusion-based processing for military polarimetric imaging systems

    NASA Astrophysics Data System (ADS)

    Hickman, Duncan L.; Smith, Moira I.; Kim, Kyung Su; Choi, Hyun-Jin

    2017-05-01

    Polarisation information within a scene can be exploited in military systems to give enhanced automatic target detection and recognition (ATD/R) performance. However, the performance gain achieved is highly dependent on factors such as the geometry, viewing conditions, and the surface finish of the target. Such performance sensitivities are highly undesirable in many tactical military systems where operational conditions can vary significantly and rapidly during a mission. Within this paper, a range of processing architectures and fusion methods is considered in terms of their practical viability and operational robustness for systems requiring ATD/R. It is shown that polarisation information can give useful performance gains but, to retain system robustness, the introduction of polarimetric processing should be done in such a way as not to compromise other discriminatory scene information in the spectral and spatial domains. The analysis concludes that polarimetric data can be effectively integrated with conventional intensity-based ATD/R either by adapting the ATD/R processing function based on the scene polarisation or by detection-level fusion. Both of these approaches avoid the introduction of processing bottlenecks and limit the impact of processing on system latency.

  9. Design and fabrication of a glovebox for the Plasma Hearth Process radioactive bench-scale system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahlquist, D.R.

    This paper presents some of the design considerations and fabrication techniques for building a glovebox for the Plasma Hearth Process (PHP) radioactive bench-scale system. The PHP radioactive bench-scale system uses a plasma torch to process a variety of radioactive materials into a final vitrified waste form. The processed waste will contain plutonium and trace amounts of other radioactive materials. The glovebox used in this system is located directly below the plasma chamber and is called the Hearth Handling Enclosure (HHE). The HHE is designed to maintain a confinement boundary between the processed waste and the operator. Operations that take place inside the HHE include raising and lowering the hearth using a hydraulic lift table, transporting the hearth within the HHE using an overhead monorail and hoist system, sampling and disassembly of the processed waste and hearth, weighing the hearth, rebuilding a hearth, and sampling HEPA filters. The PHP radioactive bench-scale system is located at the TREAT facility at Argonne National Laboratory-West in Idaho Falls, Idaho.

  10. AOIPS - An interactive image processing system. [Atmospheric and Oceanic Information Processing System

    NASA Technical Reports Server (NTRS)

    Bracken, P. A.; Dalton, J. T.; Quann, J. J.; Billingsley, J. B.

    1978-01-01

    The Atmospheric and Oceanographic Information Processing System (AOIPS) was developed to help applications investigators perform required interactive image data analysis rapidly and to eliminate the inefficiencies and problems associated with batch operation. This paper describes the configuration and processing capabilities of AOIPS and presents unique subsystems for displaying, analyzing, storing, and manipulating digital image data. Applications of AOIPS to research investigations in meteorology and earth resources are featured.

  11. Image-Processing Software For A Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.

    1992-01-01

    Concurrent Image Processing Executive (CIPE) is a software system intended for developing and using image-processing application programs in a concurrent computing environment. Designed to shield the programmer from the complexities of concurrent-system architecture, it provides an interactive image-processing environment for the end user. CIPE utilizes the architectural characteristics of a particular concurrent system to maximize efficiency while preserving architectural independence for the user and programmer. CIPE runs on a Mark-IIIfp 8-node hypercube computer and an associated SUN-4 host computer.

  12. Low-temperature plasma technology as part of a closed-loop resource management system

    NASA Technical Reports Server (NTRS)

    Hetland, Melanie D.; Rindt, John R.; Jones, Frank A.; Sauer, Randal S.

    1990-01-01

    The results of this testing indicate that the agitated low-temperature plasma reactor system successfully converted carbon, hydrogen, and nitrogen into gaseous products at residence times about ten times shorter than those achieved by stationary processing. The inorganic matrix present was virtually unchanged by the processing technique. It was concluded that this processing technique is feasible for use as part of a closed-loop resource management system.

  13. System Engineering Processes at Kennedy Space Center for Development of the SLS and Orion Launch Systems

    NASA Technical Reports Server (NTRS)

    Schafer, Eric J.

    2012-01-01

    There are over 40 subsystems being developed for the future SLS and Orion launch systems at Kennedy Space Center. These subsystems, developed by the Kennedy Space Center Engineering Directorate, follow a comprehensive design process which requires several different product deliverables during each phase of each subsystem. This paper describes the process and gives an example of where it has been applied.

  14. Decide, design, and dewater de waste: A blueprint from Fitzpatrick

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robert, D.E.

    1994-04-01

    Using a different process to clean concentrated waste tanks at the James A. FitzPatrick nuclear power plant in New York saved nearly half a million dollars. The plan essentially allowed processing concentrator bottoms as waste sludge (solidification versus dewatering) that could still meet burial ground requirements. The process reduced the volume from 802.2 to 55 cubic feet. This resin throwaway system eliminated chemicals in the radwaste systems and was designed to ease pressure on the radwaste processing system, reduce waste, and improve plant chemistry. This article discusses general aspects of the process.

  15. Generic Health Management: A System Engineering Process Handbook Overview and Process

    NASA Technical Reports Server (NTRS)

    Wilson, Moses Lee; Spruill, Jim; Hong, Yin Paw

    1995-01-01

    Health Management, a systems engineering process, is one of the processes, techniques, and technologies used to define, design, analyze, build, verify, and operate a system from the viewpoint of preventing, or minimizing, the effects of failure or degradation. It supports all ground and flight elements during manufacturing, refurbishment, integration, and operation through the combined use of hardware, software, and personnel. This document will integrate the Health Management Processes (six phases) into five phases in such a manner that it is never a stand-alone task or effort that separately defines independent work functions.

  16. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

    Business in the enterprise is so closely related to the information system that business activities are difficult to carry out without it. A system design technique that properly takes the business process into account and enables quick system development is therefore needed. In addition, demands on development cost are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology for modeling business activities as business processes and visualizing them to improve business efficiency. However, a general methodology for developing an information system from the analysis results of BPM does not exist, and only a few development cases have been reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline that supports consistency and efficiency of development, and a framework that enables the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.

  17. Oxygen Compatibility Assessment of Components and Systems

    NASA Technical Reports Server (NTRS)

    Stoltzfus, Joel; Sparks, Kyle

    2010-01-01

    Fire hazards are inherent in oxygen systems, and there is a storied history of fires in rocket engine propulsion components. Detecting and mitigating these fire hazards requires careful, detailed, and thorough analyses applied during the design process. The oxygen compatibility assessment (OCA) process designed by NASA Johnson Space Center (JSC) White Sands Test Facility (WSTF) can be used to determine the presence of fire hazards in oxygen systems and the likelihood of a fire. This process may be used both as a design guide and during the approval process to ensure proper design features and material selection. The procedure for performing an OCA is a structured, step-by-step process to determine the most severe operating conditions; assess the flammability of the system materials at the use conditions; evaluate the presence and efficacy of ignition mechanisms; assess the potential for a fire to breach the system; and determine the reaction effect (the potential loss of life, mission, and system functionality as the result of a fire). This process should be performed for each component in a system. The results of each component assessment, and of the overall system assessment, should be recorded in a report that can be used in the short term to communicate hazards and their mitigation and to aid in system/component development and, in the long term, to resolve anomalies that occur during engine testing and operation.

  18. An Attachable Electromagnetic Energy Harvester Driven Wireless Sensing System Demonstrating Milling-Processes and Cutter-Wear/Breakage-Condition Monitoring.

    PubMed

    Chung, Tien-Kan; Yeh, Po-Chen; Lee, Hao; Lin, Cheng-Mao; Tseng, Chia-Yung; Lo, Wen-Tuan; Wang, Chieh-Min; Wang, Wen-Chin; Tu, Chi-Jen; Tasi, Pei-Yuan; Chang, Jui-Wen

    2016-02-23

    An attachable electromagnetic-energy-harvester driven wireless vibration-sensing system for monitoring milling-processes and cutter-wear/breakage-conditions is demonstrated. The system includes an electromagnetic energy harvester, three single-axis Micro Electro-Mechanical Systems (MEMS) accelerometers, a wireless chip module, and corresponding circuits. The harvester consisting of magnets with a coil uses electromagnetic induction to harness mechanical energy produced by the rotating spindle in milling processes and consequently convert the harnessed energy to electrical output. The electrical output is rectified by the rectification circuit to power the accelerometers and wireless chip module. The harvester, circuits, accelerometer, and wireless chip are integrated as an energy-harvester driven wireless vibration-sensing system. Therefore, this completes a self-powered wireless vibration sensing system. For system testing, a numerical-controlled machining tool with various milling processes is used. According to the test results, the system is fully self-powered and able to successfully sense vibration in the milling processes. Furthermore, by analyzing the vibration signals (i.e., through analyzing the electrical outputs of the accelerometers), criteria are successfully established for the system for real-time accurate simulations of the milling-processes and cutter-conditions (such as cutter-wear conditions and cutter-breaking occurrence). Due to these results, our approach can be applied to most milling and other machining machines in factories to realize more smart machining technologies.
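
    The abstract does not give the actual criteria, so purely as an illustration of the kind of spectral feature such a monitor could threshold, here is a sketch comparing band-limited vibration levels for a simulated healthy cut and one with an added harmonic; the sampling rate, frequencies, and alarm level are all invented for the example.

      import numpy as np

      def band_level(signal: np.ndarray, fs: float, f_lo: float, f_hi: float) -> float:
          """Band-limited spectral magnitude (arbitrary units) within [f_lo, f_hi] Hz."""
          spectrum = np.fft.rfft(signal)
          freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
          band = (freqs >= f_lo) & (freqs <= f_hi)
          return float(np.sqrt(np.mean(np.abs(spectrum[band]) ** 2)) / signal.size)

      # Illustrative check: a healthy-cut signal vs. one with an extra 360 Hz harmonic.
      fs = 5000.0
      t = np.arange(0, 1.0, 1.0 / fs)
      rng = np.random.default_rng(3)
      healthy = 0.2 * np.sin(2 * np.pi * 120 * t) + 0.05 * rng.standard_normal(t.size)
      worn = healthy + 0.3 * np.sin(2 * np.pi * 360 * t)
      threshold = 5e-3   # illustrative alarm level; would be calibrated on real cuts
      for name, sig in (("healthy", healthy), ("worn", worn)):
          level = band_level(sig, fs, 300, 400)
          print(f"{name}: band level {level:.4f}  alarm={level > threshold}")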

  19. An Attachable Electromagnetic Energy Harvester Driven Wireless Sensing System Demonstrating Milling-Processes and Cutter-Wear/Breakage-Condition Monitoring

    PubMed Central

    Chung, Tien-Kan; Yeh, Po-Chen; Lee, Hao; Lin, Cheng-Mao; Tseng, Chia-Yung; Lo, Wen-Tuan; Wang, Chieh-Min; Wang, Wen-Chin; Tu, Chi-Jen; Tasi, Pei-Yuan; Chang, Jui-Wen

    2016-01-01

    An attachable electromagnetic-energy-harvester driven wireless vibration-sensing system for monitoring milling-processes and cutter-wear/breakage-conditions is demonstrated. The system includes an electromagnetic energy harvester, three single-axis Micro Electro-Mechanical Systems (MEMS) accelerometers, a wireless chip module, and corresponding circuits. The harvester consisting of magnets with a coil uses electromagnetic induction to harness mechanical energy produced by the rotating spindle in milling processes and consequently convert the harnessed energy to electrical output. The electrical output is rectified by the rectification circuit to power the accelerometers and wireless chip module. The harvester, circuits, accelerometer, and wireless chip are integrated as an energy-harvester driven wireless vibration-sensing system. Therefore, this completes a self-powered wireless vibration sensing system. For system testing, a numerical-controlled machining tool with various milling processes is used. According to the test results, the system is fully self-powered and able to successfully sense vibration in the milling processes. Furthermore, by analyzing the vibration signals (i.e., through analyzing the electrical outputs of the accelerometers), criteria are successfully established for the system for real-time accurate simulations of the milling-processes and cutter-conditions (such as cutter-wear conditions and cutter-breaking occurrence). Due to these results, our approach can be applied to most milling and other machining machines in factories to realize more smart machining technologies. PMID:26907297

  20. Simple, Scalable, Script-Based Science Processor (S4P)

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Vollmer, Bruce; Berrick, Stephen; Mack, Robert; Pham, Long; Zhou, Bryan; Wharton, Stephen W. (Technical Monitor)

    2001-01-01

    The development and deployment of data processing systems to process Earth Observing System (EOS) data has proven to be costly and prone to technical and schedule risk. Integration of science algorithms into a robust operational system has been difficult. The core processing system, based on commercial tools, has demonstrated limitations at the rates needed to produce the several terabytes per day for EOS, primarily due to job management overhead. This has motivated an evolution in the EOS Data Information System toward a more distributed one incorporating Science Investigator-led Processing Systems (SIPS). As part of this evolution, the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC) has developed a simplified processing system to accommodate the increased load expected with the advent of reprocessing and the launch of a second satellite. This system, the Simple, Scalable, Script-based Science Processor (S4P), may also serve as a resource for future SIPS. The current EOSDIS Core System was designed to be general, resulting in a large, complex mix of commercial and custom software. In contrast, many simpler systems, such as the EROS Data Center AVHRR 1KM system, rely on a simple directory structure to drive processing, with directories representing different stages of production. The system passes input data to a directory, and the output data is placed in a "downstream" directory. The GES DAAC's Simple, Scalable, Script-based Science Processor is based on the latter concept, but with modifications to allow varied science algorithms and improve portability. It uses a factory assembly-line paradigm: when work orders arrive at a station, an executable is run, and output work orders are sent to downstream stations. The stations are implemented as UNIX directories, while work orders are simple ASCII files. The core S4P infrastructure consists of a Perl program called stationmaster, which detects newly arrived work orders and forks a job to run the appropriate executable (registered in a configuration file for that station). Although S4P is written in Perl, the executables associated with a station can be any program that can be run from the command line, i.e., non-interactively. An S4P instance is typically monitored using a simple graphical user interface. However, the reliance of S4P on UNIX files and directories also allows visibility into the state of stations and jobs using standard operating system commands, permitting remote monitoring and control over low-bandwidth connections. S4P is being used as the foundation for several small- to medium-size systems for data mining, on-demand subsetting, processing of direct broadcast Moderate Resolution Imaging Spectroradiometer (MODIS) data, and quick-response MODIS processing. It has also been used to implement a large-scale system to process MODIS Level 1 and Level 2 Standard Products, which will ultimately process close to 2 TB/day.
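
    A minimal sketch of the station pattern described above, with a directory as the station and ASCII files as work orders; the directory names, the "DO." naming convention as used here, the handler, and the work-order contents are illustrative stand-ins, and the real stationmaster is a Perl program with considerably more bookkeeping.

      # Minimal S4P-style station: poll a directory for work-order files, run a
      # handler for each, and drop an output work order in the downstream station.
      import os
      import time

      STATION_DIR = "stations/subset"          # this station's inbox of work orders
      DOWNSTREAM_DIR = "stations/package"      # next station on the assembly line

      def handle_work_order(path: str) -> str:
          """Stand-in for the station's registered executable."""
          with open(path) as f:
              request = f.read().strip()
          return f"PROCESSED {request}\n"

      def poll_once() -> None:
          for name in sorted(os.listdir(STATION_DIR)):
              if not name.startswith("DO."):    # pending-order prefix assumed here
                  continue
              order = os.path.join(STATION_DIR, name)
              output = handle_work_order(order)
              with open(os.path.join(DOWNSTREAM_DIR, name), "w") as f:
                  f.write(output)
              os.remove(order)                  # consumed; downstream picks it up next

      if __name__ == "__main__":
          os.makedirs(STATION_DIR, exist_ok=True)
          os.makedirs(DOWNSTREAM_DIR, exist_ok=True)
          with open(os.path.join(STATION_DIR, "DO.example.wo"), "w") as f:
              f.write("granule=EXAMPLE_GRANULE_ID\n")   # illustrative payload
          while os.listdir(STATION_DIR):
              poll_once()
              time.sleep(0.1)
          print(os.listdir(DOWNSTREAM_DIR))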

  1. 40 CFR 68.65 - Process safety information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to the technology of the process, and information pertaining to the equipment in the process. (b...) Information pertaining to the technology of the process. (1) Information concerning the technology of the...) Electrical classification; (iv) Relief system design and design basis; (v) Ventilation system design; (vi...

  2. 40 CFR 68.65 - Process safety information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to the technology of the process, and information pertaining to the equipment in the process. (b...) Information pertaining to the technology of the process. (1) Information concerning the technology of the...) Electrical classification; (iv) Relief system design and design basis; (v) Ventilation system design; (vi...

  3. 40 CFR 68.65 - Process safety information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... to the technology of the process, and information pertaining to the equipment in the process. (b...) Information pertaining to the technology of the process. (1) Information concerning the technology of the...) Electrical classification; (iv) Relief system design and design basis; (v) Ventilation system design; (vi...

  4. Simulating the decentralized processes of the human immune system in a virtual anatomy model.

    PubMed

    Sarpe, Vladimir; Jacob, Christian

    2013-01-01

    Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments namely, (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, that can communicate across multiple, local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.

  5. Effective Application of a Quality System in the Donation Process at Hospital Level.

    PubMed

    Trujnara, M; Czerwiński, J; Osadzińska, J

    2016-06-01

    This article describes the application of a quality system at the hospital level at the Multidisciplinary Hospital in Warsaw-Międzylesie in Poland. A quality system of hospital procedures (in accordance with ISO 9001:2008) regarding the donation process, from the identification of a possible donor to the retrieval of organs, was applied there in 2014. Seven independent documents on hospital procedures were designed to cover the entire donation process. The number of donors identified increased after the application of the quality system. The reason for this increase is, above all, the cooperation of the well-trained team of specialists who have been engaged in the donation process for many years, but formal procedures certainly organize the process and make it easier. Copyright © 2016. Published by Elsevier Inc.

  6. Data processing and optimization system to study prospective interstate power interconnections

    NASA Astrophysics Data System (ADS)

    Podkovalnikov, Sergei; Trofimov, Ivan; Trofimov, Leonid

    2018-01-01

    The paper presents a data processing and optimization system for studying and making rational decisions on the formation of interstate electric power interconnections, with the aim of increasing the effectiveness of their operation and expansion. The technologies for building and integrating the data processing and optimization system, including an object-oriented database and the ORIRES predictive mathematical model for optimizing the expansion of electric power systems, are described. The technology for collecting and pre-processing unstructured data from various sources and loading it into the object-oriented database, as well as for processing and presenting information in the GIS system, is also described. One approach to the graphical visualization of the optimization model results is illustrated by the example of calculating an option for expansion of the South Korean electric power grid.

  7. Process for Selecting System Level Assessments for Human System Technologies

    NASA Technical Reports Server (NTRS)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues at the system or component level. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.

  8. Analysis of dynamic system response to product random processes

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1978-01-01

    The response of dynamic systems to the product of two independent Gaussian random processes is developed by use of the Fokker-Planck and associated moment equations. The development is applied to the amplitude modulated process which is used to model atmospheric turbulence in aeronautical applications. The exact solution for the system response is compared with the solution obtained by the quasi-steady approximation which omits the dynamic properties of the random amplitude modulation. The quasi-steady approximation is valid as a limiting case of the exact solution for the dynamic response of linear systems to amplitude modulated processes. In the nonlimiting case the quasi-steady approximation can be invalid for dynamic systems with low damping.
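
    As a numerical companion to the abstract, a small Monte Carlo sketch, assuming a lightly damped second-order system driven by the product of two independent first-order Gauss-Markov processes; it prints the simulated output variance alongside the quasi-steady estimate in which the amplitude modulation is treated as frozen. All parameter values are illustrative.

      import numpy as np

      def ou_process(n, dt, tau, sigma, rng):
          """First-order Gauss-Markov (Ornstein-Uhlenbeck) sample path."""
          x = np.empty(n)
          x[0] = sigma * rng.standard_normal()
          a = np.exp(-dt / tau)
          b = sigma * np.sqrt(1.0 - a * a)
          for k in range(1, n):
              x[k] = a * x[k - 1] + b * rng.standard_normal()
          return x

      def second_order_response(u, dt, wn, zeta):
          """Euler integration of x'' + 2*zeta*wn*x' + wn^2*x = u."""
          x = v = 0.0
          out = np.empty(u.size)
          for k, uk in enumerate(u):
              acc = uk - 2.0 * zeta * wn * v - wn * wn * x
              v += acc * dt
              x += v * dt
              out[k] = x
          return out

      rng = np.random.default_rng(4)
      dt, n = 0.01, 200_000
      g = ou_process(n, dt, tau=1.0, sigma=1.0, rng=rng)    # carrier process
      a = ou_process(n, dt, tau=0.2, sigma=1.0, rng=rng)    # fast amplitude modulation
      exact = second_order_response(a * g, dt, wn=2.0, zeta=0.05)
      carrier_only = second_order_response(g, dt, wn=2.0, zeta=0.05)
      quasi_steady_var = np.mean(a**2) * np.var(carrier_only)  # modulation treated as frozen
      print("simulated output variance:", np.var(exact))
      print("quasi-steady estimate:    ", quasi_steady_var)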

  9. Integrated Dynamic Process Planning and Scheduling in Flexible Manufacturing Systems via Autonomous Agents

    NASA Astrophysics Data System (ADS)

    Nejad, Hossein Tehrani Nik; Sugimura, Nobuhiro; Iwamura, Koji; Tanimizu, Yoshitaka

    Process planning and scheduling are important manufacturing planning activities which deal with resource utilization and the time span of manufacturing operations. The process plans and schedules generated in the planning phase must often be modified in the execution phase due to disturbances in the manufacturing systems. This paper deals with a multi-agent architecture for an integrated and dynamic process planning and scheduling system for multiple jobs. A negotiation protocol is discussed for generating the process plans and the schedules of the manufacturing resources and the individual jobs, dynamically and incrementally, based on alternative manufacturing processes. The alternative manufacturing processes are represented by the process plan networks discussed in a previous paper, and suitable process plans and schedules are searched for and generated to cope with both the dynamic status and the disturbances of the manufacturing systems. We combine heuristic search algorithms over the process plan networks with the negotiation protocols in order to generate suitable process plans and schedules in the dynamic manufacturing environment. Simulation software has been developed to carry out case studies aimed at verifying the performance of the proposed multi-agent architecture.

  10. The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox

    NASA Astrophysics Data System (ADS)

    Harris, A. T., III; Goodman, J.; Justice, B.

    2014-12-01

    As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.

  11. Fine grained event processing on HPCs with the ATLAS Yoda system

    NASA Astrophysics Data System (ADS)

    Calafiura, Paolo; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; Van Gemmeren, Peter; Wenaus, Torre

    2015-12-01

    High performance computing facilities present unique challenges and opportunities for HEP event processing. The massive scale of many HPC systems means that fractionally small utilization can yield large returns in processing throughput. Parallel applications which can dynamically and efficiently fill any scheduling opportunities the resource presents benefit both the facility (maximal utilization) and the (compute-limited) science. The ATLAS Yoda system provides this capability to HEP-like event processing applications by implementing event-level processing in an MPI-based master-client model that integrates seamlessly with the more broadly scoped ATLAS Event Service. Fine grained, event level work assignments are intelligently dispatched to parallel workers to sustain full utilization on all cores, with outputs streamed off to destination object stores in near real time with similarly fine granularity, such that processing can proceed until termination with full utilization. The system offers the efficiency and scheduling flexibility of preemption without requiring the application actually support or employ check-pointing. We will present the new Yoda system, its motivations, architecture, implementation, and applications in ATLAS data processing at several US HPC centers.
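
    The master-client dispatch pattern described above can be sketched with mpi4py as follows (an illustration of the pattern only; the event count, range size, and message contents are assumptions, not the actual Yoda implementation):

        # Run with, e.g.: mpiexec -n 4 python master_client_sketch.py
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        N_EVENTS, CHUNK = 100, 10          # hypothetical event count and range size

        if rank == 0:                       # master: hands out event ranges on request
            next_event, workers_left = 0, comm.Get_size() - 1
            while workers_left:
                ready = comm.recv(source=MPI.ANY_SOURCE)    # a worker asks for work
                if next_event < N_EVENTS:
                    comm.send((next_event, next_event + CHUNK), dest=ready)
                    next_event += CHUNK
                else:
                    comm.send(None, dest=ready)             # no work left: tell it to stop
                    workers_left -= 1
        else:                               # worker: process ranges until told to stop
            while True:
                comm.send(rank, dest=0)                     # request the next event range
                work = comm.recv(source=0)
                if work is None:
                    break
                first, last = work
                # ... process events [first, last) and stream outputs to an object store ...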

  12. An Analysis of the Air Force Government Operated Civil Engineering Supply Store Logistic System: How Can It Be Improved?

    DTIC Science & Technology

    1990-09-01

    Fragments from the table of contents and body: Logistics Systems; GOCESS Operation; Work Order Processing; Job Order Processing. Work orders and job orders to the Material Control Section are discussed separately. Figure 2 illustrates typical work order (WO) processing in a GOCESS operation; job order (JO) processing is similar, and Figure 3 illustrates typical JO processing.

  13. Data processing system for the Sneg-2MP experiment

    NASA Technical Reports Server (NTRS)

    Gavrilova, Y. A.

    1980-01-01

    The data processing system for scientific experiments on stations of the "Prognoz" type provides for the processing sequence to be broken down into a number of consecutive stages: preliminary processing, primary processing, and secondary processing. The tasks of each data processing stage are examined for an experiment designed to study gamma-ray flashes of galactic origin and solar flares, with durations from seconds to several minutes, in the 20 keV to 1000 keV energy range.

  14. Parallel Algorithm for GPU Processing; for use in High Speed Machine Vision Sensing of Cotton Lint Trash.

    PubMed

    Pelletier, Mathew G

    2008-02-08

    One of the main hurdles standing in the way of optimal cleaning of cotton lint is the lack of sensing systems that can react fast enough to provide the control system with real-time information as to the level of trash contamination of the cotton lint. This research examines the use of programmable graphic processing units (GPU) as an alternative to the PC's traditional use of the central processing unit (CPU). The use of the GPU, as an alternative computation platform, allowed the machine vision system to gain a significant improvement in processing time. By improving the processing time, this research seeks to address the lack of availability of rapid trash sensing systems and thus alleviate a situation in which the current systems view the cotton lint either well before, or after, the cotton is cleaned. This extended lag/lead time that is currently imposed on the cotton trash cleaning control systems is responsible for system operators utilizing a very large dead-band safety buffer in order to ensure that the cotton lint is not under-cleaned. Unfortunately, the utilization of a large dead-band buffer results in the majority of the cotton lint being over-cleaned, which in turn causes lint fiber damage as well as significant losses of the valuable lint due to the excessive use of cleaning machinery. This research estimates that upwards of a 30% reduction in lint loss could be gained through the use of a trash sensor tightly coupled to the cleaning machinery control systems. This research seeks to improve processing times through the development of a new algorithm for cotton trash sensing that allows for implementation on a highly parallel architecture. Additionally, by moving the new parallel algorithm onto an alternative computing platform, the graphic processing unit (GPU), for processing of the cotton trash images, a speed-up of over 6.5 times over optimized code running on the PC's central processing unit (CPU) was gained. The new parallel algorithm operating on the GPU was able to process a 1024x1024 image in less than 17 ms. At this improved speed, the image processing system's performance should now be sufficient to provide a system capable of real-time feedback control in tight cooperation with the cleaning equipment.
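
    A per-pixel GPU kernel in the spirit of this approach can be sketched with Numba's CUDA support (illustrative only; the threshold rule, image, and sizes are assumptions, not the published trash-detection algorithm, and a CUDA-capable GPU is required):

        import numpy as np
        from numba import cuda

        @cuda.jit
        def mark_trash(image, threshold, mask):
            # one GPU thread per pixel: dark pixels are flagged as candidate trash
            i, j = cuda.grid(2)
            if i < image.shape[0] and j < image.shape[1]:
                mask[i, j] = 1 if image[i, j] < threshold else 0

        img = np.random.rand(1024, 1024).astype(np.float32)   # stand-in for a lint image
        mask = np.zeros(img.shape, dtype=np.uint8)

        threads = (16, 16)
        blocks = ((img.shape[0] + threads[0] - 1) // threads[0],
                  (img.shape[1] + threads[1] - 1) // threads[1])

        d_img, d_mask = cuda.to_device(img), cuda.to_device(mask)
        mark_trash[blocks, threads](d_img, np.float32(0.2), d_mask)
        trash_fraction = d_mask.copy_to_host().mean()          # fraction of flagged pixels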

  15. Development and evaluation of low cost honey heating-cum-filtration system.

    PubMed

    Alam, Md Shafiq; Sharma, D K; Sehgal, V K; Arora, M; Bhatia, S

    2014-11-01

    A fully mechanized honey heating-cum-filtration system was designed, developed, fabricated and evaluated for its performance. The system comprised two sections: the top heating section and the lower filtering section. The developed system was evaluated for its performance at different process conditions (25 kg and 50 kg capacity, using processing conditions of 50 °C and 60 °C heating temperature with 20 and 40 min holding time, respectively), and it was found that the total time required for heating, holding and filtration of honey was 108 and 142 min for the 25 kg and 50 kg capacities of the machine, respectively, irrespective of the processing conditions. The optimum capacity of the system was found to be 50 kg, and it involved an investment of Rs 40,000 for its fabrication. The honey filtered through the developed filtration system was compared with honey filtered in a high-cost honey processing plant and with raw honey for its microbial and biochemical (reducing sugars (%), moisture, acidity and pH) quality attributes. It was observed that the process of filtering through the developed unit resulted in a reduction of microbes. The microbiological quality of honey filtered through the developed filtration system was better than that of raw honey and commercially processed honey. The treatment conditions found best in the context of microbiological counts were 60 °C temperature for 20 min. There was a 1.97-fold reduction in the plate count and a 2.14-fold reduction in the fungal count of honey processed through the developed filtration system as compared to the raw honey. No coliforms were found in the processed honey. Honey processed through the developed unit had lower moisture content and acidity and more reducing sugars than raw honey, and its quality was comparable to that of the commercially processed honey.

  16. 46 CFR 154.500 - Cargo and process piping standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Equipment Cargo and Process Piping Systems § 154.500 Cargo and process piping standards. The cargo liquid and vapor piping and process piping systems must meet the requirements in §§ 154.503 through 154.562... 46 Shipping 5 2010-10-01 2010-10-01 false Cargo and process piping standards. 154.500 Section 154...

  17. A Guide for Designing and Implementing a Case Processing System for Child Support Enforcement.

    ERIC Educational Resources Information Center

    Office of Child Support Enforcement (DHHS), Washington, DC.

    This document was written to provide state and local child support offices with help in refining their case processing systems. The guide is divided into five chapters. Chapter I, Case Processing Overview, defines case processing and the case processing functions and provides a narrative description and graphic illustration of the processes…

  18. Remediating ethylbenzene-contaminated clayey soil by a surfactant-aided electrokinetic (SAEK) process.

    PubMed

    Yuan, Ching; Weng, Chih-Huang

    2004-10-01

    The objectives of this research are to investigate the remediation efficiency and electrokinetic behavior of ethylbenzene-contaminated clay in a surfactant-aided electrokinetic (SAEK) process under a potential gradient of 2 Vcm(-1). Experimental results indicated that the type of processing fluid played a key role in determining the removal performance of ethylbenzene from clay in the SAEK process. A mixed surfactant system consisting of 0.5% SDS and 2.0% PANNOX 110 showed the best ethylbenzene removal in the SAEK system. The removal efficiency of ethylbenzene was determined to be 63-98% in the SAEK system, while only 40% was achieved in an electrokinetic system with tap water as the processing fluid. It was found that ethylbenzene accumulated in the vicinity of the anode in an electrokinetic system with tap water as the processing fluid. However, the concentration front of ethylbenzene was shifted toward the cathode in the SAEK system. The electroosmotic permeability and power consumption were 0.17 x 10(-6)-3.01 x 10(-6) cm(2)V(-1)s(-1) and 52-123 kW h m(-3), respectively. The cost, including the expense of energy and surfactants, was estimated to be 5.15-12.65 USD m(-3) for the SAEK systems, which was 2.0-4.9 times greater than that of the electrokinetic-alone system (2.6 USD m(-3)). Nevertheless, taking the remediation efficiency of ethylbenzene and the energy expenditure into account in the overall process performance evaluation, the SAEK system was still a cost-effective alternative treatment method.

  19. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

    Introducing workflow management technology into healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated by transferring non-workflow-aware interfaces to standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of a changing process and enhance process management in the department. It can also provide a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more process-management functionality and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology into healthcare.

  20. Advanced information processing system - Status report. [for fault tolerant and damage tolerant data processing for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Lala, J.

    1986-01-01

    The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles. The AIPS architecture also has attributes to enhance system effectiveness such as graceful degradation, growth and change tolerance, integrability, etc. Two key building blocks being developed by the AIPS program are a fault and damage tolerant processor and communication network. A proof-of-concept system is now being built and will be tested to demonstrate the validity and performance of the AIPS concepts.

  1. Aminosilicone solvent recovery methods and systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spiry, Irina Pavlovna; Perry, Robert James; Wood, Benjamin Rue

    The present invention is directed to aminosilicone solvent recovery methods and systems. The methods and systems disclosed herein may be used to recover aminosilicone solvent from a carbon dioxide containing vapor stream, for example, a vapor stream that leaves an aminosilicone solvent desorber apparatus. The methods and systems of the invention utilize a first condensation process at a temperature from about 80 °C to about 150 °C and a second condensation process at a temperature from about 5 °C to about 75 °C. The first condensation process yields recovered aminosilicone solvent. The second condensation process yields water.

  2. Use of a multimission system for cost effective support of planetary science data processing

    NASA Technical Reports Server (NTRS)

    Green, William B.

    1994-01-01

    JPL's Multimission Operations Systems Office (MOSO) provides a multimission facility at JPL for processing science instrument data from NASA's planetary missions. This facility, the Multimission Image Processing System (MIPS), is developed and maintained by MOSO to meet requirements that span the NASA family of planetary missions. Although the word 'image' appears in the title, MIPS is used to process instrument data from a variety of science instruments. This paper describes the design of a new system architecture now being implemented within the MIPS to support future planetary mission activities at significantly reduced operations and maintenance cost.

  3. Nonterrestrial material processing and manufacturing of large space systems

    NASA Technical Reports Server (NTRS)

    Von Tiesenhausen, G.

    1979-01-01

    Nonterrestrial processing of materials and manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space is described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where, together with supplementary terrestrial materials, they will undergo final processing and be fabricated into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, and fabricating facilities, material flow, and manpower requirements are described.

  4. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

    An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system was illustrated. In this system, the experts' experience, theories, typical examples and other related knowledge, which are used in the pre-processing stage of FEA, were categorized into analysis process knowledge and object knowledge. Then, the integrated knowledge model based on object-oriented and rule-based methods was described, and the integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning was presented. Finally, the analysis process of this expert system in a web-based CAE application was illustrated, and an analysis example of a machine tool column was presented to demonstrate the validity of the system.

  5. Buried object detection in GPR images

    DOEpatents

    Paglieroni, David W; Chambers, David H; Bond, Steven W; Beer, W. Reginald

    2014-04-29

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.
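
    One common way to implement the final peak-identification step (a generic sketch, not the patented processing chain) is to flag local maxima of the post-processed energy image that exceed a detection threshold:

        import numpy as np
        from scipy.ndimage import maximum_filter

        def detect_peaks(energy, threshold, neighborhood=5):
            """Return (row, col) indices of local energy maxima above `threshold`."""
            local_max = energy == maximum_filter(energy, size=neighborhood)
            return np.argwhere(local_max & (energy > threshold))

        # Hypothetical post-processed frame with one bright subsurface return.
        frame = np.random.rand(128, 128) * 0.1
        frame[64, 40] = 1.0
        print(detect_peaks(frame, threshold=0.5))   # -> [[64 40]]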

  6. Rethinking the Systems Engineering Process in Light of Design Thinking

    DTIC Science & Technology

    2016-04-30

    systems engineering process models (Blanchard & Fabrycky, 1990) and the majority of engineering design education (Dym et al., 2005). The waterfall model ...Engineering Career Competency Model Clifford Whitcomb, Systems Engineering Professor, NPS Corina White, Systems Engineering Research Associate, NPS...Postgraduate School (NPS) in Monterey, CA. He teaches and conducts research in the design of enterprise systems, systems modeling , and system

  7. Okayama optical polarimetry and spectroscopy system (OOPS) II. Network-transparent control software.

    NASA Astrophysics Data System (ADS)

    Sasaki, T.; Kurakami, T.; Shimizu, Y.; Yutani, M.

    The control system of the OOPS (Okayama Optical Polarimetry and Spectroscopy system) is designed to integrate several instruments whose controllers are distributed over a network: the OOPS instrument, a CCD camera and data acquisition unit, the 91 cm telescope, an autoguider, a weather monitor, and the image display tool SAOimage. With the help of message-based communication, the control processes cooperate with related processes to perform an astronomical observation under the supervising control of a scheduler process. A logger process collects status data from all the instruments and distributes them to related processes upon request. The software structure of each process is described.
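
    The message-based cooperation described above can be sketched generically with Python multiprocessing queues (process names and commands are hypothetical; this is not the OOPS software): a scheduler role dispatches commands to instrument processes while a logger process collects their status messages.

        from multiprocessing import Process, Queue

        def instrument(name, cmd_q, status_q):
            while True:
                cmd = cmd_q.get()
                if cmd == "shutdown":
                    break
                status_q.put((name, "done: " + cmd))      # report completion to the logger

        def logger(status_q, n_messages):
            for _ in range(n_messages):
                print("status:", status_q.get())

        if __name__ == "__main__":
            cmd_qs = {"ccd": Queue(), "telescope": Queue()}
            status_q = Queue()
            procs = [Process(target=instrument, args=(n, q, status_q)) for n, q in cmd_qs.items()]
            procs.append(Process(target=logger, args=(status_q, 2)))
            for p in procs:
                p.start()
            cmd_qs["telescope"].put("point target")        # scheduler role: dispatch commands
            cmd_qs["ccd"].put("expose 30s")
            for q in cmd_qs.values():
                q.put("shutdown")
            for p in procs:
                p.join()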

  8. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
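
    The alarm-filtering idea can be illustrated with a toy knowledge base (hypothetical alarm tags and rules, not the system discussed above): alarms that are explained by an active root-cause alarm are suppressed so the operator sees only the underlying disturbance.

        ACTIVE_ALARMS = {"PUMP_A_TRIP", "FLOW_LOW", "PRESSURE_LOW"}

        # knowledge base: alarm -> alarms it explains (and can therefore suppress)
        CAUSES = {"PUMP_A_TRIP": {"FLOW_LOW", "PRESSURE_LOW"}}

        def filter_alarms(active, causes):
            suppressed = set()
            for alarm in active:
                suppressed |= causes.get(alarm, set()) & active
            return active - suppressed

        print(filter_alarms(ACTIVE_ALARMS, CAUSES))   # -> {'PUMP_A_TRIP'}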

  9. An intelligent advisory system for pre-launch processing

    NASA Technical Reports Server (NTRS)

    Engrand, Peter A.; Mitchell, Tami

    1991-01-01

    The shuttle system of interest in this paper is the shuttle's data processing system (DPS). The DPS is composed of the following: (1) general purpose computers (GPC); (2) a multifunction CRT display system (MCDS); (3) mass memory units (MMU); and (4) a multiplexer/demultiplexer (MDM) and related software. In order to ensure the correct functioning of shuttle systems, some level of automatic error detection has been incorporated into all shuttle systems. For the DPS, error detection equipment has been incorporated into all of its subsystems. The automated diagnostic system, the MCDS diagnostic tool, which aids in more efficient processing of the DPS, is described.

  10. Data Entities and Information System Matrix for Integrated Agriculture Information System (IAIS)

    NASA Astrophysics Data System (ADS)

    Budi Santoso, Halim; Delima, Rosa

    2018-03-01

    The Integrated Agriculture Information System is a system developed to process data, information, and knowledge in the agriculture sector. The Integrated Agriculture Information System provides valuable information for farmers: (1) fertilizer prices; (2) agricultural techniques and practices; (3) pest management; (4) cultivation; (5) irrigation; (6) post-harvest processing; (7) innovation in agricultural processing. The Integrated Agriculture Information System contains 9 subsystems. To bring integrated information to users and stakeholders, it needs an integrated database approach. Thus, the researchers describe the data entities and their matrix in relation to the subsystems of the Integrated Agriculture Information System (IAIS). As a result, there are 47 data entities in the single, integrated database.

  11. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  12. Real-time optical fiber digital speckle pattern interferometry for industrial applications

    NASA Astrophysics Data System (ADS)

    Chan, Robert K.; Cheung, Y. M.; Lo, C. H.; Tam, T. K.

    1997-03-01

    There is current interest, especially in the industrial sector, in using the digital speckle pattern interferometry (DSPI) technique to measure surface stress. Indeed, the many publications on the subject are evidence of the growing interest in the field. However, bringing the technology to industrial use requires the integration of several emerging technologies, viz. optics, feedback control, electronics, image processing and digital signal processing. Due to the highly interdisciplinary nature of the technique, successful implementation and development require expertise in all of these fields. At Baptist University, under the funding of a major industrial grant, we are developing the technology for the industrial sector. Our system fully exploits optical fibers and diode lasers in the design to enable practical and rugged systems suited for industrial applications. Besides the development in optics, we have broken away from reliance on a microcomputer PC platform for both image capture and processing, and have developed a digital signal processing array system that can handle simultaneous and independent image capture/processing with feedback control. The system, named CASPA for 'cascadable architecture signal processing array,' is a third-generation development system that utilizes up to 7 digital signal processors and has proved to be very powerful. With CASPA we are now in a better position to develop novel optical measurement systems for industrial applications that may require different measurement systems to operate concurrently and to exchange information. Applications we have in mind, such as simultaneous in-plane and out-of-plane DSPI image capture and processing, vibration analysis with interactive DSPI, and phase-shifting control of optical systems, are a few good examples of the potential.

  13. Optimization of insect cell based protein production processes - online monitoring, expression systems, scale up.

    PubMed

    Druzinec, Damir; Salzig, Denise; Brix, Alexander; Kraume, Matthias; Vilcinskas, Andreas; Kollewe, Christian; Czermak, Peter

    2013-01-01

    Due to the increasing use of insect cell based expression systems in research and industrial recombinant protein production, the development of efficient and reproducible production processes remains a challenging task. In this context, the application of online monitoring techniques is intended to ensure high and reproducible product quality already during the early phases of process development. In the following chapter, the most common transient and stable insect cell based expression systems are briefly introduced. Novel applications of insect cell based expression systems for the production of insect-derived antimicrobial peptides/proteins (AMPs) are discussed using the example of G. mellonella derived gloverin. Suitable in situ sensor techniques for insect cell culture monitoring in disposable and common bioreactor systems are outlined with respect to optical and capacitive sensor concepts. Since scale-up of production processes is one of the most critical steps in process development, a concluding overview of scale-up aspects for industrial insect cell culture processes is given.

  14. User's guide to image processing applications of the NOAA satellite HRPT/AVHRR data. Part 1: Introduction to the satellite system and its applications. Part 2: Processing and analysis of AVHRR imagery

    NASA Technical Reports Server (NTRS)

    Huh, Oscar Karl; Leibowitz, Scott G.; Dirosa, Donald; Hill, John M.

    1986-01-01

    The use of NOAA Advanced Very High Resolution Radiometer/High Resolution Picture Transmission (AVHRR/HRPT) imagery for earth resource applications is provided for the applications scientist for use within the various Earth science, resource, and agricultural disciplines. A guide to processing NOAA AVHRR data using the hardware and software systems integrated for this NASA project is provided. The processing steps from raw data on computer compatible tapes (1B data format) through usable qualitative and quantitative products for applications are given. The manual is divided into two parts. The first section describes the NOAA satellite system, its sensors, and the theoretical basis for using these data for environmental applications. Part 2 is a hands-on description of how to use a specific image processing system, the International Imaging Systems, Inc. (I2S) Model 75 Array Processor and S575 software, to process these data.

  15. An Adaptive Technique for a Redundant-Sensor Navigation System. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chien, T. T.

    1972-01-01

    An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with the capability to utilize its full potential in reliability and performance. The gyro navigation system is modeled as a Gauss-Markov process, with degradation modes defined as changes in characteristics specified by parameters associated with the model. The adaptive system is formulated as a multistage stochastic process: (1) a detection system, (2) an identification system and (3) a compensation system. It is shown that the sufficient statistic for the partially observable process in the detection and identification system is the posterior measure of the state of degradation, conditioned on the measurement history.
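
    In generic notation (not the thesis's own symbols), the posterior measure over the modeled degradation modes \( H_i \), conditioned on the measurement history \( z_{1:k} \), is propagated recursively by Bayes' rule:

        \[ \pi_k(H_i) \;=\; \frac{\pi_{k-1}(H_i)\, p(z_k \mid H_i, z_{1:k-1})}{\sum_j \pi_{k-1}(H_j)\, p(z_k \mid H_j, z_{1:k-1})}, \]

    and the detection and identification stages act on this posterior as their sufficient statistic.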

  16. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  17. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  18. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  19. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  20. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  1. Microchemical Systems for Fuel Processing and Conversion to Electrical Power

    DTIC Science & Technology

    2007-03-15

    Final report fragments (table-of-contents entries only): MURI Microchemical Systems for Fuel Processing and Conversion to Electrical Power; Section 8.7, Development of Large Free-Standing Electrolyte-Supported Micro Fuel Cell Membranes.

  2. 23 CFR 971.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS FOREST SERVICE... maintain the management systems and their associated databases; and (5) A process for data collection, processing, analysis, and updating for each management system. (c) All management systems will use databases...

  3. 23 CFR 970.204 - Management systems requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL LANDS HIGHWAYS NATIONAL PARK... the management systems and their associated databases; and (5) A process for data collection, processing, analysis and updating for each management system. (d) All management systems will use databases...

  4. "Chemical transformers" from nanoparticle ensembles operated with logic.

    PubMed

    Motornov, Mikhail; Zhou, Jian; Pita, Marcos; Gopishetty, Venkateshwarlu; Tokarev, Ihor; Katz, Evgeny; Minko, Sergiy

    2008-09-01

    The pH-responsive nanoparticles were coupled with information-processing enzyme-based systems to yield "smart" signal-responsive hybrid systems with built-in Boolean logic. The enzyme systems performed AND/OR logic operations, transducing biochemical input signals into reversible structural changes (signal-directed self-assembly) of the nanoparticle assemblies, thus resulting in the processing and amplification of the biochemical signals. The hybrid system mimics biological systems in effective processing of complex biochemical information, resulting in reversible changes of the self-assembled structures of the nanoparticles. The bioinspired approach to the nanostructured morphing materials could be used in future self-assembled molecular robotic systems.

  5. Multi-kilowatt modularized spacecraft power processing system development

    NASA Technical Reports Server (NTRS)

    Andrews, R. E.; Hayden, J. H.; Hedges, R. T.; Rehmann, D. W.

    1975-01-01

    A review of existing information pertaining to spacecraft power processing systems and equipment was accomplished with a view towards applicability to the modularization of multi-kilowatt power processors. Power requirements for future spacecraft were determined from the NASA mission model-shuttle systems payload data study which provided the limits for modular power equipment capabilities. Three power processing systems were compared to evaluation criteria to select the system best suited for modularity. The shunt regulated direct energy transfer system was selected by this analysis for a conceptual design effort which produced equipment specifications, schematics, envelope drawings, and power module configurations.

  6. Support system, excavation arrangement, and process of supporting an object

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, Bill W.

    2017-08-01

    A support system, an excavation arrangement, and a process of supporting an object are disclosed. The support system includes a weight-bearing device and a camming mechanism positioned below the weight-bearing device. A downward force on the weight-bearing device at least partially secures the camming mechanism to opposing surfaces. The excavation arrangement includes a borehole, a support system positioned within and secured to the borehole, and an object positioned on and supported by the support system. The process includes positioning and securing the support system and positioning the object on the weight-bearing device.

  7. Multiple-state quantum Otto engine, 1D box system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Latifah, E., E-mail: enylatifah@um.ac.id; Purwanto, A.

    2014-03-24

    Quantum heat engines produce work using quantum matter as their working substance. We studied the adiabatic and isochoric processes and defined the generalized force for the quantum system. The processes and the generalized force are used to evaluate a quantum Otto engine based on multiple states of a one-dimensional box system and to calculate its efficiency. As a result, the efficiency depends on the ratio of the initial and final widths of the system under the adiabatic processes.
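
    For reference, the standard single-state result for a quantum Otto cycle on a particle in a one-dimensional box (the paper's multiple-state generalization is not reproduced here) follows directly from the box energy levels:

        \[ E_n(L) \;=\; \frac{n^{2}\pi^{2}\hbar^{2}}{2 m L^{2}}, \qquad \eta \;=\; 1 - \frac{E_n(L_2)}{E_n(L_1)} \;=\; 1 - \left(\frac{L_1}{L_2}\right)^{2}, \]

    where \( L_1 < L_2 \) are the widths reached during the adiabatic strokes, consistent with the abstract's statement that the efficiency depends on the ratio of the initial and final widths.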

  8. Enhanced Training by a Systemic Governance of Force Capabilities, Tasks, and Processes

    DTIC Science & Technology

    2013-06-01

    Presented at the 18th ICCRTS, "C2 in Underdeveloped, Degraded and Denied Operational Environments." This paper presents a Systemic Governance of Capabilities, Tasks, and Processes applied to assess, evaluate and accredit the Swedish forces.

  9. Integration of laboratory and process testing data

    PubMed Central

    Tyszkiewicz, Michael

    1995-01-01

    The author describes ACS Inc.'s Pro-LIMS system, which integrates laboratory and process procedures. The system has been shown to be an important tool for quality assurance in the process manufacturing industry. PMID:18924782

  10. Implementation of a Web-Based Collaborative Process Planning System

    NASA Astrophysics Data System (ADS)

    Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi

    Under the networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining and assembly, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be completed collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, shared resources provided by cooperating enterprises in the course of collaboration are classified into seven classes. A reconfigurable and extendable resource object model is then built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.

  11. The Design, Development and Testing of a Multi-process Real-time Software System

    DTIC Science & Technology

    2007-03-01

    programming large systems stems from the complexity of dealing with many different details at one time. A sound engineering approach is to break...controls and 3) is portable to other OS platforms such as Microsoft Windows. Next, to reduce the complexity of the programming tasks, the system...processes depending on how often the process has to check to see if common data was modified. A good method for one process to quickly notify another

  12. System Engineering Processes at Kennedy Space Center for Development of SLS and Orion Launch Systems

    NASA Technical Reports Server (NTRS)

    Schafer, Eric; Stambolian, Damon; Henderson, Gena

    2013-01-01

    There are over 40 subsystems being developed for the future SLS and Orion launch systems at Kennedy Space Center. These subsystems are developed at the Kennedy Space Center Engineering Directorate. The Engineering Directorate at Kennedy Space Center follows a comprehensive design process which requires several different product deliverables during each phase of each of the subsystems. This presentation describes the process with examples of where it has been applied.

  13. Learning Is the Journey: From Process Reengineering to Systemic Customer-Service Design at the United States Department of Veterans Affairs, Veterans Benefits Administration

    DTIC Science & Technology

    2013-05-23

    This monograph borrows from multiple disciplines to argue for an organizational shift from process reengineering to system design to improve...government customer-service delivery. Specifically, the monograph proposes a transformation in claims processing within the Veterans Benefits Administration...required. The proposed system design is an attempt to place the disability claims process within a larger environment encompassing multiple dimensions of customers.

  14. DoD Lead System Integrator (LSI) Transformation - Creating a Model Based Acquisition Framework (MBAF)

    DTIC Science & Technology

    2014-04-30

    cost to acquire systems as design maturity could be verified incrementally as the system was developed vice waiting for specific large “ big bang ...Framework (MBAF) be applied to simulate or optimize process variations on programs? LSI Roles and Responsibilities A review of the roles and...the model/process optimization process. It is the current intent that NAVAIR will use the model to run simulations on process changes in an attempt to

  15. Parallel asynchronous systems and image processing algorithms

    NASA Technical Reports Server (NTRS)

    Coon, D. D.; Perera, A. G. U.

    1989-01-01

    A new hardware approach to the implementation of image processing algorithms is described. The approach is based on silicon devices which would permit an independent analog processing channel to be dedicated to every pixel. A laminar architecture consisting of a stack of planar arrays of the device would form a two-dimensional array processor with a 2-D array of inputs located directly behind a focal plane detector array. A 2-D image data stream would propagate in neuronlike asynchronous pulse-coded form through the laminar processor. Such systems would integrate image acquisition and image processing. Acquisition and processing would be performed concurrently as in natural vision systems. The research is aimed at the implementation of algorithms, such as the intensity-dependent summation algorithm and pyramid processing structures, which are motivated by the operation of natural vision systems. Implementation of natural vision algorithms would benefit from the use of neuronlike information coding and the laminar, 2-D parallel, vision-system-type architecture. Besides providing a neural network framework for the implementation of natural vision algorithms, a 2-D parallel approach could eliminate the serial bottleneck of conventional processing systems. Conversion to serial format would occur only after the raw intensity data has been substantially processed. An interesting challenge arises from the fact that the mathematical formulation of natural vision algorithms does not specify the means of implementation, so that hardware implementation poses intriguing questions involving vision science.

  16. Dual processing model of medical decision-making.

    PubMed

    Djulbegovic, Benjamin; Hozo, Iztok; Beckstead, Jason; Tsalatsanis, Athanasios; Pauker, Stephen G

    2012-09-03

    Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date, no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administered to a patient who may or may not have a disease. We developed a mathematical model in which we linked a recently proposed descriptive psychological model of cognition with the threshold model of medical decision-making and show how this approach can be used to better understand decision-making at the bedside and explain the widespread variation in treatments observed in clinical practice. We show that a physician's belief about whether to treat at higher (lower) probability levels, compared to the prescriptive therapeutic thresholds obtained via system II processing, is moderated by system I and by the ratio of benefits and harms as evaluated by both systems I and II. Under some conditions, the system I decision maker's threshold may drop dramatically below the expected utility threshold derived by system II. This can explain the overtreatment often seen in contemporary practice. The opposite can also occur, as in situations where empirical evidence is considered unreliable or when the cognitive processes of decision-makers are biased through recent experience: the threshold will increase relative to the normative threshold value derived via system II using the expected utility threshold. This inclination toward higher diagnostic certainty may, in turn, explain the undertreatment that is also documented in current medical practice. We have developed the first dual processing model of medical decision-making, which has the potential to enrich the medical decision-making field, which is still to a large extent dominated by expected utility theory. The model also provides a platform for reconciling two groups of competing dual processing theories (parallel competitive versus default-interventionist theories).
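
    For context, the classical expected-utility (system II) treatment threshold that the model builds on can be written as

        \[ p_t \;=\; \frac{H}{B + H}, \]

    where \( B \) is the net benefit of treating a patient who has the disease and \( H \) is the net harm of treating a patient who does not; treatment is indicated when the probability of disease exceeds \( p_t \). The paper's system I terms shift the effective threshold above or below this value (the specific functional form is not reproduced here).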

  17. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  18. 42 CFR 433.112 - FFP for design, development, installation or enhancement of mechanized claims processing and...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... enhancement of mechanized claims processing and information retrieval systems. 433.112 Section 433.112 Public... processing and information retrieval systems. (a) FFP is available at the 90 percent rate in State... information retrieval system only if the APD is approved by CMS prior to the State's expenditure of funds for...

  19. 77 FR 38866 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ... supra note 6. System Processing Fee Under Section 4(b)(6) of Schedule A, FINRA currently charges an annual $30 system processing fee for each member's registered individuals. FINRA is proposing to increase the system processing fee to $45. This fee has not been increased since January 2000.\\11\\ Since 2000...

  20. Development of techniques for processing metal-metal oxide systems

    NASA Technical Reports Server (NTRS)

    Johnson, P. C.

    1976-01-01

    Techniques for producing model metal-metal oxide systems for the purpose of evaluating the results of processing such systems in the low-gravity environment afforded by a drop tower facility are described. Because of the lack of success in producing suitable materials samples and techniques for processing in the 3.5 seconds available, the program was discontinued.

  1. Proposed Computer System for Library Catalog Maintenance. Part II: System Design.

    ERIC Educational Resources Information Center

    Stein (Theodore) Co., New York, NY.

    The logic of the system presented in this report is divided into six parts for computer processing and manipulation. They are: (1) processing of Library of Congress copy, (2) editing of input into standard format, (3) processing of information into and out from the authority files, (4) creation of the catalog records, (5) production of the…

  2. Developing a Mobile Application "Educational Process Remote Management System" on the Android Operating System

    ERIC Educational Resources Information Center

    Abildinova, Gulmira M.; Alzhanov, Aitugan K.; Ospanova, Nazira N.; Taybaldieva, Zhymatay; Baigojanova, Dametken S.; Pashovkin, Nikita O.

    2016-01-01

    Nowadays, when there is a need to introduce various innovations into the educational process, most efforts are aimed at simplifying the learning process. To that end, electronic textbooks, testing systems and other software is being developed. Most of them are intended to run on personal computers with limited mobility. Smart education is…

  3. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  4. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  5. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  6. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  7. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  8. Validation, Edits, and Application Processing System Report: Phase I.

    ERIC Educational Resources Information Center

    Gray, Susan; And Others

    Findings of phase 1 of a study of the 1979-1980 Basic Educational Opportunity Grants validation, edits, and application processing system are presented. The study was designed to: assess the impact of the validation effort and processing system edits on the correct award of Basic Grants; and assess the characteristics of students most likely to…

  9. A system of automated processing of deep water hydrological information

    NASA Technical Reports Server (NTRS)

    Romantsov, V. A.; Dyubkin, I. A.; Klyukbin, L. N.

    1974-01-01

    An automated system for primary and scientific analysis of deep water hydrological information is presented. Primary processing of the data in this system is carried out on a drifting station, which also calculates the parameters of vertical stability of the sea layers, as well as their depths and altitudes. Methods of processing the raw data are described.

  10. A situation-response model for intelligent pilot aiding

    NASA Technical Reports Server (NTRS)

    Schudy, Robert; Corker, Kevin

    1987-01-01

    An intelligent pilot aiding system needs models of the pilot information processing to provide the computational basis for successful cooperation between the pilot and the aiding system. By combining artificial intelligence concepts with the human information processing model of Rasmussen, an abstraction hierarchy of states of knowledge, processing functions, and shortcuts are developed, which is useful for characterizing the information processing both of the pilot and of the aiding system. This approach is used in the conceptual design of a real time intelligent aiding system for flight crews of transport aircraft. One promising result was the tentative identification of a particular class of information processing shortcuts, from situation characterizations to appropriate responses, as the most important reliable pathway for dealing with complex time critical situations.

  11. Facilitating preemptive hardware system design using partial reconfiguration techniques.

    PubMed

    Dondo Gazzano, Julio; Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos

    2014-01-01

    In FPGA-based control system design, partial reconfiguration is especially well suited to implementing preemptive systems. In real-time systems, the deadline of a critical task can compel the preemption of a noncritical one. Besides, an asynchronous event can demand immediate attention and thus force the launch of a reconfiguration process for high-priority task implementation. If the asynchronous event is previously scheduled, an explicit activation of the reconfiguration process is performed. If the event cannot be previously programmed, such as in dynamically scheduled systems, an implicit activation of the reconfiguration process is required. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the necessary tasks to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and hence the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration.

  12. Facilitating Preemptive Hardware System Design Using Partial Reconfiguration Techniques

    PubMed Central

    Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos

    2014-01-01

    In FPGA-based control system design, partial reconfiguration is especially well suited to implementing preemptive systems. In real-time systems, the deadline of a critical task can compel the preemption of a noncritical one. Besides, an asynchronous event can demand immediate attention and thus force the launch of a reconfiguration process for high-priority task implementation. If the asynchronous event is previously scheduled, an explicit activation of the reconfiguration process is performed. If the event cannot be previously programmed, such as in dynamically scheduled systems, an implicit activation of the reconfiguration process is required. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the necessary tasks to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and hence the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration. PMID:24672292

  13. Employee Engagement Is Vital for the Successful Selection of a Total Laboratory Automation System.

    PubMed

    Yu, Hoi-Ying E; Wilkerson, Myra L

    2017-11-08

    To concretely outline a process for selecting a total laboratory automation system that connects clinical chemistry, hematology, and coagulation analyzers and to serve as a reference for other laboratories. In Phase I, a committee including the laboratory's directors and technologists conducted a review of 5 systems based on a formal request-for-information process, site visits, and vendor presentations. We developed evaluation criteria and selected the 2 highest-performing systems. In Phase II, we executed a detailed comparison of the 2 vendors based on cost, instrument layout, workflow design, and future potential. In addition to selecting a laboratory automation system, we used the process to ensure employee engagement in preparation for implementation. Selecting a total laboratory automation system is a complicated process. This paper provides a practical guide to how a thorough selection process can be carried out with the participation of key stakeholders.

  14. Attitude determination of a high altitude balloon system. Part 2: Development of the parameter determination process

    NASA Technical Reports Server (NTRS)

    Nigro, N. J.; Elkouh, A. F.

    1975-01-01

    The attitude of the balloon system is determined as a function of time if: (a) a method for simulating the motion of the system is available, and (b) the initial state is known. The initial state is obtained by fitting the system motion (as measured by sensors) to the corresponding output predicted by the mathematical model. In the case of the LACATE experiment, the sensors consisted of three orthogonally oriented rate gyros and a magnetometer, all mounted on the research platform. The initial state was obtained by fitting the angular velocity components measured with the gyros to the corresponding values obtained from the solution of the math model. A block diagram illustrating the attitude determination process employed for the LACATE experiment is shown. The process consists of three essential parts: a process for simulating the balloon system, an instrumentation system for measuring the output, and a parameter estimation process for systematically and efficiently solving for the initial state. Results are presented and discussed.
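
    The fitting step can be illustrated with a self-contained toy example (a simple damped-oscillation model and synthetic "measurements"; not the LACATE math model or data): the initial state is chosen so that the simulated angular rate best matches the measured rate in the least-squares sense.

        import numpy as np
        from scipy.optimize import least_squares

        t = np.linspace(0.0, 20.0, 200)

        def simulate(x0, t, damping=0.1, freq=0.8):
            """Toy model: angular rate from initial amplitude x0[0] and phase x0[1]."""
            return x0[0] * np.exp(-damping * t) * np.cos(freq * t + x0[1])

        true_x0 = np.array([0.5, 0.3])
        measured = simulate(true_x0, t) + 0.01 * np.random.randn(t.size)  # gyro-like data

        def residuals(x0):
            return simulate(x0, t) - measured

        fit = least_squares(residuals, x0=np.array([1.0, 0.0]))
        print(fit.x)    # estimated initial state, close to [0.5, 0.3]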

  15. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    PubMed

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular and the Home Energy Management System (HEMS) plays an important role in saving energy without a decrease in QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern matching algorithm for IF-THEN rules. We have previously proposed a rule-based Home Energy Management System (HEMS) using the Rete algorithm. In the proposed system, rules for managing energy are processed by smart taps in a network, and the loads for processing rules and collecting data are distributed to the smart taps. In addition, the amount of processing and data collection is reduced by processing rules based on the Rete algorithm. In this paper, we evaluated the proposed system by simulation. In the simulation environment, rules are processed by the smart tap that relates to the action part of each rule. In addition, we implemented the proposed system as a HEMS using smart taps.
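
    For illustration, the sketch below evaluates a couple of hypothetical IF-THEN energy-management rules against a working memory of sensor facts. It is a plain forward-chaining pass, not a full Rete network (which would additionally cache partial matches in alpha/beta memories); the rule names and fact keys are invented.

        # Simplified sketch of IF-THEN rule processing over sensor facts.
        # Each rule is a condition/action pair evaluated against the current
        # working memory of the home.
        rules = [
            {"name": "dim_lights",
             "condition": lambda wm: wm["room_occupancy"] == 0 and wm["light_power_w"] > 0,
             "action":    lambda wm: wm.update(light_power_w=0)},
            {"name": "raise_ac_setpoint",
             "condition": lambda wm: wm["room_occupancy"] == 0 and wm["ac_setpoint_c"] < 28,
             "action":    lambda wm: wm.update(ac_setpoint_c=28)},
        ]

        working_memory = {"room_occupancy": 0, "light_power_w": 60, "ac_setpoint_c": 24}

        # Fire every rule whose condition currently matches (one pass).
        for rule in rules:
            if rule["condition"](working_memory):
                rule["action"](working_memory)
                print("fired:", rule["name"])

        print(working_memory)  # {'room_occupancy': 0, 'light_power_w': 0, 'ac_setpoint_c': 28}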

  16. MIRADS-2 Implementation Manual

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The Marshall Information Retrieval and Display System (MIRADS), a data base management system designed to provide the user with a set of generalized file capabilities, is presented. The system provides a wide variety of ways to process the contents of the data base and includes capabilities to search, sort, compute, update, and display the data. The process of creating, defining, and loading a data base is generally called the loading process. The steps in the loading process, which include (1) structuring, (2) creating, (3) defining, and (4) implementing the data base for use by MIRADS, are defined. The execution of several computer programs is required to successfully complete all steps of the loading process. As the first step in MIRADS implementation, the MIRADS library must be established as a cataloged mass storage file; the procedure for establishing the MIRADS Library is given. The system is currently operational for the UNIVAC 1108 computer system utilizing the Executive Operating System. All procedures relate to the use of MIRADS on the U-1108 computer.

  17. A service based adaptive U-learning system using UX.

    PubMed

    Jeong, Hwa-Young; Yi, Gangman

    2014-01-01

    In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services, while ubiquitous computing can provide system operation and management via a high-performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning unit, using services in a ubiquitous computing environment. We also investigate functions that support users' tailored materials according to their learning style. That is, we analyzed the users' data and their characteristics in accordance with their user experience. We subsequently applied the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques.

  18. A Service Based Adaptive U-Learning System Using UX

    PubMed Central

    Jeong, Hwa-Young

    2014-01-01

    In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services, while ubiquitous computing can provide system operation and management via a high-performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning unit, using services in a ubiquitous computing environment. We also investigate functions that support users' tailored materials according to their learning style. That is, we analyzed the users' data and their characteristics in accordance with their user experience. We subsequently applied the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system delivers better learning effects to learners than existing techniques. PMID:25147832

  19. Grounding explanations in evolving, diagnostic situations

    NASA Technical Reports Server (NTRS)

    Johannesen, Leila J.; Cook, Richard I.; Woods, David D.

    1994-01-01

    Certain fields of practice involve the management and control of complex dynamic systems. These include flight deck operations in commercial aviation, control of space systems, anesthetic management during surgery, and chemical or nuclear process control. Fault diagnosis of these dynamic systems generally must occur with the monitored process on-line and in conjunction with maintaining system integrity. This research seeks to understand in more detail what it means for an intelligent system to function cooperatively, or as a 'team player', in complex, dynamic environments. The approach taken was to study human practitioners engaged in the management of a complex, dynamic process: anesthesiologists during neurosurgical operations. The investigation focused on understanding how team members cooperate in management and fault diagnosis and on comparing this interaction to the situation with an Artificial Intelligence (AI) system that provides diagnoses and explanations. Of particular concern was studying the ways in which practitioners support one another in keeping aware of relevant information concerning the state of the monitored process and of the problem solving process.

  20. Integration process of fermentation and liquid biphasic flotation for lipase separation from Burkholderia cepacia.

    PubMed

    Sankaran, Revathy; Show, Pau Loke; Lee, Sze Ying; Yap, Yee Jiun; Ling, Tau Chuan

    2018-02-01

    Liquid Biphasic Flotation (LBF) is an advanced recovery method that has been effectively applied to biomolecule extraction. The objective of this investigation is to integrate the fermentation and extraction of lipase from Burkholderia cepacia using a flotation system. An initial study was conducted to compare bacterial growth and lipase production in the flotation system and a shaker system. The bacteria showed quicker growth and a higher lipase yield in the flotation system. The integrated process for lipase separation was then investigated and showed a high separation efficiency of 92.29% and a yield of 95.73%. Upscaling of the flotation system gave results consistent with the lab scale: 89.53% efficiency and 93.82% yield. The combination of upstream and downstream processes in a single system accelerates product formation, improves product yield, and facilitates downstream processing. This integrated system demonstrated its potential for biomolecule fermentation and separation, which could open new opportunities for industrial production. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Scheduling algorithms for automatic control systems for technological processes

    NASA Astrophysics Data System (ADS)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    The wide use of automatic process control systems and of high-performance systems containing a number of computers (processors) creates opportunities for high-quality, fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computations, and the processing of big data arrays all require a high level of productivity and, at the same time, minimum time for data handling and obtaining results. In order to reach the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. Some basic task scheduling methods for multi-machine process control systems are considered in this paper, their advantages and disadvantages are identified, and some considerations for their use when developing software for automatic process control systems are given.
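
    As one illustration of such a scheduling technique (not taken from the paper), the sketch below implements the classic longest-processing-time list-scheduling heuristic: tasks are considered in decreasing order of processing time and each is assigned to the currently least-loaded machine.

        # Illustrative longest-processing-time (LPT) list scheduling for the
        # multi-machine scheduling problem discussed above.
        import heapq

        def lpt_schedule(durations, n_machines):
            # Min-heap of (current load, machine id).
            machines = [(0.0, m) for m in range(n_machines)]
            heapq.heapify(machines)
            assignment = {}
            for task, d in sorted(enumerate(durations), key=lambda x: -x[1]):
                load, m = heapq.heappop(machines)
                assignment[task] = m
                heapq.heappush(machines, (load + d, m))
            makespan = max(load for load, _ in machines)
            return assignment, makespan

        tasks = [7, 5, 4, 3, 3, 2]          # processing times
        print(lpt_schedule(tasks, n_machines=3))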

  2. Cooperative storage of shared files in a parallel computing system with dynamic block size

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
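
    The block-size rule quoted above (the total amount of data divided by the number of parallel processes) can be illustrated with the small sketch below; the function names are invented and the actual data exchange between processes is only indicated, since the patented method operates on a parallel file system such as PLFS.

        # Sketch of the dynamic block-size rule: each process's block is the
        # total amount of data divided by the number of parallel processes.
        # The byte exchange that equalizes the blocks is only simulated here.
        def dynamic_block_size(per_process_bytes):
            total = sum(per_process_bytes)
            n = len(per_process_bytes)
            return total // n

        def plan_exchange(per_process_bytes):
            block = dynamic_block_size(per_process_bytes)
            # Positive surplus -> bytes this process must send to peers;
            # negative -> bytes it must receive before writing its block.
            return block, [have - block for have in per_process_bytes]

        sizes = [120, 80, 40, 160]              # bytes generated by 4 processes
        block, surplus = plan_exchange(sizes)
        print(block, surplus)                   # 100 [20, -20, -60, 60]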

  3. Firmware Development Improves System Efficiency

    NASA Technical Reports Server (NTRS)

    Chern, E. James; Butler, David W.

    1993-01-01

    Most manufacturing processes require physical pointwise positioning of components or tools from one location to another. Typical mechanical systems use either stop-and-go or fixed feed-rate progression to accomplish the task. The first approach achieves positional accuracy but prolongs overall time and increases wear on the mechanical system. The second approach sustains throughput but compromises positional accuracy. A computer firmware approach has been developed to optimize this pointwise mechanism by utilizing programmable interrupt controls to synchronize engineering processes 'on the fly'. This principle has been implemented in an eddy current imaging system to demonstrate the improvement. Software programs were developed that enable a mechanical controller card to transmit interrupts to a system controller as a trigger signal to initiate an eddy current data acquisition routine. The advantages are: (1) optimized manufacturing processes, (2) increased throughput of the system, (3) improved positional accuracy, and (4) reduced wear and tear on the mechanical system.
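
    A toy model of the idea, not the actual firmware: the stage moves at a constant feed rate while simulated position-crossing interrupts fire the acquisition callback on the fly, so motion never stops at the measurement points. All names and parameter values are illustrative.

        # Toy model of interrupt-synchronized acquisition: the stage moves at a
        # fixed feed rate and a position-crossing "interrupt" triggers the
        # data-acquisition callback without stopping the motion.
        def acquire_eddy_current_sample(position):
            print(f"acquired sample at x = {position:.2f} mm")

        def run_scan(scan_length_mm, step_mm, feed_mm_per_tick):
            next_trigger = 0.0
            position = 0.0
            while position <= scan_length_mm:
                if position >= next_trigger:            # simulated interrupt
                    acquire_eddy_current_sample(position)
                    next_trigger += step_mm
                position += feed_mm_per_tick            # motion never stops

        run_scan(scan_length_mm=5.0, step_mm=1.0, feed_mm_per_tick=0.25)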

  4. A digital signal processing system for coherent laser radar

    NASA Technical Reports Server (NTRS)

    Hampton, Diana M.; Jones, William D.; Rothermel, Jeffry

    1991-01-01

    A data processing system for use with continuous-wave lidar is described in terms of its configuration and performance during the second survey mission of NASA's Global Backscatter Experiment. The system is designed to estimate a complete lidar spectrum in real time, record the data from two lidars, and monitor variables related to the lidar operating environment. The PC-based system includes a transient capture board, a digital signal processing (DSP) board, and a low-speed data-acquisition board. Both unprocessed and processed lidar spectrum data are monitored in real time, and the results are compared to those of a previous non-DSP-based system. Because the DSP-based system is digital, it is slower than the surface-acoustic-wave signal processor, collecting 2500 spectra/s. However, the DSP-based system provides complete data sets at two wavelengths from the continuous-wave lidars.
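
    The core DSP step, estimating a power spectrum from blocks of the digitized return signal, can be sketched as below. The sample rate, block length, and test signal are illustrative values, not those of the survey instrument.

        # Minimal sketch of averaged power-spectrum estimation from blocks of a
        # digitized lidar return signal.
        import numpy as np

        def average_power_spectrum(signal, block_size):
            n_blocks = len(signal) // block_size
            blocks = signal[:n_blocks * block_size].reshape(n_blocks, block_size)
            window = np.hanning(block_size)
            spectra = np.abs(np.fft.rfft(blocks * window, axis=1)) ** 2
            return spectra.mean(axis=0)

        fs = 1.0e6                               # 1 MHz sampling (illustrative)
        t = np.arange(2 ** 16) / fs
        signal = np.sin(2 * np.pi * 150e3 * t) + 0.5 * np.random.randn(t.size)
        spectrum = average_power_spectrum(signal, block_size=1024)
        print("peak bin:", spectrum.argmax(), "of", spectrum.size)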

  5. Computer Sciences and Data Systems, volume 1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.

  6. 42 CFR 431.806 - State plan requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... processing assessment system. Except in a State that has an approved Medicaid Management Information System... Medicaid quality control claims processing assessment system that meets the requirements of §§ 431.830...

  7. A conceptual framework for intelligent real-time information processing

    NASA Technical Reports Server (NTRS)

    Schudy, Robert

    1987-01-01

    By combining artificial intelligence concepts with the human information processing model of Rasmussen, a conceptual framework was developed for real-time artificial intelligence systems which provides a foundation for system organization, control, and validation. The approach is based on the description of system processing in terms of an abstraction hierarchy of states of knowledge. The states of knowledge are organized along one dimension which corresponds to the extent to which the concepts are expressed in terms of the system inputs or in terms of the system response. Thus organized, the useful states form a generally triangular shape, with the sensors and effectors forming the lower two vertices and the fully evaluated set of courses of action the apex. Within the triangle boundaries are numerous processing paths which shortcut the detailed processing by connecting incomplete levels of analysis to partially defined responses. Shortcuts at different levels of abstraction include reflexes, sensory-motor control, rule-based behavior, and satisficing. This approach was used in the design of a real-time tactical decision aiding system, and in defining an intelligent aiding system for transport pilots.

  8. Image processing and computer controls for video profile diagnostic system in the ground test accelerator (GTA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, R.M.; Zander, M.E.; Brown, S.K.

    1992-09-01

    This paper describes the application of video image processing to beam profile measurements on the Ground Test Accelerator (GTA). A diagnostic was needed to measure beam profiles in the intermediate matching section (IMS) between the radio-frequency quadrupole (RFQ) and the drift tube linac (DTL). Beam profiles are measured by injecting puffs of gas into the beam. The light emitted from the beam-gas interaction is captured and processed by a video image processing system, generating the beam profile data. A general purpose, modular and flexible video image processing system, imagetool, was used for the GTA image profile measurement. The development of both software and hardware for imagetool and its integration with the GTA control system (GTACS) will be discussed. The software includes specialized algorithms for analyzing data and calibrating the system. The underlying design philosophy of imagetool was tested by the experience of building and using the system, pointing the way for future improvements. The current status of the system will be illustrated by samples of experimental data.

  9. Image processing and computer controls for video profile diagnostic system in the ground test accelerator (GTA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, R.M.; Zander, M.E.; Brown, S.K.

    1992-01-01

    This paper describes the application of video image processing to beam profile measurements on the Ground Test Accelerator (GTA). A diagnostic was needed to measure beam profiles in the intermediate matching section (IMS) between the radio-frequency quadrupole (RFQ) and the drift tube linac (DTL). Beam profiles are measured by injecting puffs of gas into the beam. The light emitted from the beam-gas interaction is captured and processed by a video image processing system, generating the beam profile data. A general purpose, modular and flexible video image processing system, imagetool, was used for the GTA image profile measurement. The development of both software and hardware for imagetool and its integration with the GTA control system (GTACS) will be discussed. The software includes specialized algorithms for analyzing data and calibrating the system. The underlying design philosophy of imagetool was tested by the experience of building and using the system, pointing the way for future improvements. The current status of the system will be illustrated by samples of experimental data.

  10. A model framework to represent plant-physiology and rhizosphere processes in soil profile simulation models

    NASA Astrophysics Data System (ADS)

    Vanderborght, J.; Javaux, M.; Couvreur, V.; Schröder, N.; Huber, K.; Abesha, B.; Schnepf, A.; Vereecken, H.

    2013-12-01

    Plant roots play a crucial role in several key processes in soils. Besides their impact on biogeochemical cycles and processes, they also have an important influence on physical processes such as water flow and transport of dissolved substances in soils. Interaction between plant roots and soil processes takes place at different scales and ranges from the scale of an individual root and its directly surrounding soil or rhizosphere, through the scale of a root system of an individual plant in a soil profile, to the scale of vegetation patterns in landscapes. Simulation models that are used to predict water flow and solute transport in soil-plant systems mainly focus on the individual plant root system scale, parameterize single-root scale phenomena, and aggregate the root system scale to the vegetation scale. In this presentation, we will focus on the transition from the single root to the root system scale. Using high-resolution non-invasive imaging techniques and methods, gradients in soil properties and states around roots and their difference from the bulk soil properties could be demonstrated. Recent developments in plant sciences provide new insights into the mechanisms that control water fluxes in plants and into the adaptation of root properties or root plasticity to changing soil conditions. However, since currently used approaches to simulate root water uptake neither resolve these small scale processes nor represent processes and controls within the root system, transferring this information to the whole soil-plant system scale is a challenge. Using a simulation model that describes flow and transport processes in the soil, resolves flow and transport towards individual roots, and describes flow and transport within the root system, such a transfer could be achieved. We present a few examples that illustrate: (i) the impact of changed rhizosphere hydraulic properties, (ii) the effect of root hydraulic properties and root system architecture, (iii) the regulation of plant transpiration by root-zone produced plant hormones, and (iv) the impact of salt accumulation at the soil-root interface on root water uptake. We further propose a framework for how this process knowledge could be implemented in root zone simulation models that do not resolve small scale processes.

  11. A Search Algorithm for Generating Alternative Process Plans in Flexible Manufacturing System

    NASA Astrophysics Data System (ADS)

    Tehrani, Hossein; Sugimura, Nobuhiro; Tanimizu, Yoshitaka; Iwamura, Koji

    The capabilities and complexity of manufacturing systems are increasing, driving toward an integrated manufacturing environment. The availability of alternative process plans is a key factor for the integration of design, process planning, and scheduling. This paper describes an algorithm for the generation of alternative process plans that extends the existing framework of process plan networks. A class diagram is introduced for generating process plans and process plan networks from the viewpoint of integrated process planning and scheduling systems. An incomplete search algorithm is developed for generating and searching the process plan networks. The benefit of this algorithm is that the whole process plan network does not have to be generated before the search algorithm starts. The algorithm is therefore applicable to very large process plan networks and can also search wide areas of the network based on the user requirements. The algorithm can generate alternative process plans and select a suitable one based on the objective functions.
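
    The key property, expanding the plan network on demand during the search rather than building it up front, can be illustrated with the hypothetical best-first sketch below; the successor function and costs stand in for the machining-state transitions a real planner would generate.

        # Sketch of lazy, best-first expansion of a process plan network:
        # successors are generated on demand, so the whole network never has to
        # exist in memory.
        import heapq

        def best_first_plan(start, is_goal, successors):
            frontier = [(0.0, start, [start])]
            seen = set()
            while frontier:
                cost, state, plan = heapq.heappop(frontier)
                if is_goal(state):
                    return cost, plan
                if state in seen:
                    continue
                seen.add(state)
                for nxt, step_cost in successors(state):   # generated on demand
                    heapq.heappush(frontier, (cost + step_cost, nxt, plan + [nxt]))
            return None

        # Hypothetical example: states are numbers of completed machining features.
        succ = lambda s: [(s + 1, 2.0), (s + 2, 3.5)] if s < 5 else []
        print(best_first_plan(0, lambda s: s == 5, succ))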

  12. 48 CFR 15.101-2 - Lowest price technically acceptable source selection process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Lowest price technically acceptable source selection process. 15.101-2 Section 15.101-2 Federal Acquisition Regulations System FEDERAL... Processes and Techniques 15.101-2 Lowest price technically acceptable source selection process. (a) The...

  13. Due Process in Appraisal: A Quasi-Experiment in Procedural Justice.

    ERIC Educational Resources Information Center

    Taylor, M. Susan; And Others

    1995-01-01

    Extended research on procedural justice by examining effects of a due-process performance-appraisal system on (government) employees' and managers' reactions. Employee-management pairs were randomly assigned to either a due-process appraisal system or the existing one. Although due-process employees received lower evaluations, both employees and…

  14. Speech Perception as a Cognitive Process: The Interactive Activation Model.

    ERIC Educational Resources Information Center

    Elman, Jeffrey L.; McClelland, James L.

    Research efforts to model speech perception in terms of a processing system in which knowledge and processing are distributed over large numbers of highly interactive--but computationally primitive--elements are described in this report. After discussing the properties of speech that demand a parallel interactive processing system, the report…

  15. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...

  16. A Java-based fMRI processing pipeline evaluation system for assessment of univariate general linear model and multivariate canonical variate analysis-based pipelines.

    PubMed

    Zhang, Jing; Liang, Lichen; Anderson, Jon R; Gatewood, Lael; Rottenberg, David A; Strother, Stephen C

    2008-01-01

    As functional magnetic resonance imaging (fMRI) becomes widely used, the demands for evaluation of fMRI processing pipelines and validation of fMRI analysis results are increasing rapidly. The current NPAIRS package, an IDL-based fMRI processing pipeline evaluation framework, lacks system interoperability and the ability to evaluate general linear model (GLM)-based pipelines using prediction metrics. Thus, it cannot fully evaluate fMRI analytical software modules such as FSL.FEAT and NPAIRS.GLM. In order to overcome these limitations, a Java-based fMRI processing pipeline evaluation system was developed. It integrated YALE (a machine learning environment) into Fiswidgets (an fMRI software environment) to obtain system interoperability and applied an algorithm to measure GLM prediction accuracy. The results demonstrated that the system can evaluate fMRI processing pipelines with univariate GLM and multivariate canonical variates analysis (CVA)-based models on real fMRI data based on prediction accuracy (classification accuracy) and statistical parametric image (SPI) reproducibility. In addition, a preliminary study was performed in which four fMRI processing pipelines with GLM and CVA modules, such as FSL.FEAT and NPAIRS.CVA, were evaluated with the system. The results indicated that (1) the system can compare different fMRI processing pipelines with heterogeneous models (NPAIRS.GLM, NPAIRS.CVA and FSL.FEAT) and rank their performance by automatic performance scoring, and (2) the ranking of pipeline performance is highly dependent on the preprocessing operations. These results suggest that the system will be of value for the comparison, validation, standardization and optimization of functional neuroimaging software packages and fMRI processing pipelines.
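
    The two evaluation axes named above, prediction (classification) accuracy and SPI reproducibility, can be written down as simple metric functions, as in the schematic sketch below; this is not the NPAIRS or Fiswidgets code, just an illustration of the quantities being compared.

        # Schematic metrics for pipeline evaluation: classification accuracy on
        # held-out scans and split-half reproducibility of statistical
        # parametric images (SPIs).
        import numpy as np

        def classification_accuracy(predicted_labels, true_labels):
            predicted_labels = np.asarray(predicted_labels)
            true_labels = np.asarray(true_labels)
            return float((predicted_labels == true_labels).mean())

        def spi_reproducibility(spi_split1, spi_split2):
            # Pearson correlation between the two split-half statistical images.
            return float(np.corrcoef(spi_split1.ravel(), spi_split2.ravel())[0, 1])

        acc = classification_accuracy([0, 1, 1, 0, 1], [0, 1, 0, 0, 1])
        rep = spi_reproducibility(np.random.randn(10, 10), np.random.randn(10, 10))
        print(f"prediction accuracy = {acc:.2f}, reproducibility = {rep:.2f}")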

  17. Final Evaluation Report American Telephone and Telegraph Company, System V/MLS Release 1.2.0 Running on UNIX System V, Release 3.1.1 Rating Maintenance Plan

    DTIC Science & Technology

    1990-09-28

    message queues, semaphores, processes. This report also discusses 630 MTG buffers, which are system objects (as are the process table, u-area, etc.). Each mechanism is associated with a corresponding set of "operation" system calls: msgsnd and msgrcv for messages, semop for semaphores, and shmat and shmdt for shared memory. Semaphores are objects used to implement a process synchronisation mechanism; System V semaphores are a generalization of the P and V operations.

  18. Modernization of the automation control system of technological processes at the preparation plant in the conditions of technical re-equipment

    NASA Astrophysics Data System (ADS)

    Lyakhovets, M. V.; Wenger, K. G.; Myshlyaev, L. P.; Shipunov, M. V.; Grachev, V. V.; Melkozerov, M. Yu; Fairoshin, Sh A.

    2018-05-01

    The experience of modernizing the automated process control system at the preparation plant of “Barzasskoye Tovarischestvo” LLC (Berezovsky) under conditions of technical re-equipment is considered. The automated process control system (APCS), the goals of the modernization, and the ways to achieve them are indicated; the main subsystems of the integrated APCS are presented; and the enlarged functional and technical structure of the upgraded system is given. The procedure for commissioning the upgraded system is described.

  19. AOIPS water resources data management system

    NASA Technical Reports Server (NTRS)

    Vanwie, P.

    1977-01-01

    The text and computer-generated displays used to demonstrate the AOIPS (Atmospheric and Oceanographic Information Processing System) water resources data management system are investigated. The system was developed to assist hydrologists in analyzing the physical processes occurring in watersheds. It was designed to alleviate some of the problems encountered while investigating the complex interrelationships of variables such as land-cover type, topography, precipitation, snow melt, surface runoff, evapotranspiration, and streamflow rates. The system has an interactive image processing capability and a color video display for presenting results as they are obtained.

  20. The Kritzel System for handwriting interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, G.

    We present a new system for recognizing on-line cursive handwriting. The system, called the Kritzel System, has four features. First, the system characterizes handwriting as a sequence of feature vectors. Second, the system adapts itself to a particular writing style through a learning process. Third, the reasoning of the system is formulated in propositional logic with likelihoods. Fourth, the system can readily be linked with other English processing systems for lexical and contextual checking.
