Data Processing Center of Radioastron Project: 3 years of operation.
NASA Astrophysics Data System (ADS)
Shatskaya, Marina
The ASC Data Processing Center (DPC) of the Radioastron Project is a fail-safe, centralized complex of interconnected software and hardware components along with organizational procedures. The tasks facing the scientific data processing center are the organization of service information exchange, the collection of scientific data, the storage of all scientific data, and science-oriented data processing. The DPC takes part in the information exchange with two tracking stations in Pushchino (Russia) and Green Bank (USA), about 30 ground telescopes, the ballistic center, tracking headquarters, and the session scheduling center. Enormous flows of information arrive at the Astro Space Center. To handle these enormous data volumes we developed a specialized network infrastructure, Internet channels, and storage. The computer complex has been designed at the Astro Space Center (ASC) of the Lebedev Physical Institute and includes: 800 TB of on-line storage; a 2000 TB hard drive archive; a backup system on magnetic tapes (2000 TB); 24 TB of redundant storage at the Pushchino Radio Astronomy Observatory; Web and FTP servers; and DPC management and data transmission networks. The structure and functions of the ASC Data Processing Center are fully adequate to the data processing requirements of the Radioastron Mission, as was successfully confirmed during the Fringe Search, the Early Science Program, and the first year of the Key Science Program.
WFIRST: User and mission support at ISOC - IPAC Science Operations Center
NASA Astrophysics Data System (ADS)
Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Laine, Seppo; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin
2018-01-01
The science center for WFIRST is distributed between the Goddard Space Flight Center, the Infrared Processing and Analysis Center (IPAC), and the Space Telescope Science Institute (STScI). The main functions of the IPAC Science Operations Center (ISOC) are:
- Conduct the GO, archival, and theory proposal submission and evaluation process
- Support the coronagraph instrument, including observation planning, calibration, the data processing pipeline, generation of data products, and user support
- Operate the microlensing survey data processing pipeline, with generation of data products and user support
- Community engagement, including conferences, workshops, and general support of the WFIRST exoplanet community
We will describe the components planned to support these functions and the community of WFIRST users.
Landsat data availability from the EROS Data Center and status of future plans
Pohl, Russell A.; Metz, G.G.
1977-01-01
The Department of the Interior's EROS Data Center, managed by the U.S. Geological Survey, was established in 1972 in Sioux Falls, South Dakota, to serve as a principal dissemination facility for Landsat and other remotely sensed data. Through the middle of 1977, the Center supplied approximately 1.7 million copies of images from the more than 5 million images of the Earth's surface archived at the Center. Landsat accounted for half of these images; approximately 5,800 computer-compatible tapes of Landsat data were also supplied to users. New methods for processing data products to make them more useful are being developed, and new accession aids for determining data availability are being placed in operation. The Center also provides assistance and training to resource specialists and land managers in the use of Landsat and other remotely sensed data. A Data Analysis Laboratory is operated at the Center to provide both digital and analog multispectral/multitemporal image analysis capabilities in support of the training and assistance programs. In addition to conventionally processed data products, radiometrically enhanced Landsat imagery is now available from the Center in limited quantities. In mid-1978, the Center will convert to an all-digital processing system for Landsat data that will provide improved products for user analysis in production quantities. The Department of the Interior and NASA are currently studying concepts that use communication satellites to relay Landsat data between U.S. ground stations, the Goddard Space Flight Center, and the EROS Data Center, which would improve the timeliness of data availability. The Data Center also works closely with the remote sensing programs and Landsat data receiving and processing facilities being developed in foreign countries.
Holkenbrink, Patrick F.
1978-01-01
Landsat data are received by National Aeronautics and Space Administration (NASA) tracking stations and converted into digital form on high-density tapes (HDTs) by the Image Processing Facility (IPF) at the Goddard Space Flight Center (GSFC), Greenbelt, Maryland. The HDTs are shipped to the EROS Data Center (EDC) where they are converted into customer products by the EROS Data Center digital image processing system (EDIPS). This document describes in detail one of these products: the computer-compatible tape (CCT) produced from Landsat-1, -2, and -3 multispectral scanner (MSS) data and Landsat-3 only return-beam vidicon (RBV) data. Landsat-1 and -2 RBV data will not be processed by IPF/EDIPS to CCT format.
A Feasibility Study of Providing Regional Data Processing Services.
ERIC Educational Resources Information Center
Nelson, Norbert J.; And Others
A Title III ESEA study sought to determine the feasibility of establishing a central data processing service by the Wabash Valley Education Center for its member schools. First, current applications of data processing in education were reviewed to acquire detailed specifications for an educational data processing center's hardware, software, and…
Federated Giovanni: A Distributed Web Service for Analysis and Visualization of Remote Sensing Data
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2014-01-01
The Geospatial Interactive Online Visualization and Analysis Interface (Giovanni) is a popular tool for users of the Goddard Earth Sciences Data and Information Services Center (GES DISC) and has been in use for over a decade. It provides a wide variety of algorithms and visualizations to explore large remote sensing datasets without having to download the data and without having to write readers and visualizers for it. Giovanni is now being extended to enable its capabilities at other data centers within the Earth Observing System Data and Information System (EOSDIS). This Federated Giovanni will allow four other data centers to add and maintain their data within Giovanni on behalf of their user communities. Those data centers are the Physical Oceanography Distributed Active Archive Center (PO.DAAC), MODIS Adaptive Processing System (MODAPS), Ocean Biology Processing Group (OBPG), and Land Processes Distributed Active Archive Center (LP DAAC). Three tiers are supported: Tier 1 (GES DISC-hosted) gives the remote data center a data management interface to add and maintain data, which are provided through the Giovanni instance at the GES DISC. Tier 2 packages Giovanni up as a virtual machine for distribution to and deployment by the other data centers. Data variables are shared among data centers by sharing documents from the Solr database that underpins Giovanni's data management capabilities. However, each data center maintains its own instance of Giovanni, exposing the variables of most interest to its user community. Tier 3 is a Shared Source model, in which the data centers cooperate to extend the infrastructure by contributing source code.
NASA Astrophysics Data System (ADS)
Jiang, Yingni
2018-03-01
Due to the high energy consumption of communication, energy saving in data centers must be enforced, but the lack of evaluation mechanisms has restrained progress on the energy-saving construction of data centers. In this paper, an energy-saving evaluation index system for data centers was constructed on the basis of clarifying the influence factors. Based on the evaluation index system, the analytic hierarchy process was used to determine the weights of the evaluation indexes. Subsequently, a three-grade fuzzy comprehensive evaluation model was constructed to evaluate the energy-saving systems of data centers.
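For illustration, a minimal numeric sketch of the two ingredients named in the abstract (the paper's actual indexes, comparison matrix, and membership values are not given here, so all numbers below are hypothetical): analytic-hierarchy-process weights derived from a pairwise comparison matrix, followed by a three-grade fuzzy comprehensive evaluation B = W · R.

```python
import numpy as np

# AHP step: derive index weights from a pairwise comparison matrix via the
# principal eigenvector. This 3x3 matrix is a hypothetical example.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                      # normalized weight vector W

# Fuzzy comprehensive evaluation over three grades (e.g. good/fair/poor):
# R[i][j] = membership of evaluation index i in grade j (hypothetical).
R = np.array([[0.6, 0.3, 0.1],
              [0.4, 0.4, 0.2],
              [0.2, 0.5, 0.3]])
B = w @ R                            # three-grade evaluation vector
print(np.round(B, 3))                # the largest component gives the grade
```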
Automated Data Submission for the Data Center
NASA Astrophysics Data System (ADS)
Wright, D.; Beaty, T.; Wei, Y.; Shanafield, H.; Santhana Vannan, S. K.
2014-12-01
Data centers struggle with difficulties related to data submission. Data are acquired through many avenues by many people. Many data submission activities involve intensive manual processes. During the submission process, data end up on varied storage devices. The situation can easily become chaotic. Collecting information on the status of pending data sets is arduous. For data providers, the submission process can be inconsistent and confusing. Scientists generally provide data from previous projects, and archival can be a low priority. Incomplete or poor documentation accompanies many data sets. However, complicated questionnaires deter busy data providers. At the ORNL DAAC, we have semi-automated the data set submission process to create a uniform data product and provide a consistent data provider experience. The formalized workflow makes archival faster for the data center and data set submission easier for data providers. Software modules create a flexible, reusable submission package. Formalized data set submission provides several benefits to the data center. A single data upload area provides one point of entry and ensures data are stored in a consistent location. A central dashboard records pending data set submissions in a single table and simplifies reporting. Flexible role management allows team members to readily coordinate and increases efficiency. Data products and metadata become uniform and easily maintained. As data and metadata standards change, modules can be modified or re-written without affecting workflow. While each data center has unique challenges, the data ingestion process is generally the same: get data from the provider, scientist, or project and capture metadata pertinent to that data. The ORNL DAAC data set submission workflow and software modules can be reused entirely or in part by other data centers looking for a data set submission solution. These data set submission modules will be available on NASA's Earthdata Code Collaborative and by request.
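As a rough illustration of the single-table dashboard idea described above (field names and statuses are our hypothetical placeholders, not the ORNL DAAC schema), a minimal sketch:

```python
# Toy single-table dashboard for pending data set submissions.
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class Submission:
    dataset_id: str
    provider: str
    received: date
    status: str = "pending"          # e.g. pending -> in_review -> archived

def dashboard(rows: List[Submission]) -> None:
    """Print all not-yet-archived submissions in one table."""
    print(f"{'dataset':<20}{'provider':<12}{'received':<12}status")
    for r in rows:
        if r.status != "archived":
            print(f"{r.dataset_id:<20}{r.provider:<12}{r.received!s:<12}{r.status}")

dashboard([
    Submission("soil_moisture_v2", "J. Smith", date(2014, 7, 1)),
    Submission("biomass_plots", "K. Lee", date(2014, 7, 9), "in_review"),
])
```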
THE WASHINGTON DATA PROCESSING TRAINING STORY.
ERIC Educational Resources Information Center
MCKEE, R.L.
A data processing training program in Washington had 10 data processing centers in operation and eight more in various stages of planning in 1963. These centers were full-time day preparatory 2-year post-high school technician training programs, operated and administered by the local boards of education. Each school had a complete data processing…
What Does it Mean to Publish Data in Earth System Science Data Journal?
NASA Astrophysics Data System (ADS)
Carlson, D.; Pfeiffenberger, H.
2015-12-01
The availability of more than 120 data sets in ESSD represents an unprecedented effort by providers, data centers and ESSD. ESSD data sets and their accompanying data descriptions undergo rigorous review. The data sets reside at any of more than 20 cooperating data centers. The ESSD publication process depends on but challenges the concepts of digital object identification and exacerbates the varied interpretations of the phrase 'data publication'. ESSD has adopted the digital object identifier (DOI). Key questions apply to DOIs and other identifiers. How will persistent identifiers point accurately to distributed or replicated data? How should data centers and data publishers use identifier technologies to ensure authenticity and integrity? Should metadata associated with identifiers distinguish among raw, quality-controlled and derived data processing levels, or indicate license or copyright status? Data centers publish data sets according to internal metadata standards but without indicators of quality control. Publication in this sense indicates availability. National data portals compile, serve and publish data products as a service to national researchers and, often, to meet national requirements. Publication in this second case indicates availability in a national context; the data themselves may still reside at separate data centers. Data journals such as ESSD or Scientific Data publish peer-reviewed, quality-controlled data sets. These data sets almost always reside at a separate data center - the journal and the center maintain explicit identifier linkages. Data journals add quality to the feature of availability. A single data set processed through these layers will generate three independent DOIs, but the DOIs will provide little information about availability or quality. Could the data world learn from the URL world to consider additions? Suffixes? Could we use our experience with processing levels or data maturity to propose and agree on such extensions?
Reliability Analysis and Standardization of Spacecraft Command Generation Processes
NASA Technical Reports Server (NTRS)
Meshkat, Leila; Grenander, Sven; Evensen, Ken
2011-01-01
- In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes.
- The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions.
- Applicable human reliability metrics for performing these atomic-level tasks are available.
- The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated.
- The PRA models are executed using data from human reliability data banks.
- The Periodic Table is related to the PRA models via Fault Links.
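A toy numeric sketch of the kind of PRA roll-up the bullets describe (ours, for illustration; the actual atomic tasks, human error probabilities, and fault links are not given in the abstract): independent human error probabilities combined through simple series logic.

```python
# Toy PRA roll-up: probability that at least one atomic command-generation
# task fails, assuming independent human error probabilities (HEPs).
# Task names and HEP values are hypothetical placeholders.
heps = {
    "read_procedure": 1e-3,
    "transcribe_parameter": 3e-3,
    "verify_checksum": 5e-4,
}

p_success = 1.0
for task, hep in heps.items():
    p_success *= 1.0 - hep           # every atomic task must succeed

p_failure = 1.0 - p_success
print(f"P(command generation error) = {p_failure:.2e}")
```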
Kepler Science Operations Center Architecture
NASA Technical Reports Server (NTRS)
Middour, Christopher; Klaus, Todd; Jenkins, Jon; Pletcher, David; Cote, Miles; Chandrasekaran, Hema; Wohler, Bill; Girouard, Forrest; Gunter, Jay P.; Uddin, Kamal;
2010-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Data Pipeline. Designed, developed, operated, and maintained by the Science Operations Center (SOC) at NASA Ames Research Center, the Kepler Science Data Pipeline is a central element of the Kepler Ground Data System. The SOC charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Data Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center that hosts the computers required to perform data analysis. We discuss the high-performance, parallel computing software modules of the Kepler Science Data Pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization. We explain how data processing environments are divided to support operational processing and test needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science Data Pipeline.
The X-33 range Operations Control Center
NASA Technical Reports Server (NTRS)
Shy, Karla S.; Norman, Cynthia L.
1998-01-01
This paper describes the capabilities and features of the X-33 Range Operations Center at NASA Dryden Flight Research Center. All the unprocessed data will be collected and transmitted over fiber optic lines to the Lockheed Operations Control Center for real-time flight monitoring of the X-33 vehicle. By using the existing capabilities of the Western Aeronautical Test Range, the Range Operations Center will provide the ability to monitor all down-range tracking sites for the Extended Test Range systems. In addition to radar tracking and aircraft telemetry data, the Telemetry and Radar Acquisition and Processing System is being enhanced to acquire vehicle command data, differential Global Positioning System corrections and telemetry receiver signal level status. The Telemetry and Radar Acquisition Processing System provides the flexibility to satisfy all X-33 data processing requirements quickly and efficiently. Additionally, the Telemetry and Radar Acquisition Processing System will run a real-time link margin analysis program. The results of this model will be compared in real-time with actual flight data. The hardware and software concepts presented in this paper describe a method of merging all types of data into a common database for real-time display in the Range Operations Center in support of the X-33 program. All types of data will be processed for real-time analysis and display of the range system status to ensure public safety.
Land processes distributed active archive center product lifecycle plan
Daucsavage, John C.; Bennett, Stacie D.
2014-01-01
The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center and the National Aeronautics and Space Administration (NASA) Earth Science Data System Program worked together to establish, develop, and operate the Land Processes (LP) Distributed Active Archive Center (DAAC) to provide stewardship for NASA’s land processes science data. These data are critical science assets that serve the land processes science community with potential value beyond any immediate research use, and therefore need to be accounted for and properly managed throughout their lifecycle. A fundamental LP DAAC objective is to enable permanent preservation of these data and information products. The LP DAAC accomplishes this by bridging data producers and permanent archival resources while providing intermediate archive services for data and information products.
The Kepler Science Data Processing Pipeline Source Code Road Map
NASA Technical Reports Server (NTRS)
Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima;
2016-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.
[Automated processing of data from the 1985 population and housing census].
Cholakov, S
1987-01-01
The author describes the method of automated data processing used in the 1985 census of Bulgaria. He notes that the computerization of the census involves decentralization and the use of regional computing centers as well as data processing at the Central Statistical Office's National Information Computer Center. Special attention is given to problems concerning the projection and programming of census data. (SUMMARY IN ENG AND RUS)
Measurements of the center-of-mass energies at BESIII via the di-muon process
Ablikim, M.; Achasov, M. N.; Ai, X. C.; ...
2016-06-01
From 2011 to 2014, the BESIII experiment collected about 5 fb⁻¹ of data at center-of-mass energies around 4 GeV for the studies of the charmonium-like and higher excited charmonium states. By analyzing the di-muon process e⁺e⁻ → γ_ISR/FSR μ⁺μ⁻, the center-of-mass energies of the data samples are measured with a precision of 0.8 MeV. The center-of-mass energy is found to be stable for most of the time during data taking.
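For context, a hedged sketch of the relation behind such a measurement (our illustration; the paper's full radiative-correction treatment is not reproduced): the center-of-mass energy follows from the invariant mass of the reconstructed muon pair, corrected for the energy carried off by ISR/FSR photons.

```latex
% Invariant mass of the reconstructed muon pair:
M_{\mu\mu} = \sqrt{\left(E_{\mu^{+}} + E_{\mu^{-}}\right)^{2}
                 - \left|\vec{p}_{\mu^{+}} + \vec{p}_{\mu^{-}}\right|^{2}}
% Center-of-mass energy after correcting for radiated photon energy
% (the correction term \Delta M_{\mathrm{rad}} is our schematic notation):
E_{\mathrm{cm}} \simeq M_{\mu\mu} + \Delta M_{\mathrm{rad}}
```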
ERTS operations and data processing
NASA Technical Reports Server (NTRS)
Gonzales, L.; Sos, J. Y.
1974-01-01
The overall communications and data flow between the ERTS spacecraft and the ground stations and processing centers are generally described. Data from the multispectral scanner and the return beam vidicon are telemetered to a primary ground station where they are demodulated, processed, and recorded. The tapes are then transferred to the NASA Data Processing Facility (NDPF) at Goddard. Housekeeping data are relayed from the prime ground stations to the Operations Control Center at Goddard. Tracking data are processed at the ground stations, and the calculated parameters are transmitted by teletype to the orbit determination group at Goddard. The ERTS orbit has been designed so that the same swaths of the ground coverage pattern viewed during one 18-day coverage cycle are repeated by the swaths viewed on all subsequent cycles. The Operations Control Center is the focal point for all communications with the spacecraft. NDPF is a job-oriented facility which processes and stores all sensor data, and which disseminates large quantities of these data to users in the form of films, computer-compatible tapes, and data collection system data.
Carneggie, David M.; Metz, Gary G.; Draeger, William C.; Thompson, Ralph J.
1991-01-01
The U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center, the national archive for Landsat data, has 20 years of experience in acquiring, archiving, processing, and distributing Landsat and earth science data. The Center is expanding its satellite and earth science data management activities to support the U.S. Global Change Research Program and the National Aeronautics and Space Administration (NASA) Earth Observing System Program. The Center's current and future data management activities focus on land data and include: satellite and earth science data set acquisition, development and archiving; data set preservation, maintenance and conversion to more durable and accessible archive medium; development of an advanced Land Data Information System; development of enhanced data packaging and distribution mechanisms; and data processing, reprocessing, and product generation systems.
Archiving, processing, and disseminating ASTER products at the USGS EROS Data Center
Jones, B.; Tolk, B.; ,
2002-01-01
The U.S. Geological Survey EROS Data Center archives, processes, and disseminates Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data products. The ASTER instrument is one of five sensors onboard the Earth Observing System's Terra satellite launched December 18, 1999. ASTER collects broad spectral coverage with high spatial resolution at near infrared, shortwave infrared, and thermal infrared wavelengths with ground resolutions of 15, 30, and 90 meters, respectively. The ASTER data are used in many ways to understand local and regional earth-surface processes. Applications include land-surface climatology, volcanology, hazards monitoring, geology, agronomy, land cover change, and hydrology. The ASTER data are available for purchase from the ASTER Ground Data System in Japan and from the Land Processes Distributed Active Archive Center in the United States, which receives level 1A and level 1B data from Japan on a routine basis. These products are archived and made available to the public within 48 hours of receipt. The level 1A and level 1B data are used to generate higher level products that include routine and on-demand decorrelation stretch, brightness temperature at the sensor, emissivity, surface reflectance, surface kinetic temperature, surface radiance, polar surface and cloud classification, and digital elevation models. This paper describes the processes and procedures used to archive, process, and disseminate standard and on-demand higher level ASTER products at the Land Processes Distributed Active Archive Center.
A Data Accounting System for Clinical Investigators
Kashner, T. Michael; Hinson, Robert; Holland, Gloria J.; Mickey, Don D.; Hoffman, Keith; Lind, Lisa; Johnson, Linda D.; Chang, Barbara K.; Golden, Richard M.; Henley, Steven S.
2007-01-01
Clinical investigators often preprocess, process, and analyze their data without the benefit of formally organized research centers to oversee data management. This article outlines a practical three-file structure to help these investigators track and document their data through processing and analyses. The proposed process can be implemented without additional training or specialized software. Thus, it is particularly well suited for research projects with small budgets or limited access to viable research/data coordinating centers. PMID:17460138
Data near processing support for climate data analysis
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils
2016-04-01
Climate data repositories grow in size exponentially. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach of a modular and flexible system supporting the development, deployment and maintenance of OGC-WPS-based web processing services. This approach is organized in an open source GitHub project (called "bird-house") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations), which rely on basic common infrastructural components (e.g. installation and deployment recipes, analysis code dependency management). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system supports the management of the often very complex package dependency chain of climate data analysis packages as well as docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ hosts a multi-petabyte climate archive which is integrated, e.g., into the European ENES and worldwide ESGF data infrastructure, and it also operates an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack-based data cloud services to support data import and data distribution for bird-house-based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community as well as the climate impact community are highlighted. Aspects supporting future WPS-based cross-community usage scenarios for data reuse and data provenance are also reflected.
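For illustration, a minimal sketch of talking to such a WPS endpoint through the standard OGC WPS 1.0.0 key-value-pair interface (the URL and port are hypothetical; the request parameters themselves are defined by the WPS specification):

```python
# List the processes ("birds") offered by a hypothetical birdhouse-style
# WPS endpoint via a standard OGC WPS 1.0.0 GetCapabilities request.
import requests
import xml.etree.ElementTree as ET

WPS_URL = "http://localhost:8093/wps"   # hypothetical deployment URL

resp = requests.get(WPS_URL, params={
    "service": "WPS", "version": "1.0.0", "request": "GetCapabilities",
}, timeout=30)
resp.raise_for_status()

ns = {"ows": "http://www.opengis.net/ows/1.1",
      "wps": "http://www.opengis.net/wps/1.0.0"}
root = ET.fromstring(resp.content)
for proc in root.findall(".//wps:Process", ns):
    print(proc.findtext("ows:Identifier", namespaces=ns))
```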
ERIC Educational Resources Information Center
Sherman, Don; Shoffner, Ralph M.
The scope of the California State Library-Processing Center (CSL-PC) project is to develop the design and specifications for a computerized technical processing center to provide services to a network of participating California libraries. Immediate objectives are: (1) retrospective conversion of card catalogs to a machine-form data base,…
An overview of the National Space Science data Center Standard Information Retrieval System (SIRS)
NASA Technical Reports Server (NTRS)
Shapiro, A.; Blecher, S.; Verson, E. E.; King, M. L. (Editor)
1974-01-01
A general overview is given of the National Space Science Data Center (NSSDC) Standard Information Retrieval System (SIRS). It describes, in general terms, the information system that contains the data files and the software system that processes and manipulates the files maintained at the Data Center. Emphasis is placed on providing users with an overview of the capabilities and uses of SIRS. Examples given are taken from the files at the Data Center. Detailed information about NSSDC data files is documented in a set of File Users Guides, with one user's guide prepared for each file processed by SIRS. Detailed information about SIRS is presented in the SIRS Users Guide.
Seeking Humanizing Care in Patient-Centered Care Process: A Grounded Theory Study.
Cheraghi, Mohammad Ali; Esmaeili, Maryam; Salsali, Mahvash
Patient-centered care is both a goal in itself and a tool for enhancing health outcomes. The application of patient-centered care in health care services globally, however, is diverse. This article reports on a study that sought to introduce patient-centered care. The aim of this study is to explore the process of providing patient-centered care in critical care units. The study used a grounded theory method. Data were collected in 5 critical care units at Tehran University of Medical Sciences. Purposive and theoretical sampling directed the collection of data using 29 semistructured interviews with 27 participants (nurses, patients, and physicians). Data obtained were analyzed according to the analysis stages of grounded theory and constant comparison to identify the concepts, context, and process of the study. The core category of this grounded theory is "humanizing care," which consisted of 4 interrelated phases: patient acceptance, purposeful patient assessment and identification, understanding patients, and patient empowerment. A core category of humanizing care integrated the theory. Humanizing care was both an outcome and a process. Patient-centered care is a dynamic and multifaceted process provided according to the nurses' understanding of the concept. Patient-centered care does not involve repeating routine tasks; rather, it requires an all-embracing understanding of the patients and showing respect for their values, needs, and preferences.
Intelligent Control of Micro Grid: A Big Data-Based Control Center
NASA Astrophysics Data System (ADS)
Liu, Lu; Wang, Yanping; Liu, Li; Wang, Zhiseng
2018-01-01
In this paper, a structure for a micro grid system with a big data-based control center is introduced. Energy data from distributed generation, storage and load are analyzed through the control center, and from the results new trends will be predicted and applied as feedback to optimize the control. Therefore, each step in the micro grid can be adjusted and organized in a form of comprehensive management. A framework of real-time data collection, data processing and data analysis is proposed by employing big data technology. Consequently, integrated distributed generation and an optimized energy storage and transmission process can be implemented in the micro grid system.
NASA Astrophysics Data System (ADS)
Walter, R. J.; Protack, S. P.; Harris, C. J.; Caruthers, C.; Kusterer, J. M.
2008-12-01
NASA's Atmospheric Science Data Center at the NASA Langley Research Center performs all of the science data processing for the Multi-angle Imaging SpectroRadiometer (MISR) instrument. MISR is one of the five remote sensing instruments flying aboard NASA's Terra spacecraft. From the time of Terra launch in December 1999 until February 2008, all MISR science data processing was performed on a Silicon Graphics, Inc. (SGI) platform. However, dramatic improvements in commodity computing technology coupled with steadily declining project budgets during that period eventually made transitioning MISR processing to a commodity computing environment both feasible and necessary. The Atmospheric Science Data Center has successfully ported the MISR science data processing environment from the SGI platform to a Linux cluster environment. There were a multitude of technical challenges associated with this transition. Even though the core architecture of the production system did not change, the manner in which it interacted with underlying hardware was fundamentally different. In addition, there are more potential throughput bottlenecks in a cluster environment than there are in a symmetric multiprocessor environment like the SGI platform and each of these had to be addressed. Once all the technical issues associated with the transition were resolved, the Atmospheric Science Data Center had a MISR science data processing system with significantly higher throughput than the SGI platform at a fraction of the cost. In addition to the commodity hardware, free and open source software such as S4PM, Sun Grid Engine, PostgreSQL and Ganglia play a significant role in the new system. Details of the technical challenges and resolutions, software systems, performance improvements, and cost savings associated with the transition will be discussed. The Atmospheric Science Data Center in Langley's Science Directorate leads NASA's program for the processing, archival and distribution of Earth science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The Data Center was established in 1991 to support NASA's Earth Observing System and the U.S. Global Change Research Program. It is unique among NASA data centers in the size of its archive, cutting edge computing technology, and full range of data services. For more information regarding ASDC data holdings, documentation, tools and services, visit http://eosweb.larc.nasa.gov
EOS MLS Science Data Processing System: A Description of Architecture and Capabilities
NASA Technical Reports Server (NTRS)
Cuddy, David T.; Echeverri, Mark D.; Wagner, Paul A.; Hanzel, Audrey T.; Fuller, Ryan A.
2006-01-01
This paper describes the architecture and capabilities of the Science Data Processing System (SDPS) for the EOS MLS. The SDPS consists of two major components--the Science Computing Facility and the Science Investigator-led Processing System. The Science Computing Facility provides the facilities for the EOS MLS Science Team to perform the functions of scientific algorithm development, processing software development, quality control of data products, and scientific analyses. The Science Investigator-led Processing System processes and reprocesses the science data for the entire mission and delivers the data products to the Science Computing Facility and to the Goddard Space Flight Center Earth Science Distributed Active Archive Center, which archives and distributes the standard science products.
NASA Technical Reports Server (NTRS)
1982-01-01
The format of the HDT-AM product which contains partially processed LANDSAT D and D Prime multispectral scanner image data is defined. Recorded-data formats, tape format, and major frame types are described.
Average Throughput Performance of Myopic Policy in Energy Harvesting Wireless Sensor Networks.
Gul, Omer Melih; Demirekler, Mubeccel
2017-09-26
This paper considers a single-hop wireless sensor network where a fusion center collects data from M energy harvesting wireless sensors. The harvested energy is stored losslessly in an infinite-capacity battery at each sensor. In each time slot, the fusion center schedules K sensors for data transmission over K orthogonal channels. The fusion center does not have direct knowledge of the battery states of the sensors, or the statistics of their energy harvesting processes. The fusion center only has information on the outcomes of previous transmission attempts. It is assumed that the sensors are data backlogged, there is no battery leakage, and the communication is error-free. An energy harvesting sensor can transmit data to the fusion center whenever it is scheduled, but only if it has enough energy for data transmission. We investigate the average throughput of a Round-Robin type myopic policy, both analytically and numerically, under an average reward (throughput) criterion. We show that the Round-Robin type myopic policy achieves optimality for some classes of energy harvesting processes, although it is suboptimal for a broad class of energy harvesting processes.
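A minimal simulation sketch of the policy under the abstract's idealized model (our own illustration, not the paper's code; unit-energy packets and i.i.d. Bernoulli energy arrivals are our added assumptions):

```python
import random

def simulate_round_robin(M=10, K=3, p_harvest=0.4, slots=100_000, seed=1):
    """Round-Robin scheduling of K of M energy-harvesting sensors.

    Assumes unit-energy packets, i.i.d. Bernoulli(p_harvest) arrivals,
    infinite lossless batteries, backlogged data, and error-free links.
    """
    rng = random.Random(seed)
    battery = [0] * M
    pointer = 0                       # round-robin position
    delivered = 0
    for _ in range(slots):
        for i in range(M):            # energy arrivals (unseen by the FC)
            battery[i] += rng.random() < p_harvest
        for j in range(K):            # schedule the next K sensors cyclically
            s = (pointer + j) % M
            if battery[s] >= 1:       # transmit only if energy suffices
                battery[s] -= 1
                delivered += 1
        pointer = (pointer + K) % M
    return delivered / slots          # average throughput, packets per slot

print(simulate_round_robin())         # tends toward min(K, M * p_harvest)
```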
TESS Ground System Operations and Data Products
NASA Astrophysics Data System (ADS)
Glidden, Ana; Guerrero, Natalia; Fausnaugh, Michael; TESS Team
2018-01-01
We describe the ground system operations for processing data from the Transiting Exoplanet Survey Satellite (TESS), highlighting the role of the Science Operations Center (SOC). TESS is a space-based, (nearly) all-sky mission designed to find small planets around nearby bright stars using the transit method. We detail the flow of data from pixel measurements on the instrument to final products available at the Mikulski Archive for Space Telescopes (MAST). The ground system relies on a host of players to process the data, including the Payload Operations Center at MIT, the Science Processing Operations Center at NASA Ames, and the TESS Science Office, led by the Harvard-Smithsonian Center for Astrophysics and MIT. Together, these groups will deliver the TESS Input Catalog, instrument calibration models, calibrated target pixels and full frame images, threshold crossing event reports, two-minute light curves, and the TESS Objects of Interest List.
Update on the Center for Engineering Strong Motion Data
NASA Astrophysics Data System (ADS)
Haddadi, H. R.; Shakal, A. F.; Stephens, C. D.; Oppenheimer, D. H.; Huang, M.; Leith, W. S.; Parrish, J. G.; Savage, W. U.
2010-12-01
The U.S. Geological Survey (USGS) and the California Geological Survey (CGS) established the Center for Engineering Strong-Motion Data (CESMD, Center) to provide a single access point for earthquake strong-motion records and station metadata from the U.S. and international strong-motion programs. The Center has operational facilities in Sacramento and Menlo Park, California, to receive, process, and disseminate records through the CESMD web site at www.strongmotioncenter.org. The Center currently is in the process of transitioning the COSMOS Virtual Data Center (VDC) to integrate its functions with those of the CESMD for improved efficiency of operations, and to provide all users with a more convenient one-stop portal to both U.S. and important international strong-motion records. The Center is working with COSMOS and international and U.S. data providers to improve the completeness of site and station information, which are needed to most effectively employ the recorded data. The goal of all these and other new developments is to continually improve access by the earthquake engineering community to strong-motion data and metadata world-wide. The CESMD and its Virtual Data Center (VDC) provide tools to map earthquakes and recording stations, to search raw and processed data, to view time histories and spectral plots, to convert data files formats, and to download data and a variety of information. The VDC is now being upgraded to convert the strong-motion data files from different seismic networks into a common standard tagged format in order to facilitate importing earthquake records and station metadata to the CESMD database. An important new feature being developed is the automatic posting of Internet Quick Reports at the CESMD web site. This feature will allow users, and emergency responders in particular, to view strong-motion waveforms and download records within a few minutes after an earthquake occurs. Currently the CESMD and its Virtual Data Center provide selected strong-motion records from 17 countries. The Center has proved to be significantly useful for providing data to scientists, engineers, policy makers, and emergency response teams around the world.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregory Reaman
The initiative will enable the COG Biopathology Center (Biospecimen Repository), the Molecular Genetics Laboratory and other participating reference laboratories to upload large data sets to the eRDES. The capability streamlines data currency and accuracy, allowing the centers to export data from local systems and import the defined data to the eRDES. The process will aid in the best practices which have been defined by the Office of Biorepository and Biospecimen Research (OBBR) and the Group Banking Committee (GBC). The initiative allows for batch import and export, a data validation process and reporting mechanism, and a model for other labs to incorporate. All objectives are complete. The solutions provided and the defined process eliminate dual data entry, resulting in data consistency. The audit trail capabilities allow for complete tracking of the data exchange between laboratories and the Statistical Data Center (SDC). The impact is directly on time and effort. In return, the process will save money and improve the data utilized by the COG. Ongoing efforts include implementing new technologies to further enhance the current solutions and processes currently in place. Web Services and Reporting Services are technologies that have become industry standards and will allow for further harmonization with caBIG (cancer Biomedical Informatics Grid). Additional testing and implementation of the model for other laboratories is in process.
Fields, Dail; Roman, Paul M; Blum, Terry C
2012-01-01
Objective. To examine the relationships among general management systems, patient-focused total quality management/continuous quality improvement (TQM/CQI) processes, resource availability, and multiple dimensions of substance use disorder (SUD) treatment. Data Sources/Study Setting. Data are from a nationally representative sample of 221 SUD treatment centers through the National Treatment Center Study (NTCS). Study Design. The design was a cross-sectional field study using latent variable structural equation models. The key variables are management practices, TQM/CQI practices, resource availability, and treatment center performance. Data Collection. Interviews and questionnaires provided data from treatment center administrative directors and clinical directors in 2007-2008. Principal Findings. Patient-focused TQM/CQI practices fully mediated the relationship between internal management practices and performance. The effects of TQM/CQI on performance are significantly larger for treatment centers with higher levels of staff per patient. Conclusions. Internal management practices may create a setting that supports implementation of specific patient-focused practices and protocols inherent to TQM/CQI processes. However, the positive effects of internal management practices on treatment center performance occur through use of specific patient-focused TQM/CQI practices and have more impact when greater amounts of supporting resources are present. PMID:22098342
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Booth, N.; Walker, J.; Kunicki, T.
2012-12-01
The U.S. Geological Survey Center for Integrated Data Analytics (CIDA), in keeping with the President's Digital Government Strategy and the Department of the Interior's IT Transformation initiative, has evolved its data center and application architecture toward the "cloud" paradigm. In this case, "cloud" refers to a goal of developing services that may be distributed to infrastructure anywhere on the Internet. This transition has taken place across the entire data management spectrum, from data center location to physical hardware configuration to software design and implementation. In CIDA's case, physical hardware resides in Madison at the Wisconsin Water Science Center, in South Dakota at the Earth Resources Observation and Science Center (EROS), and in the near future at a DOI-approved commercial vendor. Tasks normally conducted on desktop GIS software with local copies of data in proprietary formats are now done using browser-based interfaces to web processing services drawing on a network of standard data-source web services. Organizations are gaining economies of scale through data center consolidation and the creation of private cloud services as well as taking advantage of the commoditization of data processing services. Leveraging open standards for data and data management takes advantage of this commoditization and provides the means to reliably build distributed service-based systems. This presentation will use CIDA's experience as an illustration of the benefits and hurdles of moving to the cloud. Replicating, reformatting, and processing large data sets, such as downscaled climate projections, traditionally present a substantial challenge to environmental science researchers who need access to data subsets and derived products. The USGS Geo Data Portal (GDP) project uses cloud concepts to help earth system scientists access subsets, spatial summaries, and derivatives of commonly needed very large data sets. The GDP project has developed a reusable architecture and advanced processing services that currently access archives hosted at Lawrence Livermore National Lab, Oregon State University, the University Corporation for Atmospheric Research, and the U.S. Geological Survey, among others. Several examples of how the GDP project uses cloud concepts will be highlighted in this presentation: 1) The high-bandwidth network connectivity of large data centers reduces the need for data replication and storage local to processing services. 2) Standard data-serving web services, like OPeNDAP, Web Coverage Services, and Web Feature Services, allow GDP services to remotely access custom subsets of data in a variety of formats, further reducing the need for data replication and reformatting. 3) The GDP services use standard web service APIs to allow browser-based user interfaces to run complex and compute-intensive processes for users from any computer with an Internet connection. The combination of physical infrastructure and application architecture implemented for the Geo Data Portal project offers an operational example of how distributed data and processing on the cloud can be used to aid earth system science.
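For illustration only, a minimal sketch of the kind of standards-based remote subsetting the GDP builds on, here via an OPeNDAP endpoint (the server URL, file, and variable name are hypothetical; this also assumes a netCDF4 build with DAP support):

```python
# Open a remote OPeNDAP dataset and pull only a subset across the wire,
# avoiding bulk replication and reformatting of the full archive.
from netCDF4 import Dataset

URL = "https://example.usgs.gov/opendap/downscaled/tasmax.nc"  # hypothetical

ds = Dataset(URL)                        # connect; reads metadata only
tasmax = ds.variables["tasmax"]          # lazy handle, no data fetched yet
subset = tasmax[0:12, 100:150, 200:260]  # server returns just this slab
print(subset.shape)                      # (12, 50, 60)
ds.close()
```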
User's guide to the UTIL-ODRC tape processing program. [for the Orbital Data Reduction Center
NASA Technical Reports Server (NTRS)
Juba, S. M. (Principal Investigator)
1981-01-01
The UTIL-ODRC computer-compatible tape processing program, its input/output requirements, and its interface with the EXEC 8 operating system are described. It is a multipurpose Orbital Data Reduction Center (ODRC) tape processing program enabling the user to create exact duplicate tapes and/or tapes in SINDA/HISTRY format. Input data elements for the PRAMPT/FLOPLT and/or BATCH PLOT programs, a temperature summary, and a printed summary can also be produced.
Web service activities at the IRIS DMC to support federated and multidisciplinary access
NASA Astrophysics Data System (ADS)
Trabant, Chad; Ahern, Timothy K.
2013-04-01
At the IRIS Data Management Center (DMC) we have developed a suite of web service interfaces to access our large archive of, primarily seismological, time series data and related metadata. The goals of these web services include providing: a) next-generation and easily used access interfaces for our current users, b) access to data holdings in a form usable for non-seismologists, c) programmatic access to facilitate integration into data processing workflows and d) a foundation for participation in federated data discovery and access systems. To support our current users, our services provide access to the raw time series data and metadata or conversions of the raw data to commonly used formats. Our services also support simple, on-the-fly signal processing options that are common first steps in many workflows. Additionally, high-level data products derived from raw data are available via service interfaces. To support data access by researchers unfamiliar with seismic data we offer conversion of the data to broadly usable formats (e.g. ASCII text) and data processing to convert the data to Earth units. By their very nature, web services are programmatic interfaces. Combined with ubiquitous support for web technologies in programming & scripting languages and support in many computing environments, web services are very well suited for integrating data access into data processing workflows. As programmatic interfaces that can return data in both discipline-specific and broadly usable formats, our services are also well suited for participation in federated and brokered systems either specific to seismology or multidisciplinary. Working within the International Federation of Digital Seismograph Networks, the DMC collaborated on the specification of standardized web service interfaces for use at any seismological data center. These data access interfaces, when supported by multiple data centers, will form a foundation on which to build discovery and access mechanisms for data sets spanning multiple centers. To promote the adoption of these standardized services the DMC has developed portable implementations of the software needed to host these interfaces, minimizing the work required at each data center. Within the COOPEUS project framework, the DMC is working with EU partners to install web services implementations at multiple data centers in Europe.
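As a concrete illustration of the programmatic access described above, a small sketch using the FDSN-standardized dataselect web service hosted at the IRIS DMC (the station and time window are just example choices):

```python
# Fetch a short miniSEED time-series snippet from the fdsnws-dataselect
# service and save it for downstream processing.
import requests

params = {
    "net": "IU", "sta": "ANMO", "loc": "00", "cha": "BHZ",
    "starttime": "2010-02-27T06:30:00", "endtime": "2010-02-27T06:40:00",
}
r = requests.get("https://service.iris.edu/fdsnws/dataselect/1/query",
                 params=params, timeout=60)
r.raise_for_status()

with open("anmo.mseed", "wb") as f:      # raw miniSEED waveform data
    f.write(r.content)
```

Because the interface is plain HTTP, the same request drops directly into scripted workflows or federated clients without seismology-specific tooling.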
1983-06-01
[Report front matter garbled in extraction; recoverable details: prepared by General Dynamics Corporation, Data Systems Division, P.O. Box 748, Fort Worth TX 76101; contents include "Processing System for the Operation/Direction Center(s)" and "Distribution of Processing Control for the Operation/Direction Center(s)".]
Access routes to the U.S. Geological Survey's EROS Data Center, Sioux Falls, South Dakota
,
1976-01-01
The EROS Data Center is a part of the Earth Resources Observation Systems (EROS) Program of the Department of the Interior, managed by the U.S. Geological Survey. It is the national center for the processing and dissemination of spacecraft- and aircraft-acquired photographic imagery and electronic data of the Earth's resources. The center also trains and assists users in the application of such data. The EROS Data Center provides access to Landsat data, aerial photography acquired by the U.S. Department of the Interior, and photography and other remotely sensed data acquired by the National Aeronautics and Space Administration (NASA) from research aircraft and from Skylab, Apollo, and Gemini spacecraft.
NCALM: NSF Supported Center for Airborne Laser Mapping
NASA Astrophysics Data System (ADS)
Shrestha, R. L.; Carter, W. E.; Dietrich, W. E.
2003-12-01
The National Science Foundation (NSF) recently awarded a grant to create a research center to support the use of airborne laser mapping technology in the scientific community. The NSF-supported Center for Airborne Laser Mapping (NCALM) will be operated jointly by the Department of Civil & Coastal Engineering, College of Engineering, University of Florida (UF) and the Department of Earth and Planetary Science, University of California, Berkeley (UCB). NCALM will use the Airborne Laser Swath Mapping (ALSM) system jointly owned by UF and Florida International University (FIU), based at the UF Geosensing Engineering and Mapping (GEM) Research Center. The state-of-the-art laser surveying instrumentation and GPS systems, installed in a Cessna 337 Skymaster aircraft, will collect research-grade data in areas selected through the competitive NSF grant review process. The ALSM observations will be analyzed both at UF and UCB, and made available to the PI through an archiving and distribution center at UCB, building upon the Berkeley Seismological Laboratory (BSL) Northern California Earthquake Data Center system. The purpose of NCALM is to provide research-grade data from ALSM technology to NSF-supported research studies in the geosciences. The Center will also contribute to software development that will increase processing speed and data accuracy. This presentation will discuss NCALM operation and the process of submitting proposals to NSF. In addition, it will outline the process to request available NCALM seed project funds to help jump-start small scientific research studies. Funds are also available for travel by academic researchers and students to gain hands-on knowledge and experience in ALSM technology at UF and UCB.
The effective use of virtualization for selection of data centers in a cloud computing environment
NASA Astrophysics Data System (ADS)
Kumar, B. Santhosh; Parthiban, Latha
2018-04-01
Data centers are the places which consist of networks of remote servers to store, access and process data. Cloud computing is a technology where users worldwide submit their tasks and the service providers direct the requests to the data centers which are responsible for execution of the tasks. The servers in the data centers need to employ the virtualization concept so that multiple tasks can be executed simultaneously. In this paper we propose an algorithm for data center selection based on the energy of the virtual machines created on each server. The virtualization energy of each server is calculated, and the total energy of the data center is obtained by summation of the individual server energies. The tasks submitted are routed to the data center with the least energy consumption, which minimizes the operational expenses of a service provider.
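A minimal sketch of the selection rule just described (our illustration, not the paper's implementation; the energy readings are hypothetical):

```python
# Each data center's energy is the sum of the virtualization energies of
# its servers' VMs; a submitted task is routed to the minimum-energy center.
from typing import Dict, List

def datacenter_energy(servers: List[List[float]]) -> float:
    """Total energy: sum over servers of their VM energies."""
    return sum(sum(vm_energies) for vm_energies in servers)

def select_datacenter(centers: Dict[str, List[List[float]]]) -> str:
    """Route the task to the data center with the least total energy."""
    return min(centers, key=lambda name: datacenter_energy(centers[name]))

centers = {   # hypothetical VM energy readings, per server, per data center
    "dc-east": [[1.2, 0.8], [2.0]],      # total 4.0
    "dc-west": [[0.9], [0.7, 0.6]],      # total 2.2
}
print(select_datacenter(centers))        # -> dc-west
```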
NASA Astrophysics Data System (ADS)
Neuhauser, D.; Dietz, L.; Lombard, P.; Klein, F.; Zuzlewski, S.; Kohler, W.; Hellweg, M.; Luetgert, J.; Oppenheimer, D.; Romanowicz, B.
2006-12-01
The longstanding cooperation between the USGS Menlo Park and UC Berkeley's Seismological Laboratory for monitoring earthquakes and providing data to the research community is achieving a new level of integration. While station support and data collection for each network (NC, BK, BP) remain the responsibilities of the host institution, picks, codas and amplitudes will be produced and shared between the data centers continuously. Thus, real-time earthquake processing from triggering and locating through magnitude and moment tensor calculation and ShakeMap production will take place independently at both locations, improving the robustness of event reporting in the Northern California Earthquake Management Center. Parametric data will also be exchanged with the Southern California Earthquake Management System to allow statewide earthquake detection and processing for further redundancy within the California Integrated Seismic Network (CISN). The database plays an integral part in this system, providing the coordination for event processing as well as the repository for event, instrument (metadata) and waveform information. The same master database serves both real-time processing, data quality control and archival, and the data center which provides waveforms and earthquake data to users in the research community. Continuous waveforms from all BK, BP, and NC stations, event waveform gathers, and event information automatically become available at the Northern California Earthquake Data Center (NCEDC). Currently, the NCEDC collects and makes available over 4 TBytes of data per year from the NCEMC stations and other seismic networks, as well as from GPS and other geophysical instrumentation.
Industrial Assessment Center (IAC) Operations Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopalakrishnan, Bhaskaran; Nimbalkar, Sachin U.; Wenning, Thomas J.
The IAC Operations Manual describes the organizational model and operations of the Industrial Assessment Center (IAC), Center management activities, the typical process of an energy assessment, and energy assessment data for specific industry sectors.
The California Integrated Seismic Network
NASA Astrophysics Data System (ADS)
Hellweg, M.; Given, D.; Hauksson, E.; Neuhauser, D.; Oppenheimer, D.; Shakal, A.
2007-05-01
The mission of the California Integrated Seismic Network (CISN) is to operate a reliable, modern system to monitor earthquakes throughout the state; to generate and distribute information in real-time for emergency response, for the benefit of public safety, and for loss mitigation; and to collect and archive data for seismological and earthquake engineering research. To meet these needs, the CISN operates data processing and archiving centers, as well as more than 3000 seismic stations. Furthermore, the CISN is actively developing and enhancing its infrastructure, including its automated processing and archival systems. The CISN integrates seismic and strong motion networks operated by the University of California Berkeley (UCB), the California Institute of Technology (Caltech), and the United States Geological Survey (USGS) offices in Menlo Park and Pasadena, as well as the USGS National Strong Motion Program (NSMP), and the California Geological Survey (CGS). The CISN operates two earthquake management centers (the NCEMC and SCEMC) where statewide, real-time earthquake monitoring takes place, and an engineering data center (EDC) for processing strong motion data and making it available in near real-time to the engineering community. These centers employ redundant hardware to minimize disruptions to the earthquake detection and processing systems. At the same time, dual feeds of data from a subset of broadband and strong motion stations are telemetered in real-time directly to both the NCEMC and the SCEMC to ensure the availability of statewide data in the event of a catastrophic failure at one of these two centers. The CISN uses a backbone T1 ring (with automatic backup over the internet) to interconnect the centers and the California Office of Emergency Services. The T1 ring enables real-time exchange of selected waveforms, derived ground motion data, phase arrivals, earthquake parameters, and ShakeMaps. With the goal of operating similar and redundant statewide earthquake processing systems at both real-time EMCs, the CISN is currently adopting and enhancing the database-centric, earthquake processing and analysis software originally developed for the Caltech/USGS Pasadena TriNet project. Earthquake data and waveforms are made available to researchers and to the public in near real-time through the CISN's Northern and Southern California Earthquake Data Centers (NCEDC and SCEDC) and through the USGS Earthquake Notification System (ENS). The CISN partners have developed procedures to automatically exchange strong motion data, both waveforms and peak parameters, for use in ShakeMap and in the rapid engineering reports which are available near real-time through the strong motion EDC.
Telemetry distribution and processing for the second German Spacelab Mission D-2
NASA Technical Reports Server (NTRS)
Rabenau, E.; Kruse, W.
1994-01-01
For the second German Spacelab Mission D-2 all activities related to operating, monitoring and controlling the experiments on board the Spacelab were conducted from the German Space Operations Control Center (GSOC) operated by the Deutsche Forschungsanstalt fur Luft- und Raumfahrt (DLR) in Oberpfaffenhofen, Germany. The operational requirements imposed new concepts on the transfer of data between Germany and the NASA centers and the processing of data at the GSOC itself. Highlights were the upgrade of the Spacelab Data Processing Facility (SLDPF) to real time data processing, the introduction of packet telemetry and the development of the high-rate data handling front end, data processing and display systems at GSOC. For the first time, a robot on board the Spacelab was to be controlled from the ground in a closed loop environment. A dedicated forward channel was implemented to transfer the robot manipulation commands originating from the robotics experiment ground station to the Spacelab via the Orbiter's text and graphics system interface. The capability to perform telescience from an external user center was implemented. All interfaces proved successful during the course of the D-2 mission and are described in detail in this paper.
Functional and performance requirements of the next NOAA-Kansas City computer system
NASA Technical Reports Server (NTRS)
Mosher, F. R.
1985-01-01
The development of the Advanced Weather Interactive Processing System for the 1990's (AWIPS-90) will result in more timely and accurate forecasts with improved cost effectiveness. As part of the AWIPS-90 initiative, the National Meteorological Center (NMC), the National Severe Storms Forecast Center (NSSFC), and the National Hurricane Center (NHC) are to receive upgrades of interactive processing systems. This National Center Upgrade program will support the specialized inter-center communications, data acquisition, and processing needs of these centers. The missions, current capabilities and general functional requirements for the upgrade to the NSSFC are addressed. System capabilities are discussed along with the requirements for the upgraded system.
Precise orbit determination and station position estimation using DORIS RINEX data
NASA Astrophysics Data System (ADS)
Lemoine, Jean-Michel; Capdeville, Hugues; Soudarin, Laurent
2016-12-01
Within the frame of the International DORIS Service (IDS), the CNES/CLS Analysis Center contributes to the geodetic and geophysical research activity through DORIS data analysis. A strategy was developed for the processing of the measurements of the DGXX instruments in RINEX/DORIS format, as it will be the only type of DORIS format made available by CNES, starting with the Jason-3 and Sentinel-3A missions launched at the beginning of the year 2016. The purpose of this paper is to describe the method implemented in the CNES/CLS Analysis Center orbit computation software GINS to process RINEX/DORIS data files. Phase measurements are converted into Doppler counts and then into relative satellite-to-beacon velocities. In this approach, the iono-free phase centers have to be used as the end points of the measurement instead of the 2 GHz phase centers. Given that, the processing results with RINEX/DORIS data are similar to the ones obtained with the usual doris2.2 data, except the scale factor of the Terrestrial Reference Frame in the 7-parameter transform of the network solution. We also address the issue of the scale factor increase from 2012 observed by all the IDS Analysis Centers in their solutions for the ITRF2014 combination. We show that the scale increase in 2012 is dependent on the type of DORIS data used. This scale increase is also enhanced by a bias due to the inclusion of HY-2A data, but which can be eliminated by adopting different coordinates of the onboard DORIS antenna phase center.
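The core conversion the abstract mentions, from accumulated phase (Doppler counts) to a relative satellite-to-beacon velocity, follows the standard Doppler-tracking relation. The sketch below is a generic textbook version under that assumption, not the GINS code; the example frequency is the nominal DORIS 2 GHz channel.

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_count_to_range_rate(cycle_count, interval_s, f0_hz):
    """Average satellite-to-beacon range rate (m/s) over one count interval.

    cycle_count: accumulated Doppler cycles N (differenced phase)
    interval_s:  length of the count interval (s)
    f0_hz:       nominal carrier frequency
    """
    mean_doppler_hz = cycle_count / interval_s
    return C * mean_doppler_hz / f0_hz

# Example: 679 cycles over a 10 s interval on the ~2.03625 GHz DORIS channel.
print(doppler_count_to_range_rate(679.0, 10.0, 2.03625e9))  # ~10 m/s
```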
Introduction to the U.S. Geological Survey's EROS Data Center Sioux Falls, South Dakota
Braconnier, L.A.; Wiepking, P.J.
1980-01-01
The EROS Data Center is a part of the Earth Resources Observation Systems (EROS) Office of the Department of the Interior and is managed by the U.S. Geological Survey. It is the national clearinghouse for the processing and dissemination of spacecraft- and aircraft-acquired images and photographs and electronic data on the Earth's resources. The Center also trains and assists users in the application of such data.
29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND
Kennedy Space Center Launch and Landing Support
NASA Technical Reports Server (NTRS)
Wahlberg, Jennifer
2010-01-01
This presentation describes Kennedy Space Center (KSC) payload processing, facilities and capabilities, and research development and life science experience. Topics include launch site processing, payload processing, key launch site processing roles, leveraging KSC experience, Space Station Processing Facility and capabilities, Baseline Data Collection Facility, Space Life Sciences Laboratory and capabilities, research payload development, International Space Station research flight hardware, KSC flight payload history, and KSC life science expertise.
Saillant, N N; Earl-Royal, E; Pascual, J L; Allen, S R; Kim, P K; Delgado, M K; Carr, B G; Wiebe, D; Holena, D N
2017-02-01
Age is a risk factor for death, adverse outcomes, and health care use following trauma. The American College of Surgeons' Trauma Quality Improvement Program (TQIP) has published "best practices" of geriatric trauma care; adoption of these guidelines is unknown. We sought to determine which evidence-based geriatric protocols, including TQIP guidelines, were correlated with decreased mortality in Pennsylvania's trauma centers. PA's level I and II trauma centers self-reported adoption of geriatric protocols. Survey data were merged with risk-adjusted mortality data for patients ≥65 from a statewide database, the Pennsylvania Trauma Systems Foundation (PTSF), to compare mortality outlier status and processes of care. Exposures of interest were center-specific processes of care; outcome of interest was PTSF mortality outlier status. 26 of 27 eligible trauma centers participated. There was wide variation in care processes. Four trauma centers were low outliers; three centers were high outliers for risk-adjusted mortality rates in adults ≥65. Results remained consistent when accounting for center volume. The only process associated with mortality outlier status was age-specific solid organ injury protocols (p = 0.04). There was no cumulative effect of multiple evidence-based processes on mortality rate (p = 0.50). We did not see a link between adoption of geriatric best-practices trauma guidelines and reduced mortality at PA trauma centers. The increased susceptibility of the elderly to adverse consequences of injury, combined with the rapid growth rate of this demographic, emphasizes the importance of identifying interventions tailored to this population. III. Descriptive.
The CNES Gaia Data Processing Center: A Challenge and its Solutions
NASA Astrophysics Data System (ADS)
Chaoul, Laurence; Valette, Veronique
2011-08-01
After a brief reminder of the ESA Gaia project, this paper presents the data processing consortium (DPAC) and then the CNES data processing centre (DPCC). We focus on the challenges in terms of organisational aspects, processing capabilities, and database volumes, and on how we deal with these topics.
Systems engineering considerations for operational support systems
NASA Technical Reports Server (NTRS)
Aller, Robert O.
1993-01-01
Operations support as considered here is the infrastructure of people, procedures, facilities and systems that provide NASA with the capability to conduct space missions. This infrastructure involves most of the Centers but is concentrated principally at the Johnson Space Center, the Kennedy Space Center, the Goddard Space Flight Center, and the Jet Propulsion Laboratory. It includes mission training and planning, launch and recovery, mission control, tracking, communications, data retrieval and data processing.
Romanian Complex Data Center for Dense Seismic network
NASA Astrophysics Data System (ADS)
Neagoe, Cristian; Ionescu, Constantin; Marius Manea, Liviu
2010-05-01
Since 2002 the National Institute for Earth Physics (NIEP) has developed its own real-time digital seismic network, consisting of 96 seismic stations (35 equipped with broadband sensors and 24 with short-period sensors) and two seismic arrays, which transmit data in real time to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunamis. Seismic stations are equipped with Quanterra Q330 and K2 digitizers, broadband seismometers (STS2, CMG40T, CMG 3ESP, CMG3T) and Kinemetrics Episensor acceleration sensors (+/- 2g). SeedLink, which is part of Seiscomp2.5, and Antelope are the software packages used for real-time (RT) acquisition and data exchange. Communication from the digital seismic stations to the National Data Center in Bucharest and the Eforie Nord Seismic Observatory is assured by five providers (GPRS, VPN, satellite, radio, and Internet). For acquisition and data processing at the two reception and processing centers, Antelope 4.11 is used, running on two workstations (one for real-time and one for offline processing), along with a Seiscomp 3 server that works as back-up for Antelope 4.11. Both the acquisition and analysis systems produce information about local and global earthquake parameters; in addition, Antelope is used for manual processing (event association, magnitude calculation, database creation, distribution of seismic bulletins, calculation of PGA and PGV, etc.), for generating ShakeMap products, and for interacting with global data centers. To make all this information easily available on the Web, and to lay the groundwork for a more modular and flexible development environment, the National Data Center developed tools to centralize data from software such as Antelope, which uses a dedicated database system (Datascope, a database based on text files), into a more general-purpose database, MySQL, which acts as a hub between the different acquisition and analysis systems used in the data center while providing better connectivity at no cost to security. Mirroring certain data to MySQL also allows the National Data Center to easily share information with the public via a new application under development, and to mix in data collected from the public (e.g., reports of damage observed after an earthquake, which in turn are used to produce macroseismic intensity indices that are stored in the database and made available via the web application). For internal use there is also a web application which, using data stored in the database, displays earthquake information such as location, magnitude, and depth in semi-real-time, thus aiding the personnel on duty. Another use of the collected data is to create and maintain contact lists to which the data center sends notifications (SMS and email) based on the parameters of the earthquake. For future development, among other things, the Data Center plans to develop the means to cross-check the generated data between the different acquisition and analysis systems (e.g., comparing data generated by Antelope with data generated by Seiscomp).
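As an illustration of the "MySQL as hub" idea, a mirroring step could look like the following sketch; the table layout, credentials, and event fields are hypothetical, not NIEP's actual schema.

```python
import mysql.connector  # pip install mysql-connector-python

def mirror_event(conn, event):
    """Insert one earthquake record into the general-purpose hub database."""
    sql = ("INSERT INTO events (origin_time, lat, lon, depth_km, magnitude) "
           "VALUES (%s, %s, %s, %s, %s)")
    cur = conn.cursor()
    cur.execute(sql, (event["time"], event["lat"], event["lon"],
                      event["depth_km"], event["mag"]))
    conn.commit()
    cur.close()

# Illustrative values only; connection details are placeholders.
conn = mysql.connector.connect(host="localhost", user="ndc",
                               password="secret", database="hub")
mirror_event(conn, {"time": "2009-04-25 17:18:48", "lat": 45.68,
                    "lon": 26.62, "depth_km": 110.0, "mag": 5.4})
```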
The SPHERE Data Center: a reference for high contrast imaging processing
NASA Astrophysics Data System (ADS)
Delorme, P.; Meunier, N.; Albert, D.; Lagadec, E.; Le Coroller, H.; Galicher, R.; Mouillet, D.; Boccaletti, A.; Mesa, D.; Meunier, J.-C.; Beuzit, J.-L.; Lagrange, A.-M.; Chauvin, G.; Sapone, A.; Langlois, M.; Maire, A.-L.; Montargès, M.; Gratton, R.; Vigan, A.; Surace, C.
2017-12-01
The objective of the SPHERE Data Center is to optimize the scientific return of SPHERE at the VLT, by providing optimized reduction procedures, services to users and publicly available reduced data. This paper describes our motivation, the implementation of the service (partners, infrastructure and developments), services, description of the on-line data, and future developments. The SPHERE Data Center is operational and has already provided reduced data with good reactivity to many observers. The first public reduced data have been made available in 2017. The SPHERE Data Center is gathering strong expertise on SPHERE data and is in a very good position to propose new reduced data in the future, as well as improved reduction procedures.
PCs: Key to the Future. Business Center Provides Sound Skills and Good Attitudes.
ERIC Educational Resources Information Center
Pay, Renee W.
1991-01-01
The Advanced Computing/Management Training Program at Jordan Technical Center (Sandy, Utah) simulates an automated office to teach five sets of skills: computer architecture and operating systems, word processing, data processing, communications skills, and management principles. (SK)
The Future is Hera! Analyzing Astronomical Data Over the Internet
NASA Technical Reports Server (NTRS)
Valencic, L. A.; Chai, P.; Pence, W.; Shafer, R.; Snowden, S.
2008-01-01
Hera is the data processing facility provided by the High Energy Astrophysics Science Archive Research Center (HEASARC) at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the pre-installed software packages, local disk space, and computing resources needed to do general processing of FITS-format data files residing on the user's local computer, and to do research using the publicly available data from the High Energy Astrophysics Division. Qualified students, educators, and researchers may freely use the Hera services over the internet for research and educational purposes.
The Land Processes Distributed Active Archive Center (LP DAAC)
Golon, Danielle K.
2016-10-03
The Land Processes Distributed Active Archive Center (LP DAAC) operates as a partnership with the U.S. Geological Survey and is 1 of 12 DAACs within the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS). The LP DAAC ingests, archives, processes, and distributes NASA Earth science remote sensing data. These data are provided to the public at no charge. Data distributed by the LP DAAC provide information about Earth’s surface from daily to yearly intervals and at 15 to 5,600 meter spatial resolution. Data provided by the LP DAAC can be used to study changes in agriculture, vegetation, ecosystems, elevation, and much more. The LP DAAC provides several ways to access, process, and interact with these data. In addition, the LP DAAC is actively archiving new datasets to provide users with a variety of data to study the Earth.
Energy Demands and Efficiency Strategies in Data Center Buildings
ERIC Educational Resources Information Center
Shehabi, Arman
2009-01-01
Information technology (IT) is becoming increasingly pervasive throughout society as more data is digitally processed, stored, and transferred. The infrastructure that supports IT activity is growing accordingly, and data center energy demands have increased by nearly a factor of four over the past decade. This dissertation investigates how…
National Fuel Cell Technology Evaluation Center | Hydrogen and Fuel Cells | NREL
The National Fuel Cell Technology Evaluation Center (NFCTEC) at NREL's Energy Systems Integration Facility processes and analyzes data for a variety of hydrogen and fuel cell technologies.
NASA Astrophysics Data System (ADS)
Gendre, B.; Giommi, P.
2010-12-01
The ASI Science Data Center (ASDC, www.asdc.asi.it), a facility of the Italian Space Agency (ASI), is a multi-mission science operations, data processing and data archiving center that provides support to several scientific space missions. At the moment the ASDC has significant responsibilities for a number of high-energy astronomy/astroparticle satellites (e.g. Swift, AGILE, Fermi, NuSTAR and AMS) and supports other missions, such as Herschel and Planck, at different levels. The ASDC was established in 2000, building on the experience gained with the management of the BeppoSAX Science Data Center. It is located at the ESA site of ESRIN in Frascati, near Rome (Italy).
NASA Astrophysics Data System (ADS)
Mungov, G.; Dunbar, P. K.; Stroker, K. J.; Sweeney, A.
2016-12-01
The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) is the data repository for high-resolution, integrated water-level data supporting tsunami research, risk assessment, and mitigation to protect life and property along the coasts. NCEI responsibilities include, but are not limited to, processing, archiving, and distributing coastal water-level data from different sources, covering tsunami and storm-surge inundation, sea-level change, climate variability, etc. High-resolution data for global historical tsunami events are collected from the Deep-ocean Assessment and Reporting of Tsunami (DART®) tsunameter network maintained by NOAA's National Data Buoy Center (NDBC), coastal tide gauges maintained by NOAA's Center for Operational Oceanographic Products and Services (CO-OPS) and the Tsunami Warning Centers, historic marigrams and images, bathymetric data, and other national and international sources. The NCEI water-level database is developed in close collaboration with all data providers along with NOAA's Pacific Marine Environmental Laboratory. We outline here the present state of water-level data processing in view of the increasing need for high-precision, homogeneous and "clean" tsunami records from different data sources and different sampling intervals. Two tidal models are compared: Mike Foreman's improved oceanographic model (2009) and the Akaike Bayesian Information Criterion approach applied by Tamura et al. (1991). The effects of filtering and the limits of its application are also discussed, along with the method used for de-spiking the raw time series.
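De-spiking of raw water-level series, as mentioned in the closing sentence, is commonly done by flagging samples that deviate from a local median; the sketch below is one generic approach under that assumption, not NCEI's production algorithm.

```python
import numpy as np

def despike(series, window=11, n_mad=5.0):
    """Flag samples deviating from a rolling median by > n_mad scaled MADs."""
    x = np.asarray(series, dtype=float)
    half = window // 2
    padded = np.pad(x, half, mode="edge")
    med = np.array([np.median(padded[i:i + window]) for i in range(x.size)])
    mad = np.array([np.median(np.abs(padded[i:i + window] - med[i]))
                    for i in range(x.size)])
    spikes = np.abs(x - med) > n_mad * 1.4826 * np.maximum(mad, 1e-9)
    cleaned = x.copy()
    cleaned[spikes] = med[spikes]   # replace spikes with the local median
    return cleaned, spikes

raw = np.sin(np.linspace(0, 6, 200))
raw[50] += 3.0                      # inject an artificial spike
clean, flags = despike(raw)
print(int(flags.sum()), "sample(s) flagged")  # the injected spike is caught
```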
Davenport, Paul B; Carter, Kimberly F; Echternach, Jeffrey M; Tuck, Christopher R
2018-02-01
High-reliability organizations (HROs) demonstrate unique and consistent characteristics, including operational sensitivity and control, situational awareness, hyperacute use of technology and data, and actionable process transformation. System complexity and reliance on information-based processes challenge healthcare organizations to replicate HRO processes. This article describes a healthcare organization's 3-year journey to achieve key HRO features to deliver high-quality, patient-centric care via an operations center powered by the principles of high-reliability data and software to impact patient throughput and flow.
Data management for community research projects: A JGOFS case study
NASA Technical Reports Server (NTRS)
Lowry, Roy K.
1992-01-01
Since the mid 1980s, much of the marine science research effort in the United Kingdom has been focused into large scale collaborative projects involving public sector laboratories and university departments, termed Community Research Projects. Two of these, the Biogeochemical Ocean Flux Study (BOFS) and the North Sea Project incorporated large scale data collection to underpin multidisciplinary modeling efforts. The challenge of providing project data sets to support the science was met by a small team within the British Oceanographic Data Centre (BODC) operating as a topical data center. The role of the data center was to both work up the data from the ship's sensors and to combine these data with sample measurements into online databases. The working up of the data was achieved by a unique symbiosis between data center staff and project scientists. The project management, programming and data processing skills of the data center were combined with the oceanographic experience of the project communities to develop a system which has produced quality controlled, calibrated data sets from 49 research cruises in 3.5 years of operation. The data center resources required to achieve this were modest and far outweighed by the time liberated in the scientific community by the removal of the data processing burden. Two online project databases have been assembled containing a very high proportion of the data collected. As these are under the control of BODC their long term availability as part of the UK national data archive is assured. The success of the topical data center model for UK Community Research Project data management has been founded upon the strong working relationships forged between the data center and project scientists. These can only be established by frequent personal contact and hence the relatively small size of the UK has been a critical factor. However, projects covering a larger, even international scale could be successfully supported by a network of topical data centers managing online databases which are interconnected by object oriented distributed data management systems over wide area networks.
NASA Wallops Flight Center GEOS-3 altimeter data processing report
NASA Technical Reports Server (NTRS)
Stanley, H. R.; Dwyer, R. E.
1980-01-01
The procedures used to process the GEOS-3 radar altimeter data from raw telemetry data to a final user data product are described. In addition, the radar altimeter hardware design and operating parameters are presented to aid the altimeter user in understanding the altimeter data.
Petascale Computing for Ground-Based Solar Physics with the DKIST Data Center
NASA Astrophysics Data System (ADS)
Berukoff, Steven J.; Hays, Tony; Reardon, Kevin P.; Spiess, DJ; Watson, Fraser; Wiant, Scott
2016-05-01
When construction is complete in 2019, the Daniel K. Inouye Solar Telescope will be the most-capable large aperture, high-resolution, multi-instrument solar physics facility in the world. The telescope is designed as a four-meter off-axis Gregorian, with a rotating Coude laboratory designed to simultaneously house and support five first-light imaging and spectropolarimetric instruments. At current design, the facility and its instruments will generate data volumes of 3 PB per year, and produce 10^7-10^9 metadata elements. The DKIST Data Center is being designed to store, curate, and process this flood of information, while providing association of science data and metadata to its acquisition and processing provenance. The Data Center will produce quality-controlled calibrated data sets, and make them available freely and openly through modern search interfaces and APIs. Documented software and algorithms will also be made available through community repositories like Github for further collaboration and improvement. We discuss the current design and approach of the DKIST Data Center, describing the development cycle, early technology analysis and prototyping, and the roadmap ahead. We discuss our iterative development approach, the underappreciated challenges of calibrating ground-based solar data, the crucial integration of the Data Center within the larger Operations lifecycle, and how software and hardware support, intelligently deployed, will enable high-caliber solar physics research and community growth for the DKIST's 40-year lifespan.
NASA Technical Reports Server (NTRS)
Beyon, Jeffrey Y.; Arthur, Grant E.; Koch, Grady J.; Kavaya, Michael J.
2012-01-01
Two different noise whitening methods in airborne wind profiling with a pulsed 2-micron coherent Doppler lidar system at NASA Langley Research Center in Virginia are presented. In order to provide accurate wind parameter estimates from the airborne lidar data acquired during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in 2010, the adverse effects of background instrument noise must be compensated properly in the early stage of data processing. The results of the two methods are presented using selected GRIP data and compared with the dropsonde data for verification purposes.
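The paper's specific whitening methods are not spelled out in the abstract; as a point of reference, a generic spectral whitening step (estimate the background noise spectrum from signal-free records, then flatten each record's spectrum by it) might look like this sketch.

```python
import numpy as np

def estimate_noise_spectrum(noise_records):
    """Average magnitude spectrum of signal-free records (one per row)."""
    return np.abs(np.fft.rfft(noise_records, axis=1)).mean(axis=0)

def whiten(record, noise_spectrum, eps=1e-12):
    """Flatten a record's spectrum by the noise estimate, then invert."""
    spec = np.fft.rfft(record)
    return np.fft.irfft(spec / (noise_spectrum + eps), n=record.size)

rng = np.random.default_rng(0)
kernel = np.array([0.5, 1.0, 0.5])  # low-pass kernel -> "colored" noise
records = np.array([np.convolve(rng.normal(size=1024), kernel, mode="same")
                    for _ in range(64)])
noise_est = estimate_noise_spectrum(records[:32])  # signal-free training set
flat = whiten(records[32], noise_est)              # whitened record
```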
Modular Filter and Source-Management Upgrade of RADAC
NASA Technical Reports Server (NTRS)
Lanzi, R. James; Smith, Donna C.
2007-01-01
In an upgrade of the Range Data Acquisition Computer (RADAC) software, a modular software object library was developed to implement required functionality for filtering of flight-vehicle-tracking data and management of tracking-data sources. (The RADAC software is used to process flight-vehicle metric data for real-time display in the Wallops Flight Facility Range Control Center and Mobile Control Center.)
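The abstract does not reproduce the filter library itself, so the following is only a generic illustration of the kind of tracking-data filter such a modular library might wrap: a classic alpha-beta filter smoothing noisy position reports.

```python
class AlphaBetaFilter:
    """Smooths noisy 1-D position reports and estimates velocity."""

    def __init__(self, alpha=0.85, beta=0.3, dt=1.0):
        self.alpha, self.beta, self.dt = alpha, beta, dt
        self.x = 0.0  # position estimate
        self.v = 0.0  # velocity estimate

    def update(self, measurement):
        predicted = self.x + self.v * self.dt    # predict ahead one step
        residual = measurement - predicted       # innovation
        self.x = predicted + self.alpha * residual
        self.v += (self.beta / self.dt) * residual
        return self.x

flt = AlphaBetaFilter()
for z in [0.0, 1.1, 1.9, 3.2, 3.9, 5.1]:  # noisy track, ~1 unit per step
    print(round(flt.update(z), 2))
```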
ERIC Educational Resources Information Center
Ammann, Charles
2010-01-01
For twenty-nine years, Red Clay Consolidated School District has managed data processing in a unique manner. Red Clay participates in the Data Service Center consortium to provide data management and processing services. This consortium is more independent than a department in the district but not as autonomous as an outsourced arrangement. While…
Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop
NASA Astrophysics Data System (ADS)
Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.
2018-04-01
The data center is a new concept of data processing and application proposed in recent years. It is a new processing method based on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes cluster computing nodes and improves the efficiency of parallel data applications. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it calls many computing nodes to process image storage blocks and pyramids in the background, improving the efficiency of image reading and application and solving the need for concurrent multi-user high-speed access to remotely sensed data. It verified the rationality, reliability and superiority of the system design by testing the storage efficiency for different image data and multiple users, and by analyzing how the distributed storage architecture improves the application efficiency of remote sensing images in an actual Hadoop service system.
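The MapReduce pattern the paper applies to image blocks and pyramids can be illustrated in miniature. The toy below runs in a single process (the real system runs on a Hadoop cluster); the tile size and the 2x2-averaging pyramid rule are illustrative choices, not the paper's parameters.

```python
import numpy as np

def map_tiles(image, tile=256):
    """Map phase: cut a scene into (row, col)-keyed storage blocks."""
    for r in range(0, image.shape[0], tile):
        for c in range(0, image.shape[1], tile):
            yield (r // tile, c // tile), image[r:r + tile, c:c + tile]

def reduce_pyramid(keyed_tiles):
    """Reduce phase: build the next pyramid level by 2x2 averaging."""
    level = {}
    for key, t in keyed_tiles:
        level[key] = t.reshape(t.shape[0] // 2, 2,
                               t.shape[1] // 2, 2).mean(axis=(1, 3))
    return level

scene = np.random.rand(512, 512)          # toy single-band scene
level1 = reduce_pyramid(map_tiles(scene))
print(len(level1), level1[(0, 0)].shape)  # 4 tiles, each 128x128
```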
Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho
2018-01-01
Background: Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality, therefore it is necessary to build an information technology system to support standardized clinical trial processes and comply with relevant regulations. Objective: The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, security, and privacy regulations. Methods: This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. Results: In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with existing systems such as the electronic institutional review board, the health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws. Since December 2014, the CTMS has been successfully implemented and used by 881 internal and external users for managing 11,645 studies and 146,943 subjects. Conclusions: The CTMS was introduced in the Asan Medical Center to manage the large amounts of data involved with clinical trial operations. Inter- and intraunit control of data and resources can be easily conducted through the CTMS system. To our knowledge, this is the first CTMS developed in-house at an academic medical center site, which can enhance the efficiency of clinical trial management in compliance with privacy and security laws. PMID:29691212
Adaptation of a Control Center Development Environment for Industrial Process Control
NASA Technical Reports Server (NTRS)
Killough, Ronnie L.; Malik, James M.
1994-01-01
In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.
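The GCB language itself is not shown in the paper; as a stand-in, the kind of computation it hosts (combine telemetry values, apply unit-conversion macros, derive a display parameter) might look like this Python sketch, with all names and values illustrative.

```python
import math

FT_PER_M = 3.28084                      # unit-conversion "macro" equivalent

def ground_speed_kts(vx_mps, vy_mps):
    """Derived parameter: horizontal speed from two telemetry velocities."""
    return math.hypot(vx_mps, vy_mps) * 1.94384   # m/s -> knots

def altitude_ft(raw_alt_m):
    """Unit conversion for display readability."""
    return raw_alt_m * FT_PER_M

# Hypothetical raw telemetry words after decommutation.
telemetry = {"vx": 102.0, "vy": 35.0, "alt": 12000.0}
print(f"GS {ground_speed_kts(telemetry['vx'], telemetry['vy']):.0f} kt, "
      f"ALT {altitude_ft(telemetry['alt']):.0f} ft")
```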
The Atmospheric Data Acquisition And Interpolation Process For Center-TRACON Automation System
NASA Technical Reports Server (NTRS)
Jardin, M. R.; Erzberger, H.; Denery, Dallas G. (Technical Monitor)
1995-01-01
The Center-TRACON Automation System (CTAS), an advanced new air traffic automation program, requires knowledge of spatial and temporal atmospheric conditions such as the wind speed and direction, the temperature and the pressure in order to accurately predict aircraft trajectories. Real-time atmospheric data is available in a grid format, so CTAS must interpolate between the grid points to estimate the atmospheric parameter values. The atmospheric data grid is generally not in the same coordinate system as that used by CTAS, so coordinate conversions are required. Both the interpolation and coordinate conversion processes can introduce errors into the atmospheric data and reduce interpolation accuracy. More accurate algorithms may be computationally expensive or may require a prohibitively large amount of data storage capacity, so trade-offs must be made between accuracy and the available computational and data storage resources. The atmospheric data acquisition and processing employed by CTAS will be outlined in this report. The effects of atmospheric data processing on CTAS trajectory prediction will also be analyzed, and several examples of the trajectory prediction process will be given.
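The grid-interpolation step described above can be illustrated with bilinear interpolation on a regular lat/lon grid; this is a minimal sketch for interior points only, not CTAS's actual algorithms or coordinate conversions, and the wind field values are made up.

```python
import numpy as np

def bilinear(grid, lats, lons, lat, lon):
    """Interpolate a 2-D field at (lat, lon) from a regular grid."""
    i = np.searchsorted(lats, lat) - 1   # lower grid row
    j = np.searchsorted(lons, lon) - 1   # lower grid column
    t = (lat - lats[i]) / (lats[i + 1] - lats[i])
    u = (lon - lons[j]) / (lons[j + 1] - lons[j])
    return ((1 - t) * (1 - u) * grid[i, j] + t * (1 - u) * grid[i + 1, j]
            + (1 - t) * u * grid[i, j + 1] + t * u * grid[i + 1, j + 1])

lats = np.array([30.0, 31.0, 32.0])
lons = np.array([-98.0, -97.0, -96.0])
wind_u = np.array([[10.0, 12.0, 14.0],
                   [11.0, 13.0, 15.0],
                   [12.0, 14.0, 16.0]])   # eastward wind, m/s
print(bilinear(wind_u, lats, lons, 30.5, -97.25))  # -> 12.0
```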
"SDI--Where are We? The Challenge of the Future." The Information Dissemination Center View.
ERIC Educational Resources Information Center
Carmon, James L.
The historical and current status of information dissemination centers and the problem of user interface are reviewed. During the past decade, the problems of technical data processing have been conquered; information dissemination has evolved from a loosely knit group of experimental centers to an organization of established centers, many…
Value-added Data Services at the Goddard Earth Sciences Data and Information Services Center
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory G.; Alcott, Gary T.; Kempler, Steven J.; Lynnes, Christopher S.; Vollmer, Bruce E.
2004-01-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC), in addition to serving the Earth Science community as one of the major Distributed Active Archive Centers (DAACs), provides much more than just data. Among the value-added services available to general users are subsetting data spatially and/or by parameter, online analysis (to avoid downloading unnecessarily all the data), and assistance in obtaining data from other centers. Services available to data producers and high-volume users include consulting on building new products with standard formats and metadata and construction of data management systems. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the user's algorithm. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into the web-based tools. Partnerships between the GES DISC and scientists, both producers and users, allow the scientists to concentrate on science, while the GES DISC handles the data management, e.g., formats, integration, and data processing. The existing data management infrastructure at the GES DISC supports a wide spectrum of options: from simple data support to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. At the same time, such partnerships allow the GES DISC to serve the user community more efficiently and to better prioritize on-line holdings. Several examples of successful partnerships are described in the presentation.
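The two subsetting services mentioned (spatial and by parameter) reduce what the user must download. A schematic of both, assuming a gridded product held as an xarray Dataset with illustrative variable and coordinate names, not an actual GES DISC product schema:

```python
import numpy as np
import xarray as xr

# Toy 1-degree global product with two variables.
ds = xr.Dataset(
    {"aerosol_optical_depth": (("lat", "lon"), np.random.rand(180, 360)),
     "cloud_fraction": (("lat", "lon"), np.random.rand(180, 360))},
    coords={"lat": np.arange(-89.5, 90.5), "lon": np.arange(-179.5, 180.5)},
)

# Parameter subsetting: keep only the variable the user asked for.
subset = ds[["aerosol_optical_depth"]]
# Spatial subsetting: clip to a lat/lon box before delivery.
subset = subset.sel(lat=slice(0, 40), lon=slice(-30, 30))
print(subset.aerosol_optical_depth.shape)   # (40, 60)
```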
Romanian Data Center: A modern way for seismic monitoring
NASA Astrophysics Data System (ADS)
Neagoe, Cristian; Marius Manea, Liviu; Ionescu, Constantin
2014-05-01
The main seismic monitoring of Romania is performed by the National Institute for Earth Physics (NIEP), which operates a real-time digital seismic network. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Mark l4c, Ranger, gs21, Mark l22) and acceleration sensors (Episensor Kinemetrics). The data are transmitted to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunami events. NIEP is a data acquisition node for the seismic network of Moldova (FDSN code MD), composed of five seismic stations. NIEP has installed eight seismic stations equipped with broadband sensors and Episensors in the northern part of Bulgaria, and nine accelerometers (Episensors) in nine districts along the Danube River. All the data are acquired at NIEP for the Early Warning System and for primary estimation of the earthquake parameters. Real-time acquisition (RT) and data exchange are done with Antelope software and Seedlink (from Seiscomp3). Real-time data communication is ensured by different types of transmission: GPRS, satellite, radio, Internet and a dedicated line provided by a governmental network. For data processing and analysis at the two data centers, Antelope 5.2 is used, running on three workstations: one on a CentOS platform and two on MacOS. A Seiscomp3 server also stands as back-up for Antelope 5.2. Both the acquisition and analysis systems produce information about local and global earthquake parameters. In addition, Antelope is used for manual processing (event association, calculation of magnitude, creating a database, sending seismic bulletins, calculation of PGA and PGV, etc.), generating ShakeMap products and interaction with global data centers. The National Data Center developed tools to centralize data from software like Antelope and Seiscomp3. These tools allow rapid distribution of information about damage observed after an earthquake to the public. Another feature of the developed application is the alerting of designated persons, via email and SMS, based on the earthquake parameters. In parallel, Seiscomp3 sends automatic notifications (emails) with the earthquake parameters. The real-time seismic network and the acquisition and data processing software used in the National Data Center have increased the number of events detected locally and globally, improved the quality of the parameters obtained by data processing, and raised the network's national and international visibility.
Data archiving and network system of Bisei Spaceguard center
NASA Astrophysics Data System (ADS)
Terazono, J.-Y.; Asami, A.; Asher, D.; Hashimoto, N.; Nakano, S.; Nishiyama, K.; Oshima, Y.; Umehara, H.; Urata, T.; Yoshikawa, M.; Isobe, S.
2002-09-01
Bisei Spaceguard Center, Japan's first facility for observations of space debris and Near-Earth Objects (NEOs), will produce large amounts of data. In this paper, we describe details of the data transfer and processing system we are now developing. We also present a software system devoted to the discovery of asteroids, mainly by high school students.
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory G.
2005-01-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is one of the major Distributed Active Archive Centers (DAACs) archiving and distributing remote sensing data from NASA's Earth Observing System. In addition to providing just data, the GES DISC/DAAC has developed various value-adding processing services. A particularly useful service is data processing at the DISC (i.e., close to the input data) with the users' algorithms. This can take a number of different forms: as a configuration-managed algorithm within the main processing stream; as a stand-alone program next to the on-line data storage; as build-it-yourself code within the Near-Archive Data Mining (NADM) system; or as an on-the-fly analysis with simple algorithms embedded into the web-based tools (to avoid unnecessarily downloading all the data). The existing data management infrastructure at the GES DISC supports a wide spectrum of options: from subsetting data spatially and/or by parameter to sophisticated on-line analysis tools, producing economies of scale and rapid time-to-deploy. Shifting the processing and data management burden from users to the GES DISC allows scientists to concentrate on science, while the GES DISC handles the data management and data processing at a lower cost. Several examples of successful partnerships with scientists in the area of data processing and mining are presented.
NASA Astrophysics Data System (ADS)
Kurnianto, Ari; Isnanto, Rizal; Widodo, Aris Puji
2018-02-01
Information security problems affect the business processes of an organization, so they need special attention. An information security assessment that is sound and follows an international standard can be performed using the Information Security Management System (ISMS) standard ISO/IEC 27001:2013. In this research, a high-level assessment was performed using ISO/IEC 27001:2013 to gauge the strength of information security in the Ministry of Internal Affairs. The research describes an assessment of information security management built using PHP. The input uses primary and secondary data gathered through observation. The process derives a maturity rating using the ISO/IEC 27001:2013 assessment. A gap analysis compares the current condition with the standard in order to produce recommendations and a road map. The results show that not all information security processes in the Ministry of Internal Affairs are yet adequate; the research provides recommendations and a road map to improve the parts of the information systems currently in operation. This indicates that ISO/IEC 27001:2013 is well suited to rating the maturity of information security management. For further analysis, this research uses the clauses and annex of ISO/IEC 27001:2013 that fit the condition of the Data Center and Data Recovery Center, in order to obtain an optimal result and address weaknesses in information security.
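The maturity-and-gap computation at the heart of such an assessment is simple to sketch; the real application was written in PHP, and the clause names, 0-5 scale, and target below are illustrative only.

```python
TARGET = 3.0   # minimum acceptable maturity per control area (illustrative)

# Hypothetical maturity scores per Annex A control area, 0-5 scale.
scores = {
    "A.5 Information security policies": 2.0,
    "A.9 Access control": 3.5,
    "A.12 Operations security": 1.5,
    "A.17 Business continuity": 2.5,
}

overall = sum(scores.values()) / len(scores)
gaps = {clause: TARGET - s for clause, s in scores.items() if s < TARGET}

print(f"overall maturity: {overall:.2f}")
# Road map: address the largest gaps first.
for clause, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"road map priority: {clause} (gap {gap:.1f})")
```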
Electrostatic Levitation: A Tool to Support Materials Research in Microgravity
NASA Technical Reports Server (NTRS)
Rogers, Jan; SanSoucie, Mike
2012-01-01
Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts, and high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.
Water level ingest, archive and processing system - an integral part of NOAA's tsunami database
NASA Astrophysics Data System (ADS)
McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.
2013-12-01
The National Oceanic and Atmospheric Administration (NOAA) National Geophysical Data Center (NGDC) and the collocated World Data Service for Geophysics (WDS) provide long-term archive, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database, damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal-tide-gauge data from the National Ocean Service (NOS) network and tide-gauge data from the two National Weather Service (NWS) Tsunami Warning Centers (TWCs) regional networks. Taken together, this integrated archive supports tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, along with upgrading the inventory, archive and delivery capabilities based on modern digital data archiving practices. The data processing system was also upgraded and redesigned focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the Feb 06, 2013 Solomon Islands tsunami, and the June 13, 2013 meteotsunami along the U.S. East Coast.
Fields, Dail; Roman, Paul M; Blum, Terry C
2012-06-01
To examine the relationships among general management systems, patient-focused total quality management/continuous quality improvement (TQM/CQI) processes, resource availability, and multiple dimensions of substance use disorder (SUD) treatment. Data are from a nationally representative sample of 221 SUD treatment centers through the National Treatment Center Study (NTCS). The design was a cross-sectional field study using latent variable structural equation models. The key variables are management practices, TQM/CQI practices, resource availability, and treatment center performance. Interviews and questionnaires provided data from treatment center administrative directors and clinical directors in 2007-2008. Patient-focused TQM/CQI practices fully mediated the relationship between internal management practices and performance. The effects of TQM/CQI on performance are significantly larger for treatment centers with higher levels of staff per patient. Internal management practices may create a setting that supports implementation of specific patient-focused practices and protocols inherent to TQM/CQI processes. However, the positive effects of internal management practices on treatment center performance occur through use of specific patient-focused TQM/CQI practices and have more impact when greater amounts of supporting resources are present. © Health Research and Educational Trust.
ERIC Educational Resources Information Center
Association for Educational Data Systems, Washington, DC.
Fifteen papers on computer centers and data processing management presented at the Association for Educational Data Systems (AEDS) 1976 convention are included in this document. The first two papers review the recent controversy for proposed licensing of data processors, and they are followed by a description of the Institute for Certification of…
NASA Technical Reports Server (NTRS)
Mah, G. R.; Myers, J.
1993-01-01
The U.S. Government has initiated the Global Change Research Program, a systematic study of the Earth as a complete system. NASA's contribution to the Global Change Research Program is the Earth Observing System (EOS), a series of orbital sensor platforms and an associated data processing and distribution system. The EOS Data and Information System (EOSDIS) is the archiving, production, and distribution system for data collected by the EOS space segment and uses a multilayer architecture for processing, archiving, and distributing EOS data. The first layer consists of the spacecraft ground stations and processing facilities that receive the raw data from the orbiting platforms and then separate the data by individual sensors. The second layer consists of Distributed Active Archive Centers (DAACs) that process, distribute, and archive the sensor data. The third layer consists of a user science processing network. The EOSDIS is being developed in a phased implementation. The initial phase, Version 0, is a prototype of the operational system. Version 0 activities are based upon existing systems and are designed to provide an EOSDIS-like capability for information management and distribution. An important science support task is the creation of simulated data sets for EOS instruments from precursor aircraft or satellite data. The Land Processes DAAC, at the EROS Data Center (EDC), is responsible for archiving and processing EOS precursor data from airborne instruments such as the Thermal Infrared Multispectral Scanner (TIMS), the Thematic Mapper Simulator (TMS), and the Airborne Visible and Infrared Imaging Spectrometer (AVIRIS). AVIRIS, TIMS, and TMS are flown by the NASA-Ames Research Center (ARC) on an ER-2. The ER-2 flies at 65000 feet and can carry up to three sensors simultaneously. Most jointly collected data sets are somewhat boresighted and roughly registered. The instrument data are being used to construct data sets that simulate the spectral and spatial characteristics of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument scheduled to be flown on the first EOS-AM spacecraft. The ASTER is designed to acquire 14 channels of land science data in the visible and near-IR (VNIR), shortwave-IR (SWIR), and thermal-IR (TIR) regions from 0.52 micron to 11.65 micron at high spatial resolutions of 15 m to 90 m. Stereo data will also be acquired in the VNIR region in a single band. The AVIRIS and TMS cover the ASTER VNIR and SWIR bands, and the TIMS covers the TIR bands. Simulated ASTER data sets have been generated over Death Valley, California, Cuprite, Nevada, and the Drum Mountains, Utah, using a combination of AVIRIS, TIMS, and TMS data, and existing digital elevation models (DEM) for the topographic information.
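The band-synthesis step described above, simulating an ASTER channel from AVIRIS data, can be sketched as a spectral average; real simulations also apply the instrument's spectral response and spatial resampling, and the channel spacing and band limits below are approximate.

```python
import numpy as np

def simulate_band(aviris_cube, aviris_centers_um, lo_um, hi_um):
    """Average all AVIRIS channels whose centers fall in [lo, hi] microns."""
    idx = np.where((aviris_centers_um >= lo_um) & (aviris_centers_um <= hi_um))[0]
    return aviris_cube[idx].mean(axis=0)

# Toy cube: 224 AVIRIS channels, 100x100 pixels, ~9.4 nm spacing from 0.4 um.
centers = 0.4 + 0.0094 * np.arange(224)
cube = np.random.rand(224, 100, 100)

aster_band2 = simulate_band(cube, centers, 0.63, 0.69)  # ASTER VNIR band 2 (red)
print(aster_band2.shape)   # (100, 100)
```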
The application of automated operations at the Institutional Processing Center
NASA Technical Reports Server (NTRS)
Barr, Thomas H.
1993-01-01
The JPL Institutional and Mission Computing Division, Communications, Computing and Network Services Section, with its mission contractor, OAO Corporation, have for some time been applying automation to the operation of JPL's Information Processing Center (IPC). Automation does not come in one easy-to-use package. Automation for a data processing center is made up of many different software and hardware products supported by trained personnel. The IPC automation effort formally began with console automation, and has since spiraled out to include production scheduling, data entry, report distribution, online reporting, failure reporting and resolution, documentation, library storage, and operator and user education, while requiring the interaction of multi-vendor and locally developed software. To begin the process, automation goals are determined. Then a team including operations personnel is formed to research and evaluate available options. By acquiring knowledge of current products and those in development, taking an active role in industry organizations, and learning from other data centers' experiences, a forecast can be developed as to what direction technology is moving. With IPC management's approval, an implementation plan is developed and resources identified to test or implement new systems. As an example, IPC's new automated data entry system was researched by Data Entry, Production Control, and Advance Planning personnel. A proposal was then submitted to management for review. A determination to implement the new system was made, and the elements/personnel involved with the initial planning performed the implementation. The final steps of the implementation were educating data entry personnel in the areas affected and making the procedural changes necessary for the successful operation of the new system.
NASA Astrophysics Data System (ADS)
Lutz, B. J.; Marquis, M.
2001-12-01
The Aqua and ICESat missions are components of the Earth Observing System (EOS). The Advanced Microwave Scanning Radiometer (AMSR-E) instrument will fly on the Aqua satellite planned for launch in Spring 2002. AMSR-E is a passive microwave instrument, modified from the AMSR instrument, which will be deployed on the Japanese Advanced Earth Observing Satellite-II (ADEOS-II). AMSR-E will observe the atmosphere, land, oceans, and cryosphere, yielding measurements of precipitation, cloud water, water vapor, surface wetness, sea surface temperatures, oceanic wind speed, sea ice concentrations, snow depth, and snow water content. The Geoscience Laser Altimeter System (GLAS) instrument will fly aboard the ICESat satellite scheduled for launch in Summer 2002. This instrument will measure ice-sheet topography and temporal changes in topography; cloud heights, planetary boundary layer heights, and aerosol vertical structure; and land and water topography. The GLAS and AMSR-E teams have both chosen to utilize Science Investigator-led Processing Systems (SIPS) to process their respective EOS data products. The SIPS facilities are funded by the Earth Science Data and Information System (ESDIS) Project at NASA's Goddard Space Flight Center and operated under the direction of a science team leader. The SIPS capitalize upon the scientific expertise of the science teams and the distributed processing capabilities of their institutions. The SIPS are charged with routine production of their respective EOS data products for archival at a Distributed Active Archive Center (DAAC). The National Snow and Ice Data Center (NSIDC) DAAC in Boulder, Colorado will archive all AMSR-E and GLAS data products. The NSIDC DAAC will distribute these data products to users throughout the world. The SIPS processing flows of both teams are rather complex. The AMSR-E SIPS is composed of three separate processing facilities (Japan, California, and Alabama). The ICESat SIPS is composed of one main processing center (Maryland) and an important secondary data set processing center (Texas) that generates required auxiliary data products. The EOSDIS Core System (ECS) has developed extensive protocols and procedures to ensure timeliness and completeness of delivery of the data from the SIPS to the DAACs. The NSIDC DAAC, in addition to being the repository of AMSR-E and GLAS data products, provides enhanced services, documentation and guides for these data. NSIDC is a liaison between the science teams and the user community. This poster will display flow diagrams showing: a) the AMSR-E and the ICESat SIPS, and the process of how their Level 1, 2, and 3 data products are generated; b) the staging and delivery of these sets of data to the NSIDC DAAC for archival, and the ECS protocols required to ensure delivery; and c) the services and "value-added" products that the NSIDC DAAC provides to the user community in support of the Aqua (AMSR-E) and ICESat missions.
Firnkorn, D; Ganzinger, M; Muley, T; Thomas, M; Knaup, P
2015-01-01
Joint data analysis is a key requirement in medical research networks. Data are available in heterogeneous formats at each network partner, and their harmonization is often rather complex. The objective of our paper is to provide a generic approach for the harmonization process in research networks. We applied the process when harmonizing data from three sites for the Lung Cancer Phenotype Database within the German Center for Lung Research. We developed a spreadsheet-based solution as a tool to support the harmonization process for lung cancer data and a data integration procedure based on Talend Open Studio. The harmonization process consists of eight steps describing a systematic approach for defining and reviewing source data elements and standardizing common data elements. The steps for defining common data elements and harmonizing them with local data definitions are repeated until consensus is reached. Application of this process for building the phenotype database led to a common basic data set on lung cancer with 285 structured parameters. The Lung Cancer Phenotype Database was realized as an i2b2 research data warehouse. Data harmonization is a challenging task requiring informatics skills as well as domain knowledge. Our approach facilitates data harmonization by providing guidance through a uniform process that can be applied in a wide range of projects.
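The consensus-driven mapping of local data elements onto common data elements lends itself to a simple illustration. The following Python sketch is hypothetical: the site names, element names, and codes are invented, and the real process also covers units, data types, and review steps; unmapped codes are returned for review, mirroring the repeat-until-consensus loop described above.

    # Hypothetical mapping from each site's local element names and codes
    # to a common data element; these names are illustrative, not the
    # actual Lung Cancer Phenotype Database schema.
    SITE_MAPPINGS = {
        "site_a": {"element": "tumour_stage",
                   "codes": {"I": "stage_1", "II": "stage_2"}},
        "site_b": {"element": "t_stage",
                   "codes": {"1": "stage_1", "2": "stage_2"}},
    }

    def harmonize(site, record):
        """Translate one local record into the common data element vocabulary."""
        mapping = SITE_MAPPINGS[site]
        local_value = record[mapping["element"]]
        try:
            return {"stage": mapping["codes"][local_value]}
        except KeyError:
            # Unmapped local codes go back to the consensus step of the process.
            return {"stage": None, "needs_review": local_value}

    print(harmonize("site_a", {"tumour_stage": "II"}))  # {'stage': 'stage_2'}
    print(harmonize("site_b", {"t_stage": "3"}))        # flagged for review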
School Data Processing Services in Texas. A Cooperative Approach. [Revised.
ERIC Educational Resources Information Center
Texas Education Agency, Austin. Management Information Center.
The Texas plan for computer services provides services to public school districts through a statewide network of 20 regional Education Service Centers (ESC). Each of the three Multi-Regional Processing Centers (MRPCs) operates a large computer facility providing school district services within from three to eight ESC regions; each of the five…
School Data Processing Services in Texas: A Cooperative Approach.
ERIC Educational Resources Information Center
Texas Education Agency, Austin.
The Texas plan for computer services provides services to public school districts through a statewide network of 20 regional Education Service Centers (ESC). Each of the three Multi-Regional Processing Centers (MRPCs) operates a large computer facility providing school district services within from three to eight ESC regions; each of the five…
The Kepler Science Operations Center Pipeline Framework Extensions
NASA Technical Reports Server (NTRS)
Klaus, Todd C.; Cote, Miles T.; McCauliff, Sean; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Chandrasekaran, Hema; Bryson, Stephen T.; Middour, Christopher; Caldwell, Douglas A.;
2010-01-01
The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit of work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process these data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.
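As a rough illustration of the unit-of-work idea, the Python sketch below partitions a target list into fixed-size, self-contained work packages; the actual Kepler framework is written in Java and uses pluggable generators keyed to mission-specific partitions (e.g., CCD module/output), so this is only the simplest possible analogue.

    from dataclasses import dataclass

    @dataclass
    class UnitOfWork:
        """One self-contained work package: everything a worker needs."""
        task_id: int
        target_ids: list

    def generate_units(target_ids, chunk_size):
        """Partition the full target list into fixed-size units of work.
        Real generators might instead split by sky region or CCD channel;
        fixed-size chunking is the simplest case."""
        n_units = (len(target_ids) + chunk_size - 1) // chunk_size
        return [UnitOfWork(i, target_ids[i * chunk_size:(i + 1) * chunk_size])
                for i in range(n_units)]

    units = generate_units(list(range(10_000)), chunk_size=512)
    print(len(units), "units; last holds", len(units[-1].target_ids), "targets")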
75 FR 34452 - Center for Drug Evaluation and Research Data Standards Plan; Availability for Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-17
... identifies key objectives for a data standards program at CDER, processes to be developed to ensure... public health mission. At present, the lack of standardized data affects CDER's review processes by...-emerging issues. Standardization of data submissions, a requirement for electronic submissions, and a...
A source-controlled data center network model.
Yu, Yang; Liang, Mangui; Wang, Zhe
2017-01-01
The construction of data center networks using SDN technology has become a hot research topic. The SDN architecture separates the control plane from the data plane, which makes the network more software-oriented and agile. Moreover, it provides virtual multi-tenancy, effective resource scheduling, and centralized control strategies to meet the demands of cloud computing data centers. However, the explosion of network information poses severe challenges for the SDN controller. Flow storage and lookup mechanisms based on TCAM devices suffer from restricted scalability, high cost, and high energy consumption. In view of this, a source-controlled data center network (SCDCN) model is proposed herein. The SCDCN model applies a new type of source routing address named the vector address (VA) as the packet-switching label. The VA completely defines the communication path, and the data forwarding process can be completed relying solely on the VA. The SCDCN architecture has four advantages. 1) The model adopts hierarchical multi-controllers and abstracts the large-scale data center network into small network domains, which removes the processing bottleneck of a single controller and reduces computational complexity. 2) Vector switches (VS) deployed in the core network no longer use TCAM for table storage and lookup, which significantly cuts down the cost and complexity of switches; meanwhile, the scalability problem is solved effectively. 3) The SCDCN model simplifies the establishment process for new flows, and there is no need to download flow tables to the VS, so the amount of control signaling consumed when establishing new flows is significantly decreased. 4) We designed the VS on the NetFPGA platform. The statistical results show that the hardware resource consumption of a VS is about 27% of that of an OFS.
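The forwarding idea behind the vector address can be illustrated in a few lines of Python: the packet header carries the entire path as a sequence of output ports, and each switch simply consumes the next entry, with no flow-table lookup. The port numbers and header layout below are invented for illustration; the paper's actual VA encoding is more compact.

    from collections import deque

    def forward(packet):
        """Each switch pops the next output port from the vector address;
        no per-flow table lookup is needed along the path."""
        path = deque(packet["vector_address"])
        hops = []
        while path:
            port = path.popleft()   # the VA fully defines the path
            hops.append(port)       # a real switch would emit on this port
        return hops

    pkt = {"payload": b"...", "vector_address": [3, 1, 4, 2]}  # illustrative ports
    print(forward(pkt))  # [3, 1, 4, 2]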
Implementing EVM Data Analysis Adding Value from a NASA Project Manager's Perspective
NASA Technical Reports Server (NTRS)
Counts, Stacy; Kerby, Jerald
2006-01-01
Data Analysis is one of the keys to an effective Earned Value Management (EVM) process. Project Managers (PMs) must continually evaluate data in assessing the health of their projects. Good analysis of data can assist PMs in making better decisions in managing projects. To better support our PMs, the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) recently renewed its emphasis on sound EVM data analysis practices and processes. During this presentation we will discuss the approach that MSFC followed in implementing better data analysis across its Center. We will address our approach to effectively equip and support our projects in applying a sound data analysis process. In addition, the PM for the Space Station Biological Research Project will share her experiences of how effective data analysis can benefit a PM in the decision making process. The PM will discuss how the emphasis on data analysis has helped create a solid method for assessing the project's performance. Using data analysis successfully can be an effective and efficient tool in today's environment of increasing workloads and downsizing workforces.
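For readers unfamiliar with EVM data analysis, the standard indices it rests on are easy to state. The Python sketch below computes the conventional cost and schedule performance indices from earned value (EV), planned value (PV), and actual cost (AC); these are the textbook EVM formulas, not a description of MSFC's specific process, and the dollar figures are invented.

    def evm_indices(ev, pv, ac):
        """Standard earned-value indices: values > 1.0 are favorable."""
        return {
            "CPI": ev / ac,   # cost efficiency: value earned per dollar spent
            "SPI": ev / pv,   # schedule efficiency: earned vs. planned
            "CV":  ev - ac,   # cost variance
            "SV":  ev - pv,   # schedule variance
        }

    # Example: $950k earned against $1.0M planned and $1.1M actual cost.
    print(evm_indices(ev=950_000, pv=1_000_000, ac=1_100_000))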
The Willowbrook futures project: a longitudinal analysis of person-centered planning.
Holburn, Steve; Jacobson, John W; Schwartz, Allen A; Flory, Michael J; Vietze, Peter M
2004-01-01
We conducted a longitudinal comparative evaluation of person-centered planning processes and outcomes for 20 individuals with intellectual disabilities and problem behavior (former residents of Willowbrook) and a matched contrast group, who received traditional interdisciplinary service planning (ISP). At the inception of the study, all participants were living in one of four other developmental centers (institutions) in New York City. Process and outcome data obtained from questionnaires completed by team members approximately every 8 months at four time periods showed that the rate of improvement in both person-centered planning process and outcomes for the intervention group was significantly greater than that of the comparison group. Eighteen of 19 person-centered planning participants moved to community living arrangements, as did 5 of 18 in the contrast group.
2012-02-09
The calibrated data are then sent to NRL Stennis Space Center (NRL-SSC) for further processing using the NRL SSC Automated Processing System (APS...hyperspectral sensor in space we have not previously developed automated processing for hyperspectral ocean color data. The hyperspectral processing branch
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.
2011-12-01
Services that preserve and enable future access to scientific data are necessary to ensure that the data that are being collected today will be available for use by future generations of scientists. Many data centers, archives, and other digital repositories are working to improve their ability to serve as long-term stewards of scientific data. Trust in sustainable data management and preservation capabilities of digital repositories can influence decisions to use these services to deposit or obtain scientific data. Building on the Open Archival Information System (OAIS) Reference Model developed by the Consultative Committee for Space Data Systems (CCSDS) and adopted by the International Organization for Standardization as ISO 14721:2003, new standards are being developed to improve long-term data management processes and documentation. The Draft Information Standard ISO/DIS 16363, "Space data and information transfer systems - Audit and certification of trustworthy digital repositories" offers the potential to evaluate digital repositories objectively in terms of their trustworthiness as long-term stewards of digital resources. In conjunction with this, the CCSDS and ISO are developing another draft standard for the auditing and certification process, ISO/DIS 16919, "Space data and information transfer systems - Requirements for bodies providing audit and certification of candidate trustworthy digital repositories". Six test audits were conducted of scientific data centers and archives in Europe and the United States to test the use of these draft standards and identify potential improvements for the standards and for the participating digital repositories. We present a case study of the test audit conducted on the NASA Socioeconomic Data and Applications Center (SEDAC) and describe the preparation, the audit process, recommendations received, and next steps to obtain certification as a trustworthy digital repository, after approval of the ISO/DIS standards.
Data Visualization and Animation Lab (DVAL) overview
NASA Technical Reports Server (NTRS)
Stacy, Kathy; Vonofenheim, Bill
1994-01-01
The general capabilities of the Langley Research Center Data Visualization and Animation Laboratory are described. These capabilities include digital image processing, 3-D interactive computer graphics, data visualization and analysis, video-rate acquisition and processing of video images, photo-realistic modeling and animation, video report generation, and color hardcopies. A specialized video image processing system is also discussed.
NASA Technical Reports Server (NTRS)
Davis, Bruce E.; Elliot, Gregory
1989-01-01
Jackson State University recently established the Center for Spatial Data Research and Applications, a Geographical Information System (GIS) and remote sensing laboratory. Taking advantage of new technologies and new directions in the spatial (geographic) sciences, JSU is building a Center of Excellence in Spatial Data Management. New opportunities for research, applications, and employment are emerging. GIS requires fundamental shifts and new demands in traditional computer science and geographic training. The Center is not merely another computer lab but is one setting the pace in a new applied frontier. GIS and its associated technologies are discussed. The Center's facilities are described. An ARC/INFO GIS runs on a VAX mainframe, with numerous workstations. Image processing packages include ELAS, LIPS, VICAR, and ERDAS. A host of hardware and software peripherals are used in support. Numerous projects are underway, such as the construction of a Gulf of Mexico environmental data base, development of AI in image processing, a land use dynamics study of metropolitan Jackson, and others. A new academic interdisciplinary program in Spatial Data Management is under development, combining courses in Geography and Computer Science. The broad range of JSU's GIS and remote sensing activities is addressed. The impacts on changing paradigms in the university and in the professional world conclude the discussion.
SAR processing using SHARC signal processing systems
NASA Astrophysics Data System (ADS)
Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.
1998-09-01
Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized either day or night and through both dense fog and thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.
Global satellite composites - 20 years of evolution
NASA Astrophysics Data System (ADS)
Kohrs, Richard A.; Lazzara, Matthew A.; Robaidek, Jerrold O.; Santek, David A.; Knuth, Shelley L.
2014-01-01
For two decades, the University of Wisconsin Space Science and Engineering Center (SSEC) and the Antarctic Meteorological Research Center (AMRC) have been creating global, regional and hemispheric satellite composites. These composites have proven useful in research, operational forecasting, commercial applications and educational outreach. Using the Man computer Interactive Data Access System (McIDAS) software developed at SSEC, infrared window composites were created by combining Geostationary Operational Environmental Satellite (GOES) and polar-orbiting data from the SSEC Data Center with polar data acquired at McMurdo and Palmer stations, Antarctica. Increased computer processing speed has allowed for more advanced algorithms to address the decision making process for co-located pixels. The algorithms have evolved from a simplistic maximum brightness temperature to those that account for distance from the sub-satellite point, parallax displacement, pixel time and resolution. The composites are the state-of-the-art means for merging/mosaicking satellite imagery.
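The evolution from a maximum-brightness-temperature rule to multi-factor pixel selection can be sketched as a scoring function over co-located candidate pixels. The Python fragment below is illustrative only: the weights and the two factors used (satellite zenith angle and pixel age) are assumptions standing in for the richer set of criteria (parallax, resolution, time) described above.

    def composite(pixels):
        """Choose one value per grid cell from co-located candidate pixels.

        pixels: list of dicts with 'bt' (brightness temperature, K),
                'zenith_deg' (satellite zenith angle) and 'age_min'
                (minutes since observation). Weights are illustrative.
        """
        def score(p):
            # Prefer near-nadir views and fresher data; lower score wins.
            return 1.0 * p["zenith_deg"] + 0.5 * p["age_min"]
        return min(pixels, key=score)["bt"]

    cell = [
        {"bt": 265.0, "zenith_deg": 62.0, "age_min": 5.0},
        {"bt": 263.5, "zenith_deg": 18.0, "age_min": 25.0},
    ]
    print(composite(cell))  # 263.5: the near-nadir pixel wins despite its age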
DaCHS: Data Center Helper Suite
NASA Astrophysics Data System (ADS)
Demleitner, Markus
2018-04-01
DaCHS, the Data Center Helper Suite, is an integrated package for publishing astronomical data sets to the Virtual Observatory. Network-facing, it speaks the major VO protocols (SCS, SIAP, SSAP, TAP, Datalink, etc.). Operator-facing, many input formats, including FITS/WCS, ASCII files, and VOTable, can be processed to publication-ready data. DaCHS puts particular emphasis on integrated metadata handling, which facilitates a tight integration with the VO's Registry.
Data Serving Climate Simulation Science at the NASA Center for Climate Simulation
NASA Technical Reports Server (NTRS)
Salmon, Ellen M.
2011-01-01
The NASA Center for Climate Simulation (NCCS) provides high performance computational resources, a multi-petabyte archive, and data services in support of climate simulation research and other NASA-sponsored science. This talk describes the NCCS's data-centric architecture and processing, which are evolving in anticipation of researchers' growing requirements for higher resolution simulations and increased data sharing among NCCS users and the external science community.
Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion
NASA Astrophysics Data System (ADS)
Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger
2007-12-01
Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media) in Karlsruhe.
Dual-Use Space Technology Transfer Conference and Exhibition. Volume 2
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Compiler)
1994-01-01
This is the second volume of papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies.
NASA Astrophysics Data System (ADS)
Jedlovec, G.; McGrath, K.; Meyer, P. J.; Berndt, E.
2017-12-01
A GOES-R series receiving station has been installed at the NASA Marshall Space Flight Center (MSFC) to support GOES-16 transition-to-operations projects of NASA's Earth science program and provide a community portal for GOES-16 data access. This receiving station is comprised of a 6.5-meter dish; motor-driven positioners; Quorum feed and demodulator; and three Linux workstations for ingest, processing, display, and subsequent product generation. The Community Satellite Processing Package (CSPP) is used to process GOES Rebroadcast data from the Advanced Baseline Imager (ABI), Geostationary Lightning Mapper (GLM), Solar Ultraviolet Imager (SUVI), Extreme Ultraviolet and X-ray Irradiance Sensors (EXIS), and Space Environment In-Situ Suite (SEISS) into Level 1b and Level 2 files. GeoTIFFs of the imagery from several of these instruments are ingested into an Esri Arc Enterprise Web Map Service (WMS) server with tiled imagery displayable through a web browser interface or by connecting directly to the WMS with a Geographic Information System software package. These data also drive a basic web interface where users can manually zoom to and animate regions of interest or acquire similar results using a published Application Program Interface. While not as interactive as a WMS-driven interface, this system is much more expeditious at generating and distributing requested imagery. The legacy web capability enacted for the predecessor GOES Imager currently supports approximately 500,000 unique visitors each month. Dissemination capabilities have been refined to support a significantly larger number of anticipated users. The receiving station also supports NASA's Short-term Prediction, Research, and Transition Center (SPoRT) project activities to disseminate near-real-time ABI RGB products to National Weather Service National Centers, including the Satellite Analysis Branch, National Hurricane Center, Ocean Prediction Center, and Weather Prediction Center, where they are displayed in N-AWIPS and AWIPS II. The multitude of additional real-time data users include the U.S. Coast Guard, Federal Aviation Administration, and The Weather Company. A second antenna is being installed for the ingest, processing, and dissemination of GOES-S data.
Medical Waste Management in Community Health Centers.
Tabrizi, Jafar Sadegh; Rezapour, Ramin; Saadati, Mohammad; Seifi, Samira; Amini, Behnam; Varmazyar, Farahnaz
2018-02-01
Non-standard management of medical waste leads to irreparable side effects. This issue is doubly important in community health centers, which form the most extensive system for providing Primary Health Care (PHC) across Iranian cities. This study investigated the observance of medical waste management standards in Tabriz community health care centers, northwestern Iran. In this triangulated cross-sectional study (qualitative-quantitative), the data collection tool was a valid checklist of the waste management process developed based on Iranian medical waste management standards. The data were collected in 2015 through process observation and interviews with the health centers' staff. The average rate of waste management standards observance in Tabriz community health centers was 29.8%. This rate was 22.8% in the dimension of management and training, 27.3% in separating and collecting, 31.2% in transport and temporary storage, and 42.9% in sterilization and disposal. Lack of principled separation of wastes, an inappropriate waste collection and disposal cycle, and disregard of safety tips (fertilizer device performance monitoring, microbial cultures and so on) were among the defects observed in health care centers and supported by quantitative data. Medical waste management was not in a desirable situation in Tabriz community health centers. The expansion of community health centers in different regions and non-observance of standards could increase the incidence of risks arising from medical waste. So it is necessary to adopt appropriate policies to improve the waste management situation.
NASA Occupational Health Program FY98 Self-Assessment
NASA Technical Reports Server (NTRS)
Brisbin, Steven G.
1999-01-01
The NASA Functional Management Review process requires that each NASA Center conduct self-assessments of each functional area. Self-Assessments were completed in June 1998 and results were presented during this conference session. During FY 97 NASA Occupational Health Assessment Team activities, a decision was made to refine the NASA Self-Assessment Process. NASA Centers were involved in the ISO registration process at that time and wanted to use the management systems approach to evaluate their occupational health programs. This approach appeared to be more consistent with NASA's management philosophy and would likely confer status needed by Senior Agency Management for the program. During FY 98 the Agency Occupational Health Program Office developed a revised self-assessment methodology based on the Occupational Health and Safety Management System developed by the American Industrial Hygiene Association. This process was distributed to NASA Centers in March 1998 and completed in June 1998. The Center Self Assessment data will provide an essential baseline on the status of OHP management processes at NASA Centers. That baseline will be presented to Enterprise Associate Administrators and DASHO on September 22, 1998 and used as a basis for discussion during FY 99 visits to NASA Centers. The process surfaced several key management system elements warranting further support from the Lead Center. Input and feedback from NASA Centers will be essential to defining and refining future self assessment efforts.
Centering Students in School-Based Support Processes: Critical Inquiries and Shifting Perspectives
ERIC Educational Resources Information Center
Brion-Meisels, Gretchen
2015-01-01
Drawing on data from two qualitative studies, this chapter argues that both school organizations and individual students will benefit from centering youth voices in student support systems. To do this, the author shares data from adolescents' narratives that demonstrate how young people's voices might (re)shape the central practices of…
A study of spatial data management and analysis systems
NASA Technical Reports Server (NTRS)
Christopher, Clyde; Galle, Richard
1989-01-01
The Earth Resources Laboratory of the NASA Stennis Space Center is a center of space related technology for Earth observations. It has assumed the task, in a joint effort with Jackson State University, to reach out to the science community and acquire information pertaining to characteristics of spatially oriented data processing.
A framework of space weather satellite data pipeline
NASA Astrophysics Data System (ADS)
Ma, Fuli; Zou, Ziming
Various applications indicate a need for permanent space weather information. The diversity of available instruments enables a large variety of products. As an indispensable part of a space weather satellite operation system, the space weather data processing system is more complicated than before. The information handled by the data processing system has been used in more and more fields, such as space weather monitoring and space weather prediction models. In the past few years, many satellites have been launched by China. The data volume downlinked by these satellites has reached the so-called big data level, and it will continue to grow fast in the next few years due to the implementation of many new space weather programs. Because of the huge amount of data, the current infrastructure is no longer capable of processing data in a timely manner, so we propose a new space weather data processing system (SWDPS) based on the architecture of cloud computing. Similar to Hadoop, SWDPS decomposes tasks into smaller tasks that are executed by many different work nodes. The Control Center in SWDPS, like the NameNode and JobTracker within Hadoop, is the bond between the data and the cluster: it establishes a work plan for the cluster once a client submits data, allocates nodes for the tasks, and monitors the status of all tasks. Like Hadoop's TaskTracker, the Compute Nodes in SWDPS are the slaves of the Control Center; they are responsible for calling the plugins (e.g., dividing and sorting plugins) to execute the concrete jobs. They also manage the status of all their tasks and report it to the Control Center. Once a task fails, a Compute Node notifies the Control Center, which then decides what to do: it may resubmit the job elsewhere, it may mark that specific record as something to avoid, and it may even blacklist the Compute Node as unreliable. In addition to these modules, SWDPS has a distinct module named Data Service, which provides file operations such as adding, deleting, modifying, and querying for the clients. Beyond that, Data Service can also split and combine files based on the timestamp of each record. SWDPS has been in use for quite some time and has successfully handled data from many satellites, such as FY1C, FY1D, FY2A, FY2B, etc. Its good performance in actual operation shows that SWDPS is stable and reliable.
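The Control Center / Compute Node division of labor described above is the classic master-worker pattern. The following self-contained Python sketch imitates it with threads standing in for cluster nodes: workers pull tasks from a shared queue, report results, and park failed tasks for the master to resubmit. It is a toy analogue of SWDPS, not its actual implementation.

    import queue
    import threading

    tasks = queue.Queue()
    results, failed = [], []
    lock = threading.Lock()

    def compute_node(node_id):
        """Worker: pulls tasks and reports status back (cf. SWDPS Compute Nodes)."""
        while True:
            try:
                task = tasks.get_nowait()
            except queue.Empty:
                return
            try:
                out = task["run"](task["data"])
                with lock:
                    results.append((node_id, out))
            except Exception:
                with lock:
                    failed.append(task)  # the master may resubmit this elsewhere
            finally:
                tasks.task_done()

    for i in range(8):
        tasks.put({"data": i, "run": lambda x: x * x})
    workers = [threading.Thread(target=compute_node, args=(n,)) for n in range(3)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    print(sorted(out for _, out in results), "failures:", len(failed))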
National Centers for Environmental Prediction
2005-06-01
cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in areas of business processes, workflow engineering, information management,
Landsat 7 Science Data Processing: An Overview
NASA Technical Reports Server (NTRS)
Schweiss, Robert J.; Daniel, Nathaniel E.; Derrick, Deborah K.
2000-01-01
The Landsat 7 Science Data Processing System, developed by NASA for the Landsat 7 Project, provides the science data handling infrastructure used at the Earth Resources Observation Systems (EROS) Data Center (EDC) Landsat Data Handling Facility (DHF) of the United States Department of Interior, United States Geological Survey (USGS) located in Sioux Falls, South Dakota. This paper presents an overview of the Landsat 7 Science Data Processing System and details of the design, architecture, concept of operation, and management aspects of systems used in the processing of the Landsat 7 Science Data.
Miao, Beibei; Dou, Chao; Jin, Xuebo
2016-01-01
The storage volume of an internet data center is a classical time series. It is very valuable for business purposes to predict the storage volume of a data center. However, the storage volume series from a data center is always "dirty," containing noise, missing data, and outliers, so it is necessary to extract the main trend of the storage volume series for subsequent prediction processing. In this paper, we propose an irregular sampling estimation method to extract the main trend of the time series, in which a Kalman filter is used to remove the "dirty" data; then cubic spline interpolation and an averaging method are used to reconstruct the main trend. The developed method is applied to the storage volume series of an internet data center. The experiment results show that the developed method can estimate the main trend of the storage volume series accurately and make a great contribution to predicting the future volume value.
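The two-stage procedure, Kalman filtering to suppress dirty samples followed by cubic-spline reconstruction over sparsely sampled points, can be sketched briefly. The Python example below uses a minimal scalar Kalman filter and SciPy's CubicSpline on synthetic data; the noise parameters, knot spacing, and injected outliers are illustrative assumptions, not the paper's tuned settings.

    import numpy as np
    from scipy.interpolate import CubicSpline

    def kalman_1d(z, q=1e-4, r=0.5):
        """Minimal scalar Kalman filter: random-walk state, noisy observations."""
        x, p = z[0], 1.0
        out = np.empty_like(z)
        for i, zi in enumerate(z):
            p += q                  # predict
            k = p / (p + r)         # Kalman gain
            x += k * (zi - x)       # update with the new observation
            p *= (1 - k)
            out[i] = x
        return out

    t = np.arange(200, dtype=float)
    series = 50 + 0.1 * t + np.random.normal(0, 1.5, t.size)
    series[[40, 120]] += 25         # inject outliers ("dirty" samples)

    smoothed = kalman_1d(series)
    knots = t[::20]                 # sparse sampling of the filtered data
    trend = CubicSpline(knots, smoothed[::20])(t)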
Center of Excellence in Space Data and Information Science, Year 9
NASA Technical Reports Server (NTRS)
Yesha, Yelena
1997-01-01
This report summarizes the range of computer science related activities undertaken by CESDIS (Center of Excellence in Space Data and Information Sciences) for NASA in the twelve months from July 1, 1996 through June 30, 1997. These activities address issues related to accessing, processing, and analyzing data from space observing systems through collaborative efforts with university, industry, and NASA space and Earth scientists.
Park, Yu Rang; Yoon, Young Jo; Koo, HaYeong; Yoo, Soyoung; Choi, Chang-Min; Beck, Sung-Ho; Kim, Tae Won
2018-04-24
Clinical trials pose potential risks in both communications and management due to the various stakeholders involved when performing clinical trials. The academic medical center has a responsibility and obligation to conduct and manage clinical trials while maintaining a sufficiently high level of quality, therefore it is necessary to build an information technology system to support standardized clinical trial processes and comply with relevant regulations. The objective of the study was to address the challenges identified while performing clinical trials at an academic medical center, Asan Medical Center (AMC) in Korea, by developing and utilizing a clinical trial management system (CTMS) that complies with standardized processes from multiple departments or units, controlled vocabularies, security, and privacy regulations. This study describes the methods, considerations, and recommendations for the development and utilization of the CTMS as a consolidated research database in an academic medical center. A task force was formed to define and standardize the clinical trial performance process at the site level. On the basis of the agreed standardized process, the CTMS was designed and developed as an all-in-one system complying with privacy and security regulations. In this study, the processes and standard mapped vocabularies of a clinical trial were established at the academic medical center. On the basis of these processes and vocabularies, a CTMS was built which interfaces with the existing trial systems such as the electronic institutional review board, health information system, enterprise resource planning, and the barcode system. To protect patient data, the CTMS implements data governance and access rules, and excludes 21 personal health identifiers according to the Health Insurance Portability and Accountability Act (HIPAA) privacy rule and Korean privacy laws. Since December 2014, the CTMS has been successfully implemented and used by 881 internal and external users for managing 11,645 studies and 146,943 subjects. The CTMS was introduced in the Asan Medical Center to manage the large amounts of data involved with clinical trial operations. Inter- and intraunit control of data and resources can be easily conducted through the CTMS. To our knowledge, this is the first CTMS developed in-house at an academic medical center that can enhance the efficiency of clinical trial management in compliance with privacy and security laws. ©Yu Rang Park, Young Jo Yoon, HaYeong Koo, Soyoung Yoo, Chang-Min Choi, Sung-Ho Beck, Tae Won Kim. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 24.04.2018.
NASA Astrophysics Data System (ADS)
Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi
2018-02-01
The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing and online analytical processing (OLAP) techniques have been used to manage and analyze this great volume of data. The specific outputs of the online analytics information system resulting from data warehouse processing with OLAP are chart and query reports. The information in the chart reports consists of the load distribution chart over repeated time periods, the distribution chart by area, the substation region chart, and the electric load usage chart. The results of the OLAP process show the development of electric load distribution, provide analysis of electric power consumption loads, and become an alternative means of presenting information related to peak load.
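The chart and query reporting described above amounts to OLAP-style roll-ups of load records by time and area. The Python sketch below shows the flavor of such a report with a pandas pivot table; the column names and load values are hypothetical, and the actual system is built on a data warehouse rather than an in-memory frame.

    import pandas as pd

    # Hypothetical load records from the dispatching center's warehouse.
    df = pd.DataFrame({
        "area":    ["North", "North", "South", "South", "North", "South"],
        "hour":    [9, 18, 9, 18, 18, 9],
        "load_mw": [120.5, 210.0, 95.2, 180.4, 205.1, 99.0],
    })

    # OLAP-style roll-up: mean load per area per hour, with grand totals.
    report = pd.pivot_table(df, values="load_mw", index="area",
                            columns="hour", aggfunc="mean",
                            margins=True, margins_name="All")
    print(report)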
Taking advantage of ground data systems attributes to achieve quality results in testing software
NASA Technical Reports Server (NTRS)
Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.
1994-01-01
During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.
Experiments with Analytic Centers: A confluence of data, tools and help in using them.
NASA Astrophysics Data System (ADS)
Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.
2017-12-01
Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.
Distributing Data to Hand-Held Devices in a Wireless Network
NASA Technical Reports Server (NTRS)
Hodges, Mark; Simmons, Layne
2008-01-01
ADROIT is a developmental computer program for real-time distribution of complex data streams for display on Web-enabled, portable terminals held by members of an operational team of a spacecraft-command-and-control center who may be located away from the center. Examples of such terminals include personal data assistants, laptop computers, and cellular telephones. ADROIT would make it unnecessary to equip each terminal with platform- specific software for access to the data streams or with software that implements the information-sharing protocol used to deliver telemetry data to clients in the center. ADROIT is a combination of middleware plus software specific to the center. (Middleware enables one application program to communicate with another by performing such functions as conversion, translation, consolidation, and/or integration.) ADROIT translates a data stream (voice, video, or alphanumerical data) from the center into Extensible Markup Language, effectuates a subscription process to determine who gets what data when, and presents the data to each user in real time. Thus, ADROIT is expected to enable distribution of operations and to reduce the cost of operations by reducing the number of persons required to be in the center.
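Two of the steps named above, translating a telemetry sample into XML and matching it against user subscriptions, are easy to sketch. The Python fragment below uses only the standard library; the element names, mnemonics, and subscription table are invented for illustration and do not reflect the actual ADROIT schema.

    import xml.etree.ElementTree as ET

    def to_xml(record):
        """Render one telemetry sample as XML for display on any Web-enabled
        terminal (element names are illustrative, not the ADROIT schema)."""
        root = ET.Element("telemetry", mnemonic=record["mnemonic"])
        ET.SubElement(root, "value").text = str(record["value"])
        ET.SubElement(root, "time").text = record["time"]
        return ET.tostring(root, encoding="unicode")

    # Hypothetical subscriptions: which users want which telemetry mnemonics.
    subscriptions = {"ops_lead": {"BATT_V", "TANK_P"}, "thermal": {"TEMP_1"}}

    def deliver(record):
        """Simple subscription step: who gets this sample?"""
        return [user for user, wanted in subscriptions.items()
                if record["mnemonic"] in wanted]

    sample = {"mnemonic": "BATT_V", "value": 28.4,
              "time": "2008-01-01T00:00:00Z"}
    print(deliver(sample), to_xml(sample))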
Processing of Global Area Coverage (GAC) Data of the TIROS-N/NOAA Series Polar Orbiters.
1984-10-01
National Climatic Data Center as tape copies that generally contain calibration information. In order to process the data on the SPADS, the data must be... the SPADS Eclipse S250, for maintenance of the software and for understanding data formats as well as the techniques involved in processing the GAC... constructive response will be appreciated. 2. The Raw Data 2.1 How to Order Data Everybody working for the SPADS should contact the department head to
Expert panel reviews of research centers: the site visit process.
Lawrenz, Frances; Thao, Mao; Johnson, Kelli
2012-08-01
Site visits are used extensively in a variety of settings within the evaluation community. They are especially common in making summative value decisions about the quality and worth of research programs/centers. However, there has been little empirical research and guidance about how to appropriately conduct evaluative site visits of research centers. We review the processes of two site visit examples using an expert panel review: (1) a process to evaluate four university research centers and (2) a process to review a federally sponsored research center. A set of 14 categories describing the expert panel review process was obtained through content analysis and participant observation. Most categories were addressed differently through the two processes highlighting the need for more research about the most effective processes to use within different contexts. Decisions about how to structure site visits appear to depend on the research context, practical considerations, the level at which the review is being conducted and the intended impact of the report. Future research pertaining to the selection of site visitors, the autonomy of the visitors in data collection and report writing, and the amount and type of information provided would be particularly valuable. Copyright © 2012 Elsevier Ltd. All rights reserved.
Developing processing techniques for Skylab data
NASA Technical Reports Server (NTRS)
Nalepka, R. F. (Principal Investigator); Malila, W. A.; Morgenstern, J. P.
1975-01-01
The author has identified the following significant results. The effects of misregistration and the scan-line-straightening algorithm on multispectral data were found to be: (1) there is greatly increased misregistration in scan-line-straightened data over conic data; (2) scanner-caused misregistration between any pairs of channels may not be corrected for in scan-line-straightened data; and (3) this data will have fewer pure field-center pixels than will conic data. A program, SIMSIG, was developed implementing the signature simulation model. Data processing stages of the experiment were carried out, and an analysis was made of the effects of spatial misregistration on field-center classification accuracy. Fifteen signatures originally used for classifying the data were analyzed, showing the following breakdown: corn (4 signatures), trees (2), brush (1), grasses, weeds, etc. (5), bare soil (1), soybeans (1), and alfalfa (1).
Pechacek, Judith; Shanedling, Janet; Lutfiyya, May Nawal; Brandt, Barbara F; Cerra, Frank B; Delaney, Connie White
2015-01-01
Understanding the impact that interprofessional education and collaborative practice (IPECP) might have on triple aim patient outcomes is of high interest to health care providers, educators, administrators, and policy makers. Before the work undertaken by the National Center for Interprofessional Practice and Education at the University of Minnesota, no standard mechanism to acquire and report outcome data related to interprofessional education and collaborative practice and its effect on triple aim outcomes existed. This article describes the development and adoption of the National Center Data Repository (NCDR) designed to capture data related to IPECP processes and outcomes to support analyses of the relationship of IPECP on the Triple Aim. The data collection methods, web-based survey design and implementation process are discussed. The implications of this informatics work to the field of IPECP and health care quality and safety include creating standardized capacity to describe interprofessional practice and measure outcomes connecting interprofessional education and collaborative practice to the triple aim within and across sites/settings, leveraging an accessible data collection process using user friendly web-based survey design to support large data scholarship and instrument testing, and establishing standardized data elements and variables that can potentially lead to enhancements to national/international information system and academic accreditation standards to further team-based, interprofessional, collaborative research in the field.
An Approach to Data Center-Based KDD of Remote Sensing Datasets
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Mack, Robert; Wharton, Stephen W. (Technical Monitor)
2001-01-01
The data explosion in remote sensing is straining the ability of data centers to deliver the data to the user community, yet many large-volume users actually seek a relatively small information component within the data, which they extract at their sites using Knowledge Discovery in Databases (KDD) techniques. To improve the efficiency of this process, the Goddard Earth Sciences Distributed Active Archive Center (GES DAAC) has implemented a KDD subsystem that supports execution of the user's KDD algorithm at the data center, dramatically reducing the volume that is sent to the user. The data are extracted from the archive in a planned, organized "campaign"; the algorithms are executed, and the output products sent to the users over the network. The first campaign, now complete, has resulted in overall reductions in shipped volume from 3.3 TB to 0.4 TB.
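The volume reduction at the heart of this KDD subsystem comes from running the user's algorithm next to the archive and shipping only its small outputs. The Python sketch below illustrates the shape of such a campaign on synthetic granules; the function names and the example reduction (statistics over a region of interest) are hypothetical.

    import numpy as np

    def user_algorithm(granule):
        """A user-supplied KDD step: reduce a full granule to a small
        information product (here, statistics over a region of interest)."""
        region = granule[100:200, 100:200]
        return {"roi_mean": float(region.mean()), "roi_max": float(region.max())}

    def run_campaign(granules):
        """Data-center side: stage each archived granule, run the user's
        algorithm, and ship only the tiny outputs over the network."""
        return [user_algorithm(g) for g in granules]

    # Each synthetic granule is ~32 MB; each output is a few dozen bytes.
    granules = (np.random.rand(2000, 2000) for _ in range(3))
    products = run_campaign(granules)
    print(products)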
California State Library: Processing Center Design and Specifications. Volume III, Coding Manual.
ERIC Educational Resources Information Center
Sherman, Don; Shoffner, Ralph M.
As part of the report on the California State Library Processing Center design and specifications, this volume is a coding manual for the conversion of catalog card data to a machine-readable form. The form is compatible with the national MARC system, while at the same time it contains provisions for problems peculiar to the local situation. This…
NASA Technical Reports Server (NTRS)
Behnke, Jeanne; Doescher, Chris
2015-01-01
This presentation discusses 25 years of interactions between NASA and the USGS in managing the Land Processes Distributed Active Archive Center (LPDAAC) for the purpose of providing users access to NASA's rich collection of Earth science data. The presentation addresses challenges, efforts, and performance metrics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darrow, Ken; Hedman, Bruce
Data centers represent a rapidly growing and very energy-intensive activity in commercial, educational, and government facilities. In the last five years the growth of this sector was equivalent, in electric power terms, to seven new coal-fired power plants. Data centers consume 1.5% of the total power in the U.S., and growth over the next five to ten years is expected to require a similar increase in power generation. This energy consumption is concentrated in buildings that are 10-40 times more energy intensive than a typical office building. The sheer size of the market, the concentrated energy consumption per facility, and the tendency of facilities to cluster in 'high-tech' centers all contribute to a potential power infrastructure crisis for the industry. Meeting the energy needs of data centers is a moving target. Computing efficiency is advancing rapidly, which reduces the energy required for a given data processing load, and considerable work is going into improving the computing power of servers and other processing equipment. However, this increase in computing power is also increasing the power density of the equipment. While fewer pieces of equipment may be needed to meet a given data processing load, the energy density of a facility designed to house this higher-efficiency equipment will be as high as or higher than it is today. In other words, while the data center of the future may have the IT power of ten data centers of today, it will also have higher power requirements and higher power densities. This report analyzes the opportunities for CHP technologies to supplement primary power in making the data center more cost-effective and energy efficient. Broader application of CHP will lower the demand for electricity from central stations and reduce the pressure on electric transmission and distribution infrastructure. This report is organized into the following sections: (1) Data Center Market Segmentation--a description of the overall size of the market, the size and types of facilities involved, and the geographic distribution; (2) Data Center Energy Use Trends--a discussion of energy use, expected energy growth, and typical energy consumption and uses in data centers; (3) CHP Applicability--potential configurations, CHP case studies, applicable equipment, heat recovery opportunities (cooling), cost and performance benchmarks, and power reliability benefits; (4) CHP Drivers and Hurdles--an evaluation of user benefits, social benefits, market structural issues and attitudes toward CHP, and regulatory hurdles; (5) CHP Paths to Market--a discussion of the technical needs, education, and strategic partnerships needed to promote CHP in the IT community.
Drowning in Data: Going Beyond Traditional Data Archival to Educate Data Users
NASA Astrophysics Data System (ADS)
Weigel, A. M.; Smith, T.; Smith, D. K.; Bugbee, K.; Sinclair, L.
2017-12-01
Increasing quantities of Earth science data and information prove overwhelming to new and unfamiliar users. The data discovery and use challenges faced by these users are compounded with atmospheric science field campaign data, which are collected by a variety of instruments and stored, visualized, processed, and analyzed in different ways. To address data and user needs assessed through annual surveys and user questions, the NASA Global Hydrology Resource Center Distributed Active Archive Center (GHRC DAAC), in collaboration with a graphic designer, has developed a series of resources to help users learn about GHRC science focus areas, field campaigns, instruments, data, and data processing techniques. This talk gives an overview of GHRC data recipes, micro articles, interactive data visualization techniques, and artistic science outreach and education efforts such as ESRI story maps and research as art. The objective of this talk is to stress the importance of artistic information visualization in communicating with and educating Earth science data users.
Dual-Use Space Technology Transfer Conference and Exhibition. Volume 1
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Compiler)
1994-01-01
This document contains papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; new ways of doing business; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies. More than 77 papers, 20 presentations, and 20 exhibits covering various disciplines were presented by experts from NASA, universities, and industry.
2014-09-25
CAPE CANAVERAL, Fla. – Coupled Florida East Coast Railway, or FEC, locomotives No. 433 and No. 428 make the first run past the Orbiter Processing Facility and Thermal Protection System Facility in Launch Complex 39 at NASA’s Kennedy Space Center in Florida during the Rail Vibration Test for the Canaveral Port Authority. Seismic monitors are collecting data as the train passes by. The purpose of the test is to collect amplitude, frequency and vibration test data utilizing two Florida East Coast locomotives operating on KSC tracks to ensure that future railroad operations will not affect launch vehicle processing at the center. Buildings instrumented for the test include the Rotation Processing Surge Facility, Thermal Protection Systems Facility, Vehicle Assembly Building, Orbiter Processing Facility and Booster Fabrication Facility. Photo credit: NASA/Daniel Casper
NASA Technical Reports Server (NTRS)
Barbre, Robert E., Jr.
2012-01-01
This paper presents the process used by the Marshall Space Flight Center Natural Environments Branch (EV44) to quality control (QC) data from the Kennedy Space Center's 50-MHz Doppler Radar Wind Profiler (DRWP) for use in vehicle wind loads and steering commands. The database has been built to mitigate limitations of the currently archived databases from weather balloons. The DRWP database contains wind measurements from approximately 2.7-18.6 km altitude at roughly five-minute intervals for the August 1997 to December 2009 period of record, and the extensive QC process was designed to remove spurious data from various forms of atmospheric and non-atmospheric artifacts. The QC process is largely based on DRWP literature, but two new algorithms have been developed to remove data contaminated by convection and by excessive first-guess propagations from the Median Filter First Guess Algorithm. In addition to describing the automated and manual QC process in detail, this paper describes the extent of the data retained. Roughly 58% of all possible wind observations exist in the database, with approximately 100 times as many complete profile sets existing relative to the EV44 balloon databases. This increased sample of near-continuous wind profile measurements may help increase launch availability by reducing the uncertainty of wind changes during launch countdown.
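A minimal sketch of one class of check in this vein: flag observations that deviate too far from a running-median first guess. The window length and threshold below are illustrative assumptions, not the EV44 values documented in the DRWP literature.

```python
# Illustrative median-filter first-guess check on a wind time series:
# reject samples that deviate from the running median by more than a
# tolerance. Window and tolerance are assumptions, not EV44 values.
import numpy as np

def median_filter_qc(wind: np.ndarray, window: int = 11,
                     max_dev: float = 10.0) -> np.ndarray:
    """Return a boolean mask: True where the observation is retained."""
    half = window // 2
    padded = np.pad(wind, half, mode="edge")
    first_guess = np.array([np.median(padded[i:i + window])
                            for i in range(wind.size)])
    return np.abs(wind - first_guess) <= max_dev  # deviation limit, m/s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    profile = 15 + rng.normal(0, 2, 500)   # synthetic u-wind, m/s
    profile[100] = 80.0                    # spurious spike
    keep = median_filter_qc(profile)
    print(f"retained {keep.mean():.1%} of samples")
```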
Overview of NASA MSFC IEC Federated Engineering Collaboration Capability
NASA Technical Reports Server (NTRS)
Moushon, Brian; McDuffee, Patrick
2005-01-01
The MSFC IEC federated engineering framework is currently developing a single collaborative engineering framework across independent NASA centers. The federated approach allows NASA centers to maintain diversity and uniqueness while providing interoperability: systems are integrated in a federated framework without compromising individual center capabilities. The MSFC IEC Federation Framework will have a direct effect on how engineering data is managed across the Agency. The approach is a direct response to Columbia Accident Investigation Board (CAIB) finding F7.4-11, which states that the Space Shuttle Program has a wealth of data tucked away in multiple databases without a convenient way to integrate and use the data for management, engineering, or safety decisions. IEC's federated capability is further supported by OneNASA recommendation 6, which identifies the need to enhance cross-Agency collaboration by putting in place common engineering and collaborative tools, databases, processes, and knowledge-sharing structures. MSFC's IEC Federated Framework is loosely connected to other engineering applications and can provide users with the integration needed to achieve an Agency view of the entire product definition and development process, while allowing work to be distributed across NASA centers and contractors. The IEC DDMS federation framework eliminates the need to develop a single, enterprise-wide data model; the goal of a common data model shared between NASA centers and contractors is very difficult to achieve.
Kalvelage, Thomas A.; Willems, Jennifer
2005-01-01
The US Geological Survey's EROS Data Center (EDC) hosts the Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC supports NASA's Earth Observing System (EOS), a series of polar-orbiting and low-inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. The EOS Data and Information System (EOSDIS) was designed to acquire, archive, manage, and distribute Earth observation data to the broadest possible user community. The LP DAAC is one of four DAACs that utilize the EOSDIS Core System (ECS) to manage and archive their data. Since the ECS was originally designed, significant changes have taken place in technology, user expectations, and user requirements. The LP DAAC has therefore implemented additional systems to meet the evolving needs of scientific users, tailored to an integrated working environment. These systems provide a wide variety of services to improve data access and to enhance data usability through subsampling, reformatting, and reprojection, and they support the wide breadth of products handled by the LP DAAC. The LP DAAC is the primary archive for Landsat 7 Enhanced Thematic Mapper Plus (ETM+) data; it is the only facility in the United States that archives, processes, and distributes data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on NASA's Terra spacecraft; and it is responsible for the archive and distribution of "land products" generated from data acquired by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra and Aqua satellites.
Next-generation optical wireless communications for data centers
NASA Astrophysics Data System (ADS)
Arnon, Shlomi
2015-01-01
Data centers collect and process information with a capacity that has been increasing from year to year at an almost exponential pace. Traditional fiber/cable data center network interconnections suffer from bandwidth overload, as well as flexibility and scalability issues. Therefore, a technology-shift from the fiber and cable to wireless has already been initiated in order to meet the required data-rate, flexibility and scalability demands for next-generation data center network interconnects. In addition, the shift to wireless reduces the volume allocated to the cabling/fiber and increases the cooling efficiency. Optical wireless communication (OWC), or free space optics (FSO), is one of the most effective wireless technologies that could be used in future data centers and could provide ultra-high capacity, very high cyber security and minimum latency, due to the low index of refraction of air in comparison to fiber technologies. In this paper we review the main concepts and configurations for next generation OWC for data centers. Two families of technologies are reviewed: the first technology regards interconnects between rack units in the same rack and the second technology regards the data center network that connects the server top of rack (TOR) to the switch. A comparison between different network technologies is presented.
NASA Technical Reports Server (NTRS)
2000-01-01
The Earth Observing System (EOS) is an integral part of the National Aeronautics and Space Administration's (NASA's) Earth Science Enterprise (ESE). ESE is a long-term global change research program designed to improve our understanding of the Earth's interrelated processes involving the atmosphere, oceans, land surfaces, and polar regions. Data from EOS instruments and other Earth science measurement systems are useful in understanding the causes and processes of global climate change and the consequences of human activities. The EOS Data and Information System (EOSDIS) provides a structure for data management and user services for products derived from EOS satellite instruments and other NASA Earth science data. Within the EOSDIS framework, the Distributed Active Archive Centers (DAACs) have been established to provide expertise in one or more Earth science disciplines. The DAACs and cooperating data centers provide data and information services to support the global change research community. Much of the development of the DAACs has been in anticipation of the enormous amount of data expected from EOS instruments to be launched within the next two decades. Terra, the EOS flagship launched in December 1999, is the first of a series of EOS satellites to carry several instruments with multispectral capabilities. Some data products from these instruments are now available from several of the DAACs. These and other data products can be ordered through the EOS Data Gateway (EDG) and DAAC-specific online ordering systems.
ERIC Educational Resources Information Center
Appenzellar, Anne B.; Kelley, H. Paul
The Measurement and Evaluation Center of the University of Texas (Austin) conducted a validity study to assist the Department of Management Science and Information (DMSI) at the College of Business Administration in establishing a program of credit by examination for an introductory course in electronic data processing--Data Processing Analysis…
NASA Astrophysics Data System (ADS)
Czapski, Paweł
2016-07-01
We present the latest achievements of the Remote Sensing Division of the Institute of Aviation: an integrated solution covering the whole remote sensing process, from data acquisition to delivery of the required information to the end user. Currently, these tasks are partially performed by several centers in Poland; however, no single center provides an integrated solution. Motivated by this fact, the Earth Observation Mission Control Centre (EOMC2) was established in the Remote Sensing Division of the Institute of Aviation to provide such a comprehensive approach. The establishment of EOMC2 can be compared with the creation of the Aerial and Satellite Data Centre (OPOLIS) at the Institute of Geodesy and Cartography in Poland in the mid-1970s. OPOLIS was responsible for broadly defined data processing; it was a breakthrough innovation that initiated the use of aerial image analysis in Poland. An operations center will be created as part of the project, offering advantages over competing solutions, i.e.: • Centralization of the acquiring, processing, publishing and archiving of data, • Implementing elements of the INSPIRE directive recommendations on spatial data management, • Providing the end user with information in near real time, • Ability to supply the system with images of various origin (aerial, satellite, e.g. EUMETCast, Sentinel, Landsat) and diverse telemetry data, with data aggregation and the same algorithms applied to images obtained from different sources, • System reconfiguration and batch processing of large data sets at any time, • A wide range of potential applications: precision agriculture, environmental protection, crisis management and national security, and monitoring of aerial, small satellite and sounding rocket missions.
A radar data processing and enhancement system
NASA Technical Reports Server (NTRS)
Anderson, K. F.; Wrin, J. W.; James, R.
1986-01-01
This report describes the space position data processing system of the NASA Western Aeronautical Test Range. The system is installed at the Dryden Flight Research Facility of NASA Ames Research Center. This operational radar data system (RADATS) provides simultaneous data processing for multiple data inputs and tracking and antenna pointing outputs while performing real-time monitoring, control, and data enhancement functions. Experience in support of the space shuttle and aeronautical flight research missions is described, as well as the automated calibration and configuration functions of the system.
Breed, Greg A.; Golson, Emily A.; Tinker, M. Tim
2017-01-01
The home‐range concept is central in animal ecology and behavior, and numerous mechanistic models have been developed to understand home range formation and maintenance. These mechanistic models usually assume a single, contiguous home range. Here we describe and implement a simple home‐range model that can accommodate multiple home‐range centers, form complex shapes, allow discontinuities in use patterns, and infer how external and internal variables affect movement and use patterns. The model assumes individuals associate with two or more home‐range centers and move among them with some estimable probability. Movement in and around home‐range centers is governed by a two‐dimensional Ornstein‐Uhlenbeck process, while transitions between centers are modeled as a stochastic state‐switching process. We augmented this base model by introducing environmental and demographic covariates that modify transition probabilities between home‐range centers and can be estimated to provide insight into the movement process. We demonstrate the model using telemetry data from sea otters (Enhydra lutris) in California. The model was fit using a Bayesian Markov Chain Monte Carlo method, which estimated transition probabilities, as well as unique Ornstein‐Uhlenbeck diffusion and centralizing tendency parameters. Estimated parameters could then be used to simulate movement and space use that was virtually indistinguishable from real data. We used Deviance Information Criterion (DIC) scores to assess model fit and determined that both wind and reproductive status were predictive of transitions between home‐range centers. Females were less likely to move between home‐range centers on windy days, less likely to move between centers when tending pups, and much more likely to move between centers just after weaning a pup. These tendencies are predicted by theoretical movement rules but were not previously known and show that our model can extract meaningful behavioral insight from complex movement data.
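A minimal simulation sketch of the model class described above, with illustrative assumptions: two fixed centers, Euler-discretized 2-D Ornstein-Uhlenbeck steps, and a constant switching probability (the paper estimates these quantities via Bayesian MCMC and lets covariates modify the transitions).

```python
# Simulate movement around two home-range centers: a 2-D
# Ornstein-Uhlenbeck process with stochastic state switching between
# centers. All parameter values are illustrative assumptions.
import numpy as np

def simulate_track(n_steps=2000, dt=0.1, beta=0.5, sigma=1.0,
                   p_switch=0.005, seed=1):
    rng = np.random.default_rng(seed)
    centers = np.array([[0.0, 0.0], [20.0, 5.0]])  # two home-range centers
    state = 0                                      # index of current center
    pos = centers[state].copy()
    track = np.empty((n_steps, 2))
    for t in range(n_steps):
        if rng.random() < p_switch:                # state-switching process
            state = 1 - state
        drift = -beta * (pos - centers[state])     # centralizing tendency
        pos = pos + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=2)
        track[t] = pos
    return track

track = simulate_track()
print(track[:3])  # first few simulated positions
```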
Kozar, Mark D.; Kahle, Sue C.
2013-01-01
This report documents the standard procedures, policies, and field methods used by the U.S. Geological Survey’s (USGS) Washington Water Science Center staff for activities related to the collection, processing, analysis, storage, and publication of groundwater data. This groundwater quality-assurance plan changes through time to accommodate new methods and requirements developed by the Washington Water Science Center and the USGS Office of Groundwater. The plan is based largely on requirements and guidelines provided by the USGS Office of Groundwater, or the USGS Water Mission Area. Regular updates to this plan represent an integral part of the quality-assurance process. Because numerous policy memoranda have been issued by the Office of Groundwater since the previous groundwater quality assurance plan was written, this report is a substantial revision of the previous report, supplants it, and contains significant additional policies not covered in the previous report. This updated plan includes information related to the organization and responsibilities of USGS Washington Water Science Center staff, training, safety, project proposal development, project review procedures, data collection activities, data processing activities, report review procedures, and archiving of field data and interpretative information pertaining to groundwater flow models, borehole aquifer tests, and aquifer tests. Important updates from the previous groundwater quality assurance plan include: (1) procedures for documenting and archiving of groundwater flow models; (2) revisions to procedures and policies for the creation of sites in the Groundwater Site Inventory database; (3) adoption of new water-level forms to be used within the USGS Washington Water Science Center; (4) procedures for future creation of borehole geophysics, surface geophysics, and aquifer-test archives; and (5) use of the USGS Multi Optional Network Key Entry System software for entry of routine water-level data collected as part of long-term water-level monitoring networks.
Evolving a NASA Digital Object Identifiers System with Community Engagement
NASA Astrophysics Data System (ADS)
Wanchoo, L.; James, N.
2016-12-01
In 2010, NASA's Earth Science Data and Information System (ESDIS) Project began investigating the assignment of unique identifiers to its suite of data products being stewarded at data centers distributed across the country. This process led to the use of Digital Object Identifiers (DOIs) and the development of an automated system for the registration of these DOIs. Since that time, the ESDIS DOI registration system has evolved to be fully functional with over 3000 publicly accessible DOIs and over 1000 being held in reserve status until the information required for registration is obtained. The goal is to assign DOIs to the entire 7000+ data collections under ESDIS management via its network of discipline-oriented data centers. A key factor in the successful evolution of the DOI registration system has been the incorporation of community input. Over the last 3 years, ESDIS has solicited community input for making the DOI registration process more efficient through three focus groups under NASA's Earth Science Data System Working Group (ESDSWG). These groups were largely composed of DOI submitters and data curators from the 12 data centers serving user communities of various science disciplines. The suggestions from these groups were formulated into recommendations for ESDIS consideration and implementation. This poster will describe the process and the activities of each focus group, their recommendations, and how these recommendations were implemented.
Creative user-centered visualization design for energy analysts and modelers.
Goodwin, Sarah; Dykes, Jason; Jones, Sara; Dillingham, Iain; Dove, Graham; Duffy, Alison; Kachkaev, Alexander; Slingsby, Aidan; Wood, Jo
2013-12-01
We enhance a user-centered design process with techniques that deliberately promote creativity to identify opportunities for the visualization of data generated by a major energy supplier. Visualization prototypes developed in this way prove effective in a situation whereby data sets are largely unknown and requirements open, enabling successful exploration of possibilities for visualization in Smart Home data analysis. The process gives rise to novel designs and design metaphors including data sculpting. It suggests: that the deliberate use of creativity techniques with data stakeholders is likely to contribute to successful, novel and effective solutions; that being explicit about creativity may contribute to designers developing creative solutions; that using creativity techniques early in the design process may result in a creative approach persisting throughout the process. The work constitutes the first systematic visualization design for a data-rich source that will be increasingly important to energy suppliers and consumers as Smart Meter technology is widely deployed. It is novel in explicitly employing creativity techniques at the requirements stage of visualization design and development, paving the way for further use and study of creativity methods in visualization design.
WFIRST: STScI Science Operations Center (SSOC) Activities and Plans
NASA Astrophysics Data System (ADS)
Gilbert, Karoline M.; STScI WFIRST Team
2018-01-01
The science operations for the WFIRST Mission will be distributed between Goddard Space Flight Center, the Space Telescope Science Institute (STScI), and the Infrared Processing and Analysis Center (IPAC). The STScI Science Operations Center (SSOC) will schedule and archive all WFIRST observations, will calibrate and produce pipeline-reduced data products for the Wide Field Instrument, and will support the astronomical community in planning WFI observations and analyzing WFI data. During the formulation phase, WFIRST team members at STScI have developed operations concepts for scheduling, data management, and the archive; have performed technical studies investigating the impact of WFIRST design choices on data quality and analysis; and have built simulation tools to aid the community in exploring WFIRST’s capabilities. We will highlight examples of each of these efforts.
Inside the Black Box: The Case Review Process of an Elder Abuse Forensic Center.
Navarro, Adria E; Wysong, Julia; DeLiema, Marguerite; Schwartz, Elizabeth L; Nichol, Michael B; Wilber, Kathleen H
2016-08-01
Preliminary evidence suggests that elder abuse forensic centers improve victim welfare by increasing necessary prosecutions and conservatorships and reducing the recurrence of protective service referrals. Center team members gather information and make decisions designed to protect clients and their assets, yet the collective process of how these case reviews are conducted remains unexamined. The purpose of this study is to present a model describing the interprofessional approach of investigation and response to financial exploitation (FE), a frequent and complex type of abuse of vulnerable adults. To develop an understanding of the case review process at the Los Angeles County Elder Abuse Forensic Center (Center), a quasi-Delphi field study approach was used involving direct observations of meetings, surveying team members, and review from the Center's Advisory Council. The goal of this iterative analysis was to understand the case review process for suspected FE in Los Angeles County. A process map of key forensic center elements was developed that may be useful for replication in other settings. The process map includes: (a) multidisciplinary data collection, (b) key decisions for consideration, and (c) strategic actions utilized by an interprofessional team focused on elder justice. Elder justice relies on a complex system of providers. Elder abuse forensic centers provide a process designed to efficiently address client safety, client welfare, and protection of assets. Study findings provide a process map that may help other communities replicate an established multidisciplinary team, one experienced with justice system outcomes designed to protect FE victims.
Data Management Facility Operations Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keck, Nicole N
2014-06-30
The Data Management Facility (DMF) is the data center that houses several critical Atmospheric Radiation Measurement (ARM) Climate Research Facility services, including first-level data processing for the ARM Mobile Facilities (AMFs), Eastern North Atlantic (ENA), North Slope of Alaska (NSA), Southern Great Plains (SGP), and Tropical Western Pacific (TWP) sites, as well as Value-Added Product (VAP) processing, development systems, and other network services.
Research and Technology Report. Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Soffen, Gerald (Editor); Truszkowski, Walter (Editor); Ottenstein, Howard (Editor); Frost, Kenneth (Editor); Maran, Stephen (Editor); Walter, Lou (Editor); Brown, Mitch (Editor)
1996-01-01
This issue of Goddard Space Flight Center's annual report highlights the importance of mission operations and data systems covering mission planning and operations; TDRSS, positioning systems, and orbit determination; ground system and networks, hardware and software; data processing and analysis; and World Wide Web use. The report also includes flight projects, space sciences, Earth system science, and engineering and materials.
The George C. Marshall Space Flight Center High Reynolds Number Wind Tunnel Technical Handbook
NASA Technical Reports Server (NTRS)
Gwin, H. S.
1975-01-01
The High Reynolds Number Wind Tunnel at the George C. Marshall Space Flight Center is described. The following items are presented to illustrate the operation and capabilities of the facility: facility descriptions and specifications, operational and performance characteristics, model design criteria, instrumentation and data recording equipment, data processing and presentation, and preliminary test information required.
ERIC Educational Resources Information Center
Natriello, Gary
This review of the current data collection activities of the National Center for Education Statistics (NCES) is divided into two parts: a section of major recommendations applying to NCES plans in general, and a section reviewing each data collection activity and presenting specific suggestions. Section I recommends that NCES should: (1)…
Sadot, Dan; Dorman, G; Gorshtein, Albert; Sonkin, Eduard; Vidal, Or
2015-01-26
112Gbit/sec DSP-based single channel transmission of PAM4 at 56Gbaud over 15GHz of effective analog bandwidth is experimentally demonstrated. The DSP enables use of mature 25G optoelectronics for 2-10km datacenter intra-connections, and 8Tbit/sec over 80km interconnections between data centers.
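The headline rate follows directly from the modulation format: PAM4 uses four amplitude levels and therefore carries two bits per symbol.

```latex
% PAM4 carries log2(4) = 2 bits per symbol:
R_b = 2\,\tfrac{\text{bits}}{\text{symbol}} \times 56\,\text{Gbaud}
    = 112\,\text{Gbit/s}
```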
Tseng, Jocelyn; Samagh, Sonia; Fraser, Donna; Landman, Adam B
2018-06-01
Despite considerable investment in digital health (DH) companies and a growing DH ecosystem, there are multiple challenges to testing and implementing innovative solutions. Health systems have recognized the potential of DH and have formed DH innovation centers. However, limited information is available on DH innovation center processes, best practices, or outcomes. This case report describes a DH innovation center process that can be replicated across health systems and defines and benchmarks process indicators to assess DH innovation center performance. The Brigham and Women's Hospital's Digital Health Innovation Group (DHIG) accelerates DH innovations from idea to pilot safely and efficiently using a structured process. Fifty-four DH innovations were accelerated by the DHIG process between July 2014 and December 2016. In order to measure effectiveness of the DHIG process, key process indicators were defined as 1) number of solutions that completed each DHIG phase and 2) length of time to complete each phase. Twenty-three DH innovations progressed to pilot stage and 13 innovations were terminated after barriers to pilot implementation were identified by the DHIG process. For 4 DH solutions that executed a pilot, the average time for innovations to proceed from DHIG intake to pilot initiation was 9 months. Overall, the DHIG is a reproducible process that addresses key roadblocks in DH innovation within health systems. To our knowledge, this is the first report to describe DH innovation process indicators and results within an academic health system. Therefore, there is no published data to compare our results with the results of other DH innovation centers. Standardized data collection and indicator reporting could allow benchmark comparisons across institutions. Additional opportunities exist for the validation of DH solution effectiveness and for translational support from pilot to implementation. These are critical steps to advance DH technologies and effectively leverage the DH ecosystem to transform healthcare.
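A minimal sketch of computing the two process indicators named above from a table of innovation records; the field names, phase names, and example data are hypothetical stand-ins for the DHIG's actual records.

```python
# Compute indicator (1), number of solutions completing each phase, and
# indicator (2), mean time spent per phase. Records are hypothetical.
from collections import defaultdict

records = [
    {"id": 1, "phases": {"intake": 30, "vetting": 60, "pilot": 180}},
    {"id": 2, "phases": {"intake": 20, "vetting": 45}},  # stopped early
    {"id": 3, "phases": {"intake": 25, "vetting": 90, "pilot": 240}},
]

completed = defaultdict(int)    # indicator 1: completions per phase
durations = defaultdict(list)   # indicator 2: days spent in each phase

for rec in records:
    for phase, days in rec["phases"].items():
        completed[phase] += 1
        durations[phase].append(days)

for phase in completed:
    mean_days = sum(durations[phase]) / len(durations[phase])
    print(f"{phase}: {completed[phase]} completed, {mean_days:.0f} days avg")
```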
Development of a fast framing detector for electron microscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Ian J.; Bustillo, Karen C.; Ciston, Jim
2016-10-01
A high frame rate detector system is described that enables fast real-time data analysis of scanning diffraction experiments in scanning transmission electron microscopy (STEM). This is an end-to-end development that encompasses the data-producing detector, data transportation, and real-time processing of data. The detector will consist of a central pixel sensor that is surrounded by annular silicon diodes. Both components of the detector system will synchronously capture data at almost 100 kHz frame rate, which produces an approximately 400 Gb/s data stream. Low-level preprocessing will be implemented in firmware before the data is streamed from the National Center for Electron Microscopy (NCEM) to the National Energy Research Scientific Computing Center (NERSC). Live data processing, before it lands on disk, will happen on the Cori supercomputer and aims to present scientists with prompt experimental feedback. This online analysis will provide rough information of the sample that can be utilized for sample alignment, sample monitoring and verification that the experiment is set up correctly. Only a compressed version of the relevant data is then selected for more in-depth processing.
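As a sanity check on the quoted numbers, the per-frame payload implied by the abstract (the frame geometry itself is not stated, so this figure is inferred, not given):

```latex
% ~400 Gb/s at ~100 kHz frame rate implies roughly
\frac{\approx 400\,\text{Gb/s}}{\approx 10^{5}\,\text{frames/s}}
  \approx 4\,\text{Mbit per frame}
```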
Towards Portable Large-Scale Image Processing with High-Performance Computing.
Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A
2018-05-03
High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from the system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
Southwest Watershed Research Center Data Access Project
NASA Astrophysics Data System (ADS)
Nichols, M. H.; Anson, E.
2008-05-01
Hydrologic data, including rainfall and runoff data, have been collected on experimental watersheds operated by the U.S. Department of Agriculture Agricultural Research Service (USDA-ARS) in southern Arizona since the 1950s. These data are of national and international importance and make up one of the most comprehensive semiarid watershed data sets in the world. The USDA-ARS Southwest Watershed Research Center has recently developed an electronic data processing system that includes an online interface (http://tucson.ars.ag.gov/dap) to provide public access to the data. The goal of the system is to promote analyses and interpretations of historic and current data by improving data access. Data are collected from sensors in the field and are transmitted to computers in the office. The data are then processed, quality checked, and made available to users via the Internet. The publicly accessible part of the system consists of an interactive Web site, which provides an interface to the data, and a relational database, which is used to process, store, and manage data. The system was released to the public in October 2003, and since that time the online data access Web site has received more than 4500 visitors.
PILOT: An intelligent distributed operations support system
NASA Technical Reports Server (NTRS)
Rasmussen, Arthur N.
1993-01-01
The Real-Time Data System (RTDS) project is exploring the application of advanced technologies to the real-time flight operations environment of the Mission Control Centers at NASA's Johnson Space Center. The system, based on a network of engineering workstations, provides services such as delivery of real time telemetry data to flight control applications. To automate the operation of this complex distributed environment, a facility called PILOT (Process Integrity Level and Operation Tracker) is being developed. PILOT comprises a set of distributed agents cooperating with a rule-based expert system; together they monitor process operation and data flows throughout the RTDS network. The goal of PILOT is to provide unattended management and automated operation under user control.
DORIS and GNSS processing at CNES/CLS for the contribution to the next ITRF2013
NASA Astrophysics Data System (ADS)
Loyer, Sylvain; Capdeville, Hugues; Soudarin, Laurent; Mezerette, Adrien; Lemoine, Jean-Michel; Mercier, Flavien; Perosanz, Felix
2014-05-01
CNES serves as Analysis Center in the International DORIS Service (IDS) and the International GNSS Service (IGS). DORIS and GNSS data are processed by its subsidiary CLS with the GRGS package software GINS/DYNAMO. For the contribution to the next release of the International Terrestrial Reference Frame planned this year (ITRF2013), two decades of data were analyzed (1993-2013 for DORIS, 1998-2013 for GPS, and 2009-2013 for GLONASS). In this context, the CNES/CLS Analysis Centers provided SINEX solutions to the IDS and IGS Combination Centers, respectively multi-satellite weekly solutions and daily solutions. Normal equations derived from this analysis are also made available to the GRGS Combination Center for the combination at the observation level of the geodetic parameters measured by DORIS, GPS, SLR and VLBI techniques. The purpose of this presentation is to point out how the overall quality of the DORIS and GNSS data processing benefits from the use of the same software and a common basis of models. Here, we present the modeling standards, the networks and the processing strategies. Assessments of some models are also discussed. The quality and the homogeneity of the products (orbits, station coordinates and Earth Orientation Parameters) over the complete period are shown, as well as the temporal variations of some parameters (dynamical parameters, orbit residuals, internal orbit overlaps ...). Some examples of time series of DORIS and GNSS station positions at collocated sites complete this presentation.
Improving Access to NASA Earth Science Data through Collaborative Metadata Curation
NASA Astrophysics Data System (ADS)
Sisco, A. W.; Bugbee, K.; Shum, D.; Baynes, K.; Dixon, V.; Ramachandran, R.
2017-12-01
The NASA-developed Common Metadata Repository (CMR) is a high-performance metadata system that currently catalogs over 375 million Earth science metadata records. It serves as the authoritative metadata management system of NASA's Earth Observing System Data and Information System (EOSDIS), enabling NASA Earth science data to be discovered and accessed by a worldwide user community. The size of the EOSDIS data archive is steadily increasing, and the ability to manage and query this archive depends on the input of high quality metadata to the CMR. Metadata that does not provide adequate descriptive information diminishes the CMR's ability to effectively find and serve data to users. To address this issue, an innovative and collaborative review process is underway to systematically improve the completeness, consistency, and accuracy of metadata for approximately 7,000 data sets archived by NASA's twelve EOSDIS data centers, or Distributed Active Archive Centers (DAACs). The process involves automated and manual metadata assessment of both collection and granule records by a team of Earth science data specialists at NASA Marshall Space Flight Center. The team communicates results to DAAC personnel, who then make revisions and reingest improved metadata into the CMR. Implementation of this process relies on a network of interdisciplinary collaborators leveraging a variety of communication platforms and long-range planning strategies. Curating metadata at this scale and resolving metadata issues through community consensus improves the CMR's ability to serve current and future users and also introduces best practices for stewarding the next generation of Earth Observing System data. This presentation will detail the metadata curation process, its outcomes thus far, and also share the status of ongoing curation activities.
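A minimal sketch of the automated side of such an assessment, under stated assumptions: the record structure and required-field list below are illustrative, not the CMR's actual metadata schema.

```python
# Flag metadata records missing required descriptive fields.
# Field list and example records are illustrative, not the CMR schema.
REQUIRED = ["title", "abstract", "temporal_extent", "spatial_extent", "doi"]

def assess(record: dict) -> list[str]:
    """Return the required fields that are missing or empty."""
    return [f for f in REQUIRED if not record.get(f)]

records = [
    {"id": "C001", "title": "Sample", "abstract": "", "doi": "10.xxxx/1"},
    {"id": "C002", "title": "Other", "abstract": "ok",
     "temporal_extent": "2000/2010", "spatial_extent": "global",
     "doi": "10.xxxx/2"},
]
for rec in records:
    missing = assess(rec)
    print(rec["id"], "OK" if not missing else f"missing: {missing}")
```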
Bulgarian National Digital Seismological Network
NASA Astrophysics Data System (ADS)
Dimitrova, L.; Solakov, D.; Nikolova, S.; Stoyanov, S.; Simeonova, S.; Zimakov, L. G.; Khaikin, L.
2011-12-01
The Bulgarian National Digital Seismological Network (BNDSN) consists of a National Data Center (NDC), 13 stations equipped with RefTek High Resolution Broadband Seismic Recorders (model DAS 130-01/3), and 1 station equipped with a Quanterra 680 and broadband sensors and accelerometers. Real-time data transfer from the seismic stations to the NDC is realized via a Virtual Private Network of the Bulgarian Telecommunication Company. Communication interruptions do not cause any data loss at the NDC: the data are backed up in the field station recorder's 4 MB RAM memory and retransmitted to the NDC immediately after the communication link is re-established. The recorders are equipped with two compact flash disks able to store more than one month of data, which can be downloaded remotely using FTP. Data acquisition and processing hardware redundancy at the NDC is achieved by two clustered SUN servers and two Blade workstations. To secure the acquisition, processing, and data storage processes, a three-layer local network is designed at the NDC. Real-time data acquisition is performed using REFTEK's full-duplex error-correction protocol RTPD. Data from the Quanterra recorder and foreign stations are fed into RTPD in real time via the SeisComP/SeedLink protocol. Using SeisComP/SeedLink software, the NDC transfers real-time data to INGV-Roma, NEIC-USA, and the ORFEUS Data Center. Regional real-time data exchange with Romania, Macedonia, Serbia, and Greece is also established at the NDC. Data processing is performed by the Seismic Network Data Processor (SNDP) software package running on both servers. SNDP includes three subsystems: a real-time subsystem (RTS_SNDP) for signal detection, evaluation of signal parameters, phase identification and association, and source estimation; a seismic analysis subsystem (SAS_SNDP) for interactive data processing; and an early warning subsystem (EWS_SNDP) based on the first-arriving P-phases. The signal detection process is performed by a traditional STA/LTA detection algorithm, with the detectors' filter parameters defined on the basis of previously evaluated ambient noise at the seismic stations. Extra modules for network command/control, state-of-health network monitoring, and data archiving run in the National Data Center as well. Three types of archives are produced at the NDC: two continuous (miniSEED format and RefTek PASSCAL format) and one event-oriented in CSS3.0 scheme format. The modern digital equipment and broadband seismometers installed at the Bulgarian seismic stations, and the careful selection of software packages for automatic and interactive data processing in the data center, proved to be a suitable choice for the purposes of the BNDSN and NDC: • to ensure reliable automatic localization of seismic events and rapid notification of governmental authorities in case of felt earthquakes on the territory of Bulgaria; • to provide a modern basis for seismological studies in Bulgaria.
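A minimal sketch of the STA/LTA trigger mentioned above, with illustrative window lengths and threshold (the BNDSN detectors tune these parameters per station from ambient-noise studies):

```python
# Classic STA/LTA detection: trigger when the ratio of a short-term
# average to a long-term average of signal energy exceeds a threshold.
# Window lengths and threshold here are illustrative assumptions.
import numpy as np

def sta_lta(trace: np.ndarray, fs: float, sta_s=1.0, lta_s=30.0,
            threshold=4.0) -> np.ndarray:
    """Return sample indices where the STA/LTA ratio crosses the threshold."""
    energy = trace ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    ratio = sta / np.maximum(lta, 1e-12)
    crossings = (ratio[1:] >= threshold) & (ratio[:-1] < threshold)
    return np.flatnonzero(crossings) + 1

if __name__ == "__main__":
    fs = 100.0
    trace = np.random.default_rng(2).normal(0, 1, int(120 * fs))
    trace[6000:6400] += 20 * np.sin(np.linspace(0, 40 * np.pi, 400))  # "event"
    print("triggers at samples:", sta_lta(trace, fs)[:5])
```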
NASA Astrophysics Data System (ADS)
Beach, A. L., III; Northup, E. A.; Early, A. B.; Chen, G.
2016-12-01
Airborne field studies are an effective way to gain a detailed understanding of atmospheric processes for scientific research on climate change and air quality relevant issues. One major function of airborne project data management is to maintain seamless data access within the science team. This allows individual instrument principal investigators (PIs) to process and validate their own data, which requires analysis of data sets from other PIs (or instruments). The project's web platform streamlines data ingest, distribution processes, and data format validation. In May 2016, the NASA Langley Research Center (LaRC) Atmospheric Science Data Center (ASDC) developed a new data management capability to help support the Korea U.S.-Air Quality (KORUS-AQ) science team. This effort is aimed at providing direct NASA Distributed Active Archive Center (DAAC) support to an airborne field study. Working closely with the science team, the ASDC developed a scalable architecture that allows investigators to easily upload and distribute their data and documentation within a secure collaborative environment. The user interface leverages modern design elements to intuitively guide the PI through each step of the data management process. In addition, the new framework creates an abstraction layer between how the data files are stored and how the data itself is organized (i.e., grouping files by PI). This approach makes it easy for PIs to simply transfer their data to one directory, while the system itself can automatically group/sort data as needed. Moreover, the platform is "server agnostic" to a certain degree, making deployment and customization more straightforward as hardware needs change. This flexible design will improve development efficiency and can be leveraged for future field campaigns. This presentation will examine the KORUS-AQ data portal as a scalable solution that applies consistent and intuitive usability design practices to support ingest and management of airborne data.
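A minimal sketch of the storage/organization abstraction described above, under stated assumptions: a flat upload directory with sidecar metadata files, and hypothetical path and field names (not the ASDC portal's actual layout).

```python
# Decouple physical file layout from logical organization: files land in
# one upload directory; grouping (here, by PI) comes from metadata.
# Directory layout and metadata fields are hypothetical.
from collections import defaultdict
from pathlib import Path
import json

def group_by_pi(upload_dir: Path) -> dict[str, list[Path]]:
    """Group uploaded data files by PI using sidecar metadata files."""
    groups: dict[str, list[Path]] = defaultdict(list)
    for meta_path in upload_dir.glob("*.meta.json"):
        meta = json.loads(meta_path.read_text())
        groups[meta["pi"]].append(upload_dir / meta["filename"])
    return groups

# Usage: every upload "X.ict" is accompanied by "X.meta.json" containing
# at least {"filename": "X.ict", "pi": "..."}.
```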
NLSI Focus Group on Missing ALSEP Data Recovery: Progress and Plans
NASA Technical Reports Server (NTRS)
Lewis, L. R.; Nakamura, Y.; Nagihara, S.; Williams, D. R.; Chi, P.; Taylor, P. T.; Schmidt, G. K.; Grayzeck, E. J.
2011-01-01
On the six Apollo landed missions, the astronauts deployed the Apollo Lunar Surface Experiments Package (ALSEP) science stations, which measured active and passive seismic events, magnetic fields, charged particles, solar wind, heat flow, the diffuse atmosphere, meteorites and their ejecta, lunar dust, etc. Today's scientists are able to extract new information and make new discoveries from the old ALSEP data utilizing recent advances in computer capabilities and new analysis techniques. However, current-day investigators are encountering problems trying to use the ALSEP data. In 2007, archivists from the NASA Goddard Space Flight Center (GSFC) National Space Science Data Center (NSSDC) estimated that only about 50 percent of the processed ALSEP lunar surface data of interest to current lunar science investigators were in the NSSDC archives. Current-day lunar science investigators found that most of the ALSEP data then in the NSSDC archives were extremely difficult to use. The data were in forms often not well described in the published reports, and rerecording anomalies existed in the data which could only be resolved by tape experts. To resolve this problem, the PDS Lunar Data Node was established in 2008 at NSSDC and is in the process of successfully making the existing archived ALSEP data available to current-day investigators in easily usable forms. In July of 2010 the NASA Lunar Science Institute (NLSI) at Ames Research Center established the Recovery of Missing ALSEP Data Focus Group in recognition of the importance of the current activities to find the raw and processed ALSEP data missing from the NSSDC archives.
Detecting Suspended Sediments from Remote Sensed Data in the Northern Gulf of Mexico
NASA Astrophysics Data System (ADS)
Hardin, D. M.; Graves, S. J.; Hawkins, L.; He, M.; Smith, T.; Drewry, M.; Ebersole, S.; Travis, A.; Thorn, J.; Brown, B.
2012-12-01
The Sediment Analysis Network for Decision Support (SANDS) project utilized remotely sensed data from Landsat and MODIS, acquired both prior to and following storm landfall, to investigate suspended sediment and sediment redistribution. The satellite imagery was enhanced by applying a combination of cluster-busting and classification techniques to color and infrared bands. Results from the process show patterns associated with sediment transport and deposition related to coastal processes, storm-related sediment transport, post-storm pollutant transport, and sediment-current interactions. Pre- and post-landfall imagery are shown to the left for Landsat and to the right for MODIS. Scientific analysis and production of enhanced imagery were conducted by the Geological Survey of Alabama. The Information Technology and Systems Center at the University of Alabama in Huntsville was responsible for data acquisition, development of the SANDS data portal, and archive and distribution through the Global Hydrology Resource Center, one of NASA's Earth Science Data Centers. SANDS data may be obtained from the GHRC at ghrc.nsstc.nasa.gov and from the SANDS data portal at sands.itsc.uah.edu. This project was funded by the NASA Applied Sciences Division.
Ponce, David A.
1997-01-01
Gravity data for the entire state of Nevada and adjacent parts of California, Utah, and Arizona are available on this CD-ROM. About 80,000 gravity stations were compiled, primarily from the National Geophysical Data Center and the U.S. Geological Survey. Gravity data were reduced to the Geodetic Reference System of 1967 and adjusted to the Gravity Standardization Net 1971 gravity datum. Data were processed to complete Bouguer and isostatic gravity anomalies by applying standard gravity corrections, including terrain and isostatic corrections. Selected principal fact references and a list of sources for data from the National Geophysical Data Center are included.
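For illustration, the simple Bouguer slab correction that enters such a reduction, as a worked formula using the standard reduction density (the processing described above also applies terrain and isostatic corrections):

```latex
% Bouguer slab correction for a station at elevation h, with the
% standard crustal reduction density rho = 2670 kg/m^3:
\Delta g_B = 2\pi G \rho h \approx 0.1119\ \text{mGal/m} \times h
```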
ERIC Educational Resources Information Center
Florida State Community Coll. Coordinating Board, Tallahassee.
In 1987-88, the Florida State Board of Community Colleges and the Division of Vocational, Adult, and Community Education jointly conducted a review of instructional programs in computer science and data processing in order to determine needs for state policy changes and funding priorities. The process involved a review of printed resources on…
The Joint Distribution Process Analysis Center (JDPAC): Background and Current Capability
2007-06-12
Briefing outline (fragments): systems integration and data management; JDDE analysis and global distribution performance assessment; futures/transformation analysis; balancing operational art and science; JDPAC "101"; USTRANSCOM Future Operations Center; SDDC-TEA; Army SES (dual hat); transportability engineering; other Title 10 functions.
Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center
NASA Astrophysics Data System (ADS)
Ruppert, N. A.; Hansen, R. A.
2007-05-01
The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, the largest being an M6.6 event in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform allows an integrated system complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit particular needs of the AEIC. Real-time earthquake locations and magnitudes are determined within 2-5 minutes of event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, the ANSS website via QDDS submissions, and through e-mail, cell phone and pager notifications, fax broadcasts, and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0 and posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted at critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.
NASA Technical Reports Server (NTRS)
Goodman, S. J.; Lapenta, W.; Jedlovec, G.; Dodge, J.; Bradshaw, T.
2003-01-01
The NASA Short-term Prediction Research and Transition (SPoRT) Center in Huntsville, Alabama was created to accelerate the infusion of NASA earth science observations, data assimilation and modeling research into NWS forecast operations and decision-making. The principal focus of experimental products is on the regional scale with an emphasis on forecast improvements on a time scale of 0-24 hours. The SPoRT Center research is aligned with the regional prediction objectives of the US Weather Research Program dealing with 0-1 day forecast issues ranging from convective initiation to 24-hr quantitative precipitation forecasting. The SPoRT Center, together with its other interagency partners, universities, and the NASA/NOAA Joint Center for Satellite Data Assimilation, provides a means and a process to effectively transition NASA Earth Science Enterprise observations and technology to National Weather Service operations and decision makers at both the global/national and regional scales. This paper describes the process for the transition of experimental products into forecast operations, current products undergoing assessment by forecasters, and plans for the future.
NASA's Earth Observing Data and Information System
NASA Technical Reports Server (NTRS)
Mitchell, Andrew E.; Behnke, Jeanne; Lowe, Dawn; Ramapriyan, H. K.
2009-01-01
NASA's Earth Observing System Data and Information System (EOSDIS) has been a central component of NASA's Earth observation program for over 10 years. It is one of the largest civilian science information systems in the US, performing ingest, archive, and distribution of over 3 terabytes of data per day, much of which is from NASA's flagship missions Terra, Aqua, and Aura. The system supports a variety of science disciplines including polar processes, land cover change, radiation budget, and most especially global climate change. The EOSDIS data centers, collocated with centers of science discipline expertise, archive and distribute standard data products produced by science investigator-led processing systems. Key to the success of EOSDIS is the concept of core versus community requirements: EOSDIS supports a core set of services to meet specific NASA needs and relies on community-developed services to meet specific user needs. EOSDIS offers a metadata registry, ECHO (Earth Observing System Clearinghouse), through which the scientific community can easily discover and exchange NASA's Earth science data and services. Users can search, manage, and access the contents of ECHO's registries (data and services) through user-developed and community-tailored interfaces or clients. The ECHO framework has become the primary access point for cross-data-center search and order of EOSDIS and other Earth science data holdings archived at the EOSDIS data centers. ECHO's Warehouse Inventory Search Tool (WIST) is the primary web-based client for discovering and ordering cross-discipline data from the EOSDIS data centers. The architecture of EOSDIS provides a platform for the publication, discovery, understanding, and access to NASA's Earth observation resources and allows for easy integration of new datasets. EOSDIS has also developed several methods for incorporating socioeconomic data into its data collection. Over the years, several methods have been developed for determining the needs of the user community, including use of the American Customer Satisfaction Index and a broad metrics program.
A visual analytic framework for data fusion in investigative intelligence
NASA Astrophysics Data System (ADS)
Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David
2014-05-01
Intelligence analysis depends on data fusion systems for capabilities to detect and track important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework towards developing an alternative data fusion architecture. This idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.
NASA Technical Reports Server (NTRS)
1974-01-01
The specifications and functions of the Central Data Processing Facility (CDPF) which supports the Earth Observatory Satellite (EOS) are discussed. The CDPF will receive the EOS sensor data and spacecraft data through the Spaceflight Tracking and Data Network (STDN) and the Operations Control Center (OCC). The CDPF will process the data and produce high density digital tapes, computer compatible tapes, film and paper print images, and other data products. The specific aspects of data inputs and data processing are identified. A block diagram of the CDPF showing the data flow and interfaces of the subsystems is provided.
NASA Technical Reports Server (NTRS)
Orcutt, John M.; Brenton, James C.
2016-01-01
An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m at each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive that consists of one-minute averaged measurements for the period of record of January 2011 - April 2015. However, before the received database could be used, EV44 needed to remove any erroneous data from within the database through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC, but has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor, removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements; the selection process for the upwind sensor implemented a study of tower-induced turbulence. This paper describes in detail the QC process, the QC results, and the attributes of the LPS towers meteorological database.
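The per-sensor checks outlined above follow a common pattern that is easy to illustrate. The following is a minimal Python sketch, assuming one-minute records in a pandas DataFrame and using purely illustrative thresholds, not the actual EV44 limits:

```python
# Minimal sketch of the per-sensor QC checks described above. The column
# name, window length, and thresholds are illustrative assumptions.
import pandas as pd

def range_check(s: pd.Series, lo: float, hi: float) -> pd.Series:
    """Flag values outside a physically realistic range."""
    return (s < lo) | (s > hi)

def temporal_check(s: pd.Series, max_step: float) -> pd.Series:
    """Flag one-minute jumps larger than max_step (temporal consistency)."""
    return s.diff().abs() > max_step

def constant_check(s: pd.Series, window: int = 60) -> pd.Series:
    """Flag a sensor that erroneously reports a constant value for `window` minutes."""
    return s.rolling(window).std() == 0.0

def qc_temperature(df: pd.DataFrame) -> pd.DataFrame:
    t = df["temperature_C"]
    bad = range_check(t, -20.0, 45.0) | temporal_check(t, 5.0) | constant_check(t)
    out = df.copy()
    out.loc[bad, "temperature_C"] = float("nan")  # remove erroneous data
    return out
```

The cross-sensor, climatology, and vertical-consistency checks would follow the same flag-and-blank pattern, comparing series across the three sensors per level and across levels per tower.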
Krueger, Ute; Schimmelpfeng, Katja
2013-03-01
A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives. Therefore, it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered as a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This rests on the underlying assumption of each caller's independent decision to call or not to call. In the case of an emergency, however, there are often calls from more than one person reporting the same incident, and thus these calls are not independent. Therefore, this paper focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as being superior to the Poisson distribution in describing the call arrival rate, and the Weibull distribution as more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
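This kind of distribution comparison can be reproduced in outline with scipy. In the hedged sketch below, the Pólya distribution is fit in its common negative-binomial form, and the counts and interarrival times are synthetic stand-ins, not the Cottbus data:

```python
# Illustrative comparison of the candidate distributions on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
counts = rng.negative_binomial(5, 0.4, size=1000)   # calls per interval (synthetic)
interarrival = rng.weibull(0.8, size=1000) * 60.0   # seconds between calls (synthetic)

# Poisson fit: the MLE of the rate is the sample mean.
ll_poisson = stats.poisson.logpmf(counts, counts.mean()).sum()

# Polya / negative-binomial fit via moment matching (variance > mean, i.e. overdispersion).
mean, var = counts.mean(), counts.var()
p = mean / var
n = mean * p / (1 - p)
ll_nbinom = stats.nbinom.logpmf(counts, n, p).sum()

# Exponential vs Weibull for interarrival times (MLE fits, location fixed at 0).
ll_expon = stats.expon.logpdf(interarrival, *stats.expon.fit(interarrival, floc=0)).sum()
ll_weibull = stats.weibull_min.logpdf(interarrival, *stats.weibull_min.fit(interarrival, floc=0)).sum()

print(f"log-likelihood: Poisson {ll_poisson:.1f} vs Polya/neg-binomial {ll_nbinom:.1f}")
print(f"log-likelihood: exponential {ll_expon:.1f} vs Weibull {ll_weibull:.1f}")
```

On dependent (overdispersed) arrivals, the negative-binomial and Weibull log-likelihoods come out higher, which mirrors the paper's finding.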
Engine Icing Data - An Analytics Approach
NASA Technical Reports Server (NTRS)
Fitzgerald, Brooke A.; Flegel, Ashlie B.
2017-01-01
Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible, and small mistakes are easily overlooked. Spreadsheet-style analysis is also time-inefficient: the same formatting, processing, and plotting procedure has to be repeated for every dataset, which leads to researchers performing the same tedious data munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool written in Python, hosted in a Jupyter notebook, that vastly simplifies the analysis process. Given the file path of any folder containing time series datasets, this tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with a click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across data sets and integrating video data (both extremely difficult with spreadsheet-style programs) are greatly simplified in this tool. This tool has also gathered interest outside the engine icing branch, and will be used by researchers across NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
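The batch-loading pattern at the core of such a tool is straightforward to sketch. The fragment below is illustrative only, not the actual NASA Glenn code; the CSV layout and column names are assumptions:

```python
# Sketch of the batch-load-and-concatenate pattern described above:
# load every dataset in a folder in parallel, tag each with its source,
# and combine into one frame for interactive comparison.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path
import pandas as pd

def load_run(path: Path) -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["time"])  # assumed time column
    df["run"] = path.stem                         # tag rows with their source dataset
    return df

def load_folder(folder: str) -> pd.DataFrame:
    paths = sorted(Path(folder).glob("*.csv"))
    with ProcessPoolExecutor() as pool:           # process datasets in parallel
        frames = list(pool.map(load_run, paths))
    return pd.concat(frames, ignore_index=True)

# e.g. runs = load_folder("psl_test_data/") and then plot column subsets,
# grouped by "run"; the tool wraps this step in Jupyter widgets.
```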
A Framework for WWW Query Processing
NASA Technical Reports Server (NTRS)
Wu, Binghui Helen; Wharton, Stephen (Technical Monitor)
2000-01-01
Query processing is the most common operation in a DBMS. Sophisticated query processing has mainly targeted single-enterprise environments providing centralized control over data and metadata. Query submission by anonymous users on the web differs in that load balancing and DBMS access control become the key issues. This paper provides a solution by introducing a framework for WWW query processing. The success of this framework lies in the utilization of query optimization techniques and the ontological approach. This methodology has proved to be cost effective at the NASA Goddard Space Flight Center Distributed Active Archive Center (GDAAC).
NASA Technical Reports Server (NTRS)
Angelici, Gary; Popovici, Lidia; Skiles, Jay
1991-01-01
The Pilot Land Data System (PLDS) is a data and information system serving NASA-supported investigators in the land science community. The three nodes of the PLDS, one each at the Ames Research Center (ARC), the Goddard Space Flight Center (GSFC), and the Jet Propulsion Laboratory (JPL), cooperate in providing consistent information describing the various data holdings and in providing the hardware and software (accessible via network and modem) that supply information about, and access to, PLDS-held data available for distribution. A major new activity of the PLDS node at the Ames Research Center involves the interaction of the PLDS with an active NASA ecosystem science project, the Oregon Transect Ecosystem Research (OTTER) project; this interaction involves the management of, access to, and distribution of the large volume of widely-varying aircraft data collected by OTTER. The OTTER project is managed by researchers at the Ames Research Center and Oregon State University. Its principal objective is to estimate major fluxes of carbon, nitrogen, and water of forest ecosystems using an ecosystem process model driven by remote sensing data. Ten researchers at NASA centers and universities are analyzing data for six sites along a temperature-moisture gradient across the western half of central Oregon (called the Oregon Transect). Sensors mounted on six different aircraft have acquired data over the Oregon Transect in support of the OTTER project.
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Berger, Katharina; Toussaint, Frank
2014-05-01
The integration of well-established legacy data centers into newly developed data federation infrastructures is a key requirement for enhancing climate data access based on widely agreed interfaces. We present the approach taken to integrate the ICSU World Data Center for Climate (WDCC), located in Hamburg, Germany, into the European ENES climate data federation, which is part of the international ESGF data federation. The ENES / ESGF data federation hosts petabytes of climate model data and provides scalable data search and access services across the worldwide distributed data centers. Part of the data provided by the ENES / ESGF data federation is also long-term archived and curated at the WDCC data archive, allowing, for example, DOI-based data citation. Integration of the WDCC into the ENES / ESGF federation allows end users to search and access WDCC data using consistent interfaces worldwide. We summarize the integration approach taken for the WDCC legacy system and the ESGF infrastructure. On the technical side, we describe the provisioning of ESGF-consistent metadata and data interfaces as well as the adoption of the security infrastructure. On the non-technical side, we describe our experiences in integrating a long-term archival center, with its costly quality-assurance procedures, into a distributed data federation that puts emphasis on providing early and consistent data search and access services to scientists. These experiences were gained in the process of curating ESGF-hosted CMIP5 data at the WDCC. Approximately one petabyte of CMIP5 data used for the IPCC climate report is being replicated and archived at the WDCC.
The Value of Metrics for Science Data Center Management
NASA Astrophysics Data System (ADS)
Moses, J.; Behnke, J.; Watts, T. H.; Lu, Y.
2005-12-01
The Earth Observing System Data and Information System (EOSDIS) has been collecting and analyzing records of science data archive, processing, and product distribution for more than 10 years. The types of information collected and the analyses performed have matured and progressed to become an integral and necessary part of the system management and planning functions. Science data center managers are realizing the importance that metrics can play in influencing and validating their business model. New efforts focus on better understanding of users and their methods. Examples include tracking user web site interactions and conducting user surveys such as the government-authorized American Customer Satisfaction Index survey. This paper discusses the metrics methodology, processes, and applications that are growing in EOSDIS, the driving requirements and compelling events, and the future envisioned for metrics as an integral part of earth science data systems.
Kennedy Space Center, Space Shuttle Processing, and International Space Station Program Overview
NASA Technical Reports Server (NTRS)
Higginbotham, Scott Alan
2011-01-01
Topics include: International Space Station assembly sequence; Electrical power subsystem; Thermal control subsystem; Guidance, navigation and control; Command and data handling; Robotics; Human and robotic integration; Additional modes of re-supply; NASA and International partner control centers; Space Shuttle ground operations.
Status of availability of Mariner 9 (1971-051A) TV picture data
NASA Technical Reports Server (NTRS)
1973-01-01
The Mariner 9 TV data that are now available from the National Space Science Data Center are described. Included are the mission test video system pictures, image processing laboratory/reduced data records, mosaics, and journal articles.
Use of Archived Information by the United States National Data Center
NASA Astrophysics Data System (ADS)
Junek, W. N.; Pope, B. M.; Roman-Nieves, J. I.; VanDeMark, T. F.; Ichinose, G. A.; Poffenberger, A.; Woods, M. T.
2012-12-01
The United States National Data Center (US NDC) is responsible for monitoring international compliance with nuclear test ban treaties, acquiring data and data products from the International Data Center (IDC), and distributing data according to established policy. The archive of automated and reviewed event solutions residing at the US NDC is a valuable resource for assessing and improving the performance of signal detection, event formation, location, and discrimination algorithms. Numerous research initiatives are currently underway that are focused on optimizing these processes using historic waveform data and alphanumeric information. Identification of optimum station processing parameters is routinely performed through the analysis of archived waveform data. Station-specific detector tuning studies produce and compare receiver operating characteristics for multiple detector configurations (e.g., detector type, filter passband) to identify an optimum set of processing parameters with an acceptable false alarm rate. Large aftershock sequences can inundate automated phase association algorithms with numerous detections that are closely spaced in time, which increases the number of false and/or mixed associations in automated event solutions and increases analyst burden. Archived waveform data and alphanumeric information are being exploited to develop an aftershock processor that will construct association templates to assist the Global Association (GA) application, reduce the number of false and merged phase associations, and lessen analyst burden. Statistical models are being developed and evaluated for potential use by the GA application for identifying and rejecting unlikely preliminary event solutions. Other uses of archived data at the US NDC include improved event locations using empirical travel time corrections, and discrimination via a statistical framework known as the event classification matrix (ECM).
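As an illustration of the detector-tuning idea, the sketch below traces receiver operating characteristic points by sweeping a detection threshold over labeled archive segments; the scores and labels are synthetic stand-ins, not US NDC data, and a real study would repeat this per detector configuration (detector type, filter passband):

```python
# Hedged sketch: ROC points for one detector configuration from labeled segments.
import numpy as np

def roc_points(scores: np.ndarray, is_signal: np.ndarray, thresholds: np.ndarray):
    """Hit rate vs false-alarm rate for each candidate detection threshold."""
    pts = []
    for th in thresholds:
        detected = scores >= th
        hit = (detected & is_signal).sum() / max(is_signal.sum(), 1)
        fa = (detected & ~is_signal).sum() / max((~is_signal).sum(), 1)
        pts.append((fa, hit))
    return pts

rng = np.random.default_rng(1)
is_signal = rng.random(5000) < 0.1                      # synthetic ground truth
# One "configuration": an STA/LTA-like score, higher on true signals.
scores = rng.normal(loc=np.where(is_signal, 3.0, 1.0), scale=1.0)
for fa, hit in roc_points(scores, is_signal, np.linspace(0.0, 6.0, 7)):
    print(f"false-alarm rate {fa:.3f} -> hit rate {hit:.3f}")
```

Comparing these curves across configurations is what lets an analyst pick the parameter set with the best hit rate at an acceptable false-alarm rate.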
Application of participatory ergonomics to the redesign of the family-centered rounds process
Xie, Anping; Carayon, Pascale; Cox, Elizabeth D.; Cartmill, Randi; Li, Yaqiong; Wetterneck, Tosha B.; Kelly, Michelle M.
2015-01-01
Participatory ergonomics (PE) can promote the application of human factors and ergonomics (HFE) principles to healthcare system redesign. This study applied a PE approach to redesigning the family-centered rounds (FCR) process to improve family engagement. Various FCR stakeholders (e.g., patients and families, physicians, nurses, hospital management) were involved in different stages of the PE process. HFE principles were integrated in both the content (e.g., shared mental model, usability, workload consideration, systems approach) and process (e.g., top management commitment, stakeholder participation, communication and feedback, learning and training, project management) of FCR redesign. We describe activities of the PE process (e.g., formation and meetings of the redesign team, data collection activities, intervention development, intervention implementation) and present data on PE process evaluation. To demonstrate the value of PE-based FCR redesign, future research should document its impact on FCR process measures (e.g., family engagement, round efficiency) and patient outcome measures (e.g., patient satisfaction). PMID:25777042
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... Southeast Data, Assessment and Review (SEDAR) process, a multi-step method for determining the status of... Center. Participants include: data collectors and database managers; stock assessment scientists...
Comparing the quality of preconception care provided in healthcare centers in Mashhad in 2012.
Sardasht, Fatemeh Ghaffari; Shourab, Nahid Jahani; Jafarnejad, Farzaneh; Esmaily, Habibollah
2015-01-01
Improving the quality of healthcare services is considered the main strategy for improving maternal and neonatal health outcomes. Providing appropriate healthcare for mothers and their newborn children is facilitated significantly by considering the mothers' health and welfare before pregnancy occurs. Therefore, the aim of this study was to compare the quality of preconception care provided to women of reproductive age at five health centers in Mashhad in 2012 and 2013. Multi-stage sampling was used to select the participants in this descriptive study. As a result, 360 women of reproductive age and 39 healthcare providers from 24 healthcare centers in Mashhad were selected to participate. The data-gathering tool was a checklist based on the Donabedian model that includes the three dimensions of structure, process, and outcome. The data were analyzed with SPSS software (version 11.5) using Kruskal-Wallis tests, ANOVA, and Spearman rank correlation. The results showed that preconception care at the 24 healthcare centers had essentially the same structural conditions, but in the process and outcome components the quality of preconception care at the five health centers was significantly different (p=0.008). The highest quality of care processes was identified at health center number 3. The difference among the five health centers in the outcome component of follow-up by healthcare providers was statistically significant (p=0.000); however, there were no significant differences in the satisfaction and awareness of the women who participated at the five health centers. The results showed that the performance of health personnel in providing preconception care and follow-up care was not satisfactory.
NASA Astrophysics Data System (ADS)
Plank, G.; Slater, D.; Torrisi, J.; Presser, R.; Williams, M.; Smith, K. D.
2012-12-01
The Nevada Seismological Laboratory (NSL) manages time-series data and high-throughput IP telemetry for the National Center for Nuclear Security (NCNS) Source Physics Experiment (SPE), underway on the Nevada National Security Site (NNSS). During active-source experiments, SPE's heterogeneous systems record over 350 channels of a variety of data types including seismic, infrasound, acoustic, and electro-magnetic. During the interim periods, broadband and short-period instruments record approximately 200 channels of continuous, high-sample-rate seismic data. Frequent changes in sensor and station configurations create a challenging meta-data environment. Meta-data account for complete operational histories, including sensor types, serial numbers, gains, sample rates, orientations, instrument responses, data-logger types, etc. To date, these catalogue 217 stations, over 40 different sensor types, and over 1000 unique recording configurations (epochs). Facilities for processing, backup, and distribution of time-series data currently span four Linux servers, 60 TB of disk capacity, and two data centers. Bandwidth, physical security, and redundant power and cooling systems for acquisition, processing, and backup servers are provided by NSL's Reno data center. The Nevada System of Higher Education (NSHE) System Computer Services (SCS) in Las Vegas provides similar facilities for the distribution server. NSL staff handle setup, maintenance, and security of all data management systems. SPE PIs have remote access to meta-data, raw data, and CSS3.0 compilations via SSL-based transfers such as rsync or secure-copy, as well as shell access for data browsing and limited processing. Meta-data are continuously updated and posted on the Las Vegas distribution server as station histories are better understood and errors are corrected. Raw time series and refined CSS3.0 data compilations with standardized formats are transferred to the Las Vegas data server as available. For better data availability and station monitoring, SPE is beginning to leverage NSL's wide-area digital IP network with nine SPE stations and six Rock Valley area stations that stream continuous recordings in real time to the NSL Reno data center. These stations, in addition to eight regional legacy stations supported by National Security Technologies (NSTec), are integrated with NSL's regional monitoring network and constrain a high-quality local earthquake catalog for NNSS. The telemetered stations provide critical capabilities for SPE, and infrastructure for earthquake response on NNSS as well as southern Nevada and the Las Vegas area.
Data Assembly and Processing for Operational Oceanography: 10 Years of Achievements
2009-07-20
...operational oceanography infrastructure. They provide data and products needed by modeling and data assimilation systems; they also provide products directly useable for applications. The paper will discuss the role and functions of the data centers for operational oceanography and describe some of their achievements.
2014-09-25
CAPE CANAVERAL, Fla. – Operations are underway to couple Florida East Coast Railway, or FEC, locomotives No. 433 and No. 428 on the track alongside the Indian River, north of Launch Complex 39 at NASA’s Kennedy Space Center in Florida. Kennedy's Center Planning and Development Directorate has enlisted the locomotives to support a Rail Vibration Test for the Canaveral Port Authority. The purpose of the test is to collect amplitude, frequency and vibration test data utilizing two Florida East Coast locomotives operating on KSC tracks to ensure that future railroad operations will not affect launch vehicle processing at the center. Buildings instrumented for the test include the Rotation Processing Surge Facility, Thermal Protection Systems Facility, Vehicle Assembly Building, Orbiter Processing Facility and Booster Fabrication Facility. Photo credit: NASA/Daniel Casper
2014-09-25
CAPE CANAVERAL, Fla. – Coupled Florida East Coast Railway, or FEC, locomotives No. 433 and No. 428 pass the Vehicle Assembly Building in Launch Complex 39 at NASA’s Kennedy Space Center in Florida on their way to NASA's Locomotive Maintenance Facility. Kennedy's Center Planning and Development Directorate has enlisted the locomotives to support a Rail Vibration Test for the Canaveral Port Authority. The purpose of the test is to collect amplitude, frequency and vibration test data utilizing two Florida East Coast locomotives operating on KSC tracks to ensure that future railroad operations will not affect launch vehicle processing at the center. Buildings instrumented for the test include the Rotation Processing Surge Facility, Thermal Protection Systems Facility, Vehicle Assembly Building, Orbiter Processing Facility and Booster Fabrication Facility. Photo credit: NASA/Daniel Casper
Review of LOGEX. Main Report and Appendixes A-I
1975-05-23
[Extraction fragment] ...had been developed on an RCA Spectra 70 machine located at the Army Logistics Management Center, Fort Lee, Virginia. The remainder of the fragment consists of an acronym glossary (ADP - Automatic Data Processing; ACT - Active Duty for Training; ALMC - US Army Logistics Management Center; AMO - Ammunition; AR - Army ...) and a personnel list (CPT McClellan and CPT Weaver, LOGEX Directorate; Mr. Loper and Mr. Ross, United States Army Logistics Management Center).
An Experimental Seismic Data and Parameter Exchange System for Tsunami Warning Systems
NASA Astrophysics Data System (ADS)
Hoffmann, T. L.; Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Heinloo, A.; Hoffmann, M.
2009-12-01
For several years, GFZ Potsdam has been operating a global earthquake monitoring system. Since the beginning of 2008, this system has also been used as an experimental seismic background data center for two different regional Tsunami Warning Systems (TWS): the IOTWS (Indian Ocean) and the interim NEAMTWS (NE Atlantic and Mediterranean). The SeisComP3 (SC3) software, developed within the GITEWS (German Indian Ocean Tsunami Early Warning System) project and capable of acquiring, archiving, and processing real-time data feeds, was extended for export and import of individual processing results within the two clusters of connected SC3 systems. Therefore, not only real-time waveform data are routed to the attached warning centers through GFZ but also processing results. While the current experimental NEAMTWS cluster consists of SC3 systems in six designated national warning centers in Europe, the IOTWS cluster presently includes seven centers, with another three likely to join in 2009/10. For NEAMTWS purposes, the GFZ virtual real-time seismic network (GEOFON Extended Virtual Network - GEVN) in Europe was substantially extended by adding many stations from Western European countries, optimizing the station distribution. In parallel to the data collection over the Internet, a GFZ VSAT hub for secured data collection of the EuroMED GEOFON and NEAMTWS backbone network stations became operational, and first data links were established through this backbone. For the Southeast Asia region, a VSAT hub was established in Jakarta in 2006, with some other partner networks connecting to this backbone via the Internet. Since its establishment, the experimental system has had the opportunity to prove its performance in a number of relevant earthquakes. Reliable solutions derived from a minimum of 25 stations were very promising in terms of speed. For important events, automatic alerts were released and disseminated by emails and SMS; manually verified solutions are added as soon as they become available. The results are also promising in terms of accuracy, since epicenter coordinates, depth, and magnitude estimates were sufficiently accurate from the very beginning and usually do not differ substantially from the final solutions. In summary, automatic seismic event processing has been shown to work well as a first step for starting a tsunami warning process. However, for the secured assessment of the tsunami potential of a given event, 24/7-manned regional TWCs are mandatory for reliable manual verification of the automatic seismic results. At this time, GFZ itself provides manual verification only when staff is available, not on a 24/7 basis, while the actual national tsunami warning centers all have a reliable 24/7 service.
Online Analysis Enhances Use of NASA Earth Science Data
NASA Technical Reports Server (NTRS)
Acker, James G.; Leptoukh, Gregory
2007-01-01
Giovanni, the Goddard Earth Sciences Data and Information Services Center (GES DISC) Interactive Online Visualization and Analysis Infrastructure, has provided researchers with advanced capabilities to perform data exploration and analysis with observational data from NASA Earth observation satellites. In the past 5-10 years, examining geophysical events and processes with remote-sensing data required a multistep process of data discovery, data acquisition, data management, and ultimately data analysis. Giovanni accelerates this process by enabling basic visualization and analysis directly on the World Wide Web. In the last two years, Giovanni has added new data acquisition functions and expanded analysis options to increase its usefulness to the Earth science research community.
Surface-water quality-assurance plan for the U.S. Geological Survey Washington Water Science Center
Mastin, Mark C.
2016-02-19
This Surface-Water Quality-Assurance Plan documents the standards, policies, and procedures used by the U.S. Geological Survey Washington Water Science Center (WAWSC) for activities related to the collection, processing, storage, analysis, and publication of surface-water data. This plan serves as a guide to all WAWSC personnel involved in surface-water data activities, and it changes as the needs and requirements of the WAWSC change. Regular updates to this plan represent an integral part of the quality-assurance process. In the WAWSC, direct oversight and responsibility by the hydrographer(s) assigned to a surface-water station, combined with team approaches in all work efforts, assure high-quality data, analyses, reviews, and reports for cooperating agencies and the public.
Demonstration of laser speckle system on burner liner cyclic rig
NASA Technical Reports Server (NTRS)
Stetson, K. A.
1986-01-01
A demonstration test was conducted to apply speckle photogrammetry to the measurement of strains on a sample of combustor liner material in a cyclic fatigue rig. A system for recording specklegrams was assembled and shipped to the NASA Lewis Research Center, where it was set up and operated during rig tests. Data in the form of recorded specklegrams were sent back to United Technologies Research Center for processing to extract strains. Difficulties were found in the form of warping and bowing of the sample during the tests which degraded the data. Steps were taken by NASA personnel to correct this problem and further tests were run. Final data processing indicated erratic patterns of strain on the burner liner sample.
The SAMPEX Data Center and User Interface for the Heliophysics Community
NASA Astrophysics Data System (ADS)
Davis, A. J.; Kanekal, S. G.; Looper, M. D.; Mazur, J. E.
2012-12-01
The Solar, Anomalous, Magnetospheric Particle Explorer (SAMPEX) was the first of NASA's Small Explorer (SMEX) series. SAMPEX was launched July 3, 1992 into a 520 by 670 km orbit at 82 degrees inclination. SAMPEX carries four instruments designed to study energetic particles of solar, interplanetary, and magnetospheric origin, as well as "anomalous" and galactic cosmic rays. As an outcome of the Senior Review process, the NASA SAMPEX science mission ended on June 30, 2004, leaving a 12-year continuous record of observations. (The spacecraft and instruments are still operating and returning science data under a partnership between NASA and the Aerospace Corporation). SAMPEX was launched before the development of the WWW and implementation of NASA's open data policy. This, and the complexity of the data analysis have made it difficult for the general community to make full use of the SAMPEX science data set. The SAMPEX Data Center remedies the situation. The data center set-up and operation was funded for 3 years by NASA, and it remains in operation. The goals of the data center are to enable community access to the full SAMPEX data set by developing an up-to-date, flexible web-based system, and to provide for the eventual permanent archiving of this version of the SAMPEX data set at the NSSDC. Knowledgeable members of the SAMPEX science team have prepared the data, and members of the ACE Science Center at Caltech are involved in maintaining the data distribution pipeline and user interface. The system is modeled in part on the ACE Science Center, but enhanced to accommodate the more-complex SAMPEX data set. We will describe the current status of the SAMPEX Data Center, the user interface, and the contents of the data that are available.
The SAMPEX Data Center and User Interface for the SEC Community
NASA Astrophysics Data System (ADS)
Davis, A. J.; Mason, G. M.; Walpole, P.; von Rosenvinge, T. T.; Looper, M. D.; Blake, J. B.; Mazur, J. E.; Stone, E. C.; Leske, R. A.; Labrador, A. W.; Mewaldt, R. A.; Kanekal, S. G.; Baker, D. N.; Li, X.; Klecker, B.
2005-05-01
The Solar, Anomalous, Magnetospheric Particle Explorer (SAMPEX) was the first of NASA's Small Explorer (SMEX) series. SAMPEX was launched July 3, 1992 into a 520 by 670 km orbit at 82 degrees inclination. SAMPEX carries four instruments designed to study energetic particles of solar, interplanetary, and magnetospheric origin, as well as "anomalous" and galactic cosmic rays. As an outcome of the Senior Review process, the NASA SAMPEX science mission ended on June 30, 2004, leaving a 12-year continuous record of observations. (The spacecraft and instruments are still operating and returning science data for a 1-year trial period under a partnership between NASA and the Aerospace Corporation). SAMPEX was launched before the development of the WWW and implementation of NASA's open data policy. This, and the complexity of the data analysis have made it difficult for the general community to make full use of the SAMPEX science data set. The SAMPEX Data Center will remedy the situation. The data center set-up and operation is funded for 3 years by NASA. The goals of the data center are to enable community access to the full SAMPEX data set by developing an up-to-date, flexible web-based system, and to provide for the eventual permanent archiving of this version of the SAMPEX data set at the NSSDC. Knowledgeable members of the SAMPEX science team are preparing the data, and members of the ACE Science Center at Caltech are involved in developing the data distribution pipeline and user interface. The system is modeled in part on the ACE Science Center, but enhanced to accommodate the more-complex SAMPEX data set. We will describe the current status of the SAMPEX Data Center development, the user interface, and the contents of the data that will be made available.
NASA Astrophysics Data System (ADS)
Boler, F.; Meertens, C.
2012-04-01
The UNAVCO Data Center in Boulder, Colorado, archives for preservation and distributes geodesy data and products in the GNSS, InSAR, and LiDAR domains to the scientific and education community. The GNSS data, which in addition to geodesy are useful for tectonic, volcanologic, ice mass, glacial isostatic adjustment, meteorological, and other studies, come from 2,500 continuously operating stations and 8000 survey-mode observation points around the globe that are operated by over 100 U.S. and international members of the UNAVCO consortium. The SAR data, which are in many ways complementary to the GNSS data collection, have been acquired in concert with the WInSAR Consortium activities and with EarthScope, with a focus on the western United States. UNAVCO also holds a growing collection of terrestrial laser scanning data. Several partner US geodesy data centers, along with UNAVCO, have developed and are in the process of implementing the Geodesy Seamless Archive Centers, a web-services-based technology to facilitate the exchange of metadata and delivery of data and products to users. These services utilize a repository layer implemented at each data center, and a service layer to identify and present any data center-specific services and capabilities, allowing simplified vertical federation of metadata from independent data centers. UNAVCO also has built web services for SAR data discovery and delivery, and will partner with other SAR data centers and institutions to provide access for the InSAR scientist to SAR data and ancillary data sets, web services to produce interferograms, and mechanisms to archive and distribute resulting higher-level products. Improved access to LiDAR data from space-based, airborne, and terrestrial platforms through utilization of web services is similarly currently under development. These efforts in cyberinfrastructure, while initially aimed at intra-domain data sharing and providing products for research and education, are envisioned as potentially serving as the basis for leveraging integrated access across a broad set of Earth science domains.
A Semi-Automated Workflow Solution for Data Set Publication
Vannan, Suresh; Beaty, Tammy W.; Cook, Robert B.; ...
2016-03-08
In order to address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOI) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products into an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and provides a summary of a workflow solution that ORNL DAAC research and technical staff have created to deal with publication of the diverse data products. Finally, the workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.
de Araújo, Juliana Sousa Soares; Regis, Cláudio Teixeira; Gomes, Renata Grigório Silva; Mourato, Felipe Alves; Mattos, Sandra da Silva
2016-12-01
To describe the incidence of congenital heart disease before and after the establishment of a telemedicine screening program at a reference center in Northeast Brazil. This is a descriptive, retrospective, and comparative study based on institutional data from a reference center in perinatology covering a 16-year period (2001-15). Data were divided into two periods: prior to (2001-11) and after (2012-15) the establishment of the telemedicine screening program. After the implementation of the screening process, almost all kinds of heart disease showed a significant increase in incidence (p < 0.05), and the incidence of major heart diseases approached that reported in developed regions. The implementation of a screening process model for congenital heart diseases can change the context for patients with congenital heart disease in poor regions.
MODIS land data at the EROS data center DAAC
Jenkerson, Calli B.; Reed, B.C.
2001-01-01
The US Geological Survey's (USGS) Earth Resources Observation Systems (EROS) Data Center (EDC) in Sioux Falls, SD, USA, is the primary national archive for land processes data and one of the National Aeronautics and Space Administration's (NASA) Distributed Active Archive Centers (DAAC) for the Earth Observing System (EOS). One of EDC's functions as a DAAC is the archival and distribution of Moderate Resolution Imaging Spectroradiometer (MODIS) land data collected from the EOS satellite Terra. More than 500,000 publicly available MODIS land data granules totaling 25 terabytes (TB) are currently stored in the EDC archive. This collection is managed, archived, and distributed by the EOS Data and Information System (EOSDIS) Core System (ECS) at EDC. EDC User Services supports the use of MODIS land data, which include land surface reflectance/albedo, temperature/emissivity, vegetation characteristics, and land cover, by responding to user inquiries, constructing user information sites on the EDC web page, and presenting MODIS materials worldwide.
GPS Position Time Series @ JPL
NASA Technical Reports Server (NTRS)
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis and post-processing are driven by different users:
- JPL Global Time Series/Velocities - researchers studying the reference frame, combining with VLBI/SLR/DORIS
- JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, and ground water studies
- ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused
- ARIA data system designed to integrate GPS and InSAR - GPS tropospheric delay used for correcting InSAR; Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR
(Zhen Liu is talking tomorrow on InSAR time series analysis.)
NASA Astrophysics Data System (ADS)
Berukoff, Steven; Reardon, Kevin; Hays, Tony; Spiess, DJ; Watson, Fraser
2015-08-01
When construction is complete in 2019, the Daniel K. Inouye Solar Telescope will be the most capable large-aperture, high-resolution, multi-instrument solar physics facility in the world. The telescope is designed as a four-meter off-axis Gregorian, with a rotating Coude laboratory designed to simultaneously house and support five first-light imaging and spectropolarimetric instruments. At current design, the facility and its instruments will generate data volumes of 5 PB, produce 10^8 images, and 10^7-10^9 metadata elements annually. This data will not only forge new understanding of solar phenomena at high resolution, but enhance participation in solar physics and further grow a small but vibrant international community. The DKIST Data Center is being designed to store, curate, and process this flood of information, while augmenting its value by providing association of science data and metadata to its acquisition and processing provenance. In early Operations, the Data Center will produce, by autonomous, semi-automatic, and manual means, quality-controlled and -assured calibrated data sets, closely linked to facility and instrument performance during the Operations lifecycle. These data sets will be made available to the community openly and freely, and software and algorithms made available through community repositories like Github for further collaboration and improvement. We discuss the current design and approach of the DKIST Data Center, describing the development cycle, early technology analysis and prototyping, and the roadmap ahead. In this budget-conscious era, a key design criterion is elasticity, the ability of the built system to adapt to changing work volumes, types, and the shifting scientific landscape, without undue cost or operational impact. We discuss our deep iterative development approach, the underappreciated challenges of calibrating ground-based solar data, the crucial integration of the Data Center within the larger Operations lifecycle, and how software and hardware support, intelligently deployed, will enable high-caliber solar physics research and community growth for the DKIST's 40-year lifespan.
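For a sense of scale, the annual figures quoted above (5 PB and 10^8 images per year) imply the following sustained rates, as a back-of-envelope calculation:

```python
# Back-of-envelope rates implied by the DKIST figures quoted above.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
volume_per_year_bytes = 5.0e15   # 5 PB/year
images_per_year = 1e8

ingest_gbps = volume_per_year_bytes * 8 / SECONDS_PER_YEAR / 1e9
print(f"sustained ingest ~{ingest_gbps:.2f} Gbit/s")                      # ~1.27 Gbit/s
print(f"~{images_per_year / SECONDS_PER_YEAR:.1f} images/s on average")   # ~3.2 images/s
print(f"~{volume_per_year_bytes / images_per_year / 1e6:.0f} MB/image")   # ~50 MB/image
```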
Examination of Data Accession at the National Snow and Ice Data Center
NASA Astrophysics Data System (ADS)
Scott, D. J.; Booker, L.
2017-12-01
The National Snow and Ice Data Center (NSIDC) stewards nearly 750 publicly available snow and ice data sets that support research into our world's frozen realms. NSIDC data management is primarily supported by the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF) and the National Oceanic and Atmospheric Administration (NOAA), and most of the data we archive and distribute is assigned to NSIDC through the funding agency programs. In addition to these mandates, NSIDC has historically offered data stewardship to researchers wanting to properly preserve and increase visibility of their research data under our primary programs (NASA, NSF, NOAA). With publishers now requiring researchers to deliver data to a repository prior to the publication of their data-related papers, we have seen an increase in researcher-initiated data accession requests. This increase is pushing us to reexamine our process to ensure timeliness in the acquisition and release of these data. In this presentation, we will discuss the support and value a researcher receives by submitting data to a trustworthy repository. We will examine NSIDC's data accession practices, and the challenges of a consistent process across NSIDC's multiple funding sponsors. Finally, we will share recent activities related to improving our process and ideas we have for enhancing the overall data accession experience.
Tools for Interdisciplinary Data Assimilation and Sharing in Support of Hydrologic Science
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Walker, J.; Suftin, I.; Warren, M.; Kunicki, T.
2013-12-01
Information consumed and produced in hydrologic analyses is interdisciplinary and massive. These factors put a heavy information management burden on the hydrologic science community. The U.S. Geological Survey (USGS) Office of Water Information Center for Integrated Data Analytics (CIDA) seeks to assist hydrologic science investigators with all components of their scientific data management life cycle. Ongoing data publication and software development projects will be presented, demonstrating publicly available data access services and manipulation tools being developed with support from two Department of the Interior initiatives. The USGS-led National Water Census seeks to provide both data and tools in support of nationally consistent water availability estimates. Newly available data include national coverages of radar-indicated precipitation, actual evapotranspiration, water use estimates aggregated by county, and Southeast region estimates of streamflow for 12-digit hydrologic unit code watersheds. Web services making these data available and applications to access them will be demonstrated. Web-available processing services able to provide numerous streamflow statistics for any USGS daily flow record or model result time series, and other National Water Census processing tools, will also be demonstrated. The National Climate Change and Wildlife Science Center is a USGS center leading DOI-funded academic global change adaptation research. It has a mission goal to ensure that data used and produced by funded projects are available via web services and tools that streamline data management tasks in interdisciplinary science. For example, collections of downscaled climate projections, typically large collections of files that must be downloaded to be accessed, are being published using web services that allow access to the entire dataset via simple web-service requests and numerous processing tools. Recent progress on this front includes data web services for downscaled climate projections based on the Coupled Model Intercomparison Project Phase 5, EPA's Integrated Climate and Land Use Scenarios projections of population and land cover metrics, and MODIS-derived land cover parameters from NASA's Land Processes Distributed Active Archive Center. These new services, and ways to discover others, will be presented through demonstration of a recently open-sourced project from a web application or scripted workflow. Development and public deployment of server-based processing tools to subset and summarize these and other data is ongoing at the CIDA with partner groups such as 52°North and Unidata. The latest progress on subsetting, spatial summarization to areas of interest, and temporal summarization via common statistical methods will be presented.
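As an example of consuming the kind of web-available streamflow service described above, the sketch below pulls a year of daily discharge from the USGS NWIS daily-values endpoint and computes two simple statistics. The endpoint, parameter code, site number, and JSON layout are assumptions to verify against current service documentation, not details given in the abstract:

```python
# Hedged sketch of a streamflow-statistics client against USGS NWIS daily values.
import requests

def daily_discharge(site: str, start: str, end: str) -> list[float]:
    resp = requests.get(
        "https://waterservices.usgs.gov/nwis/dv/",   # assumed endpoint
        params={"format": "json", "sites": site, "startDT": start,
                "endDT": end, "parameterCd": "00060"},  # 00060 = discharge (cfs), assumed
        timeout=30,
    )
    resp.raise_for_status()
    series = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
    return [float(v["value"]) for v in series]

flows = daily_discharge("06730500", "2012-01-01", "2012-12-31")  # hypothetical site
low7 = min(sum(flows[i:i + 7]) / 7 for i in range(len(flows) - 6))  # 7-day low flow
median = sorted(flows)[len(flows) // 2]
print(f"median daily flow {median:.0f} cfs, 7-day low flow {low7:.1f} cfs")
```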
NASA Astrophysics Data System (ADS)
Meyer, D. J.; Gallo, K. P.
2009-12-01
The NASA Earth Observation System (EOS) is a long-term, interdisciplinary research mission to study global-scale processes that drive Earth systems. This includes a comprehensive data and information system to provide Earth science researchers with easy, affordable, and reliable access to the EOS and other Earth science data through the EOS Data and Information System (EOSDIS). Data products from EOS and other NASA Earth science missions are stored at Distributed Active Archive Centers (DAACs) to support interactive and interoperable retrieval and distribution of data products.

The Land Processes DAAC (LP DAAC), located at the US Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center, is one of the twelve EOSDIS data centers, providing both Earth science data and expertise, as well as a mechanism for interaction between EOS data investigators, data center specialists, and other EOS-related researchers. The primary mission of the LP DAAC is stewardship of land data products from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments on the Terra and Aqua observation platforms. The co-location of the LP DAAC at EROS strengthens the relationship between the EOSDIS and USGS Earth science activities, linking the basic research and technology development mission of NASA to the operational mission requirements of the USGS. This linkage, along with the USGS' role as steward of land science data such as the Landsat archive, will prove to be especially beneficial when extending both USGS and EOSDIS data records into the Decadal Survey era.

This presentation provides an overview of the evolution of LP DAAC efforts over the years to improve data discovery, retrieval, and preparation services, toward a future of integrated data interoperability between EOSDIS data centers and the data holdings of the USGS and its partner agencies. Historical developmental case studies are presented, including the MODIS Reprojection Tool (MRT), the scheduling of ASTER for emergency response, the inclusion of Landsat metadata in the EOS Clearinghouse (ECHO), and the distribution of a global digital elevation model (GDEM) developed from ASTER. A software re-use case study describes integrating the MRT and the USGS Global Visualization tool (GloVis) into the MRTWeb service, developed to provide on-the-fly reprojection and reformatting of MODIS land products. Current LP DAAC activities are presented, such as the Open Geospatial Consortium (OGC) services provided in support of NASA's Making Earth Science Data Records for Use in Research Environments (MEaSUREs) program. Near-term opportunities are discussed, such as the design and development of services in support of the soon-to-be completed on-line archive of all LP DAAC ASTER and MODIS data products. Finally, several case studies for future tools and services are explored, such as bringing algorithms to data centers, using the North American ASTER Land Emissivity Database as an example, as well as the potential for integrating data discovery and retrieval services for LP DAAC, Landsat, and USGS Long-Term Archive holdings.
The improved broadband Real-Time Seismic Network in Romania
NASA Astrophysics Data System (ADS)
Neagoe, C.; Ionescu, C.
2009-04-01
Starting in 2002, the National Institute for Earth Physics (NIEP) has developed its real-time digital seismic network. This network consists of 96 seismic stations, of which 48 broadband and short-period stations and two seismic arrays transmit in real time. The real-time seismic stations are equipped with Quanterra Q330 and K2 digitizers, broadband seismometers (STS2, CMG40T, CMG 3ESP, CMG3T), and Kinemetrics EpiSensor strong-motion sensors (+/- 2g). The SeedLink and AntelopeTM (installed on MARMOT) program packages are used for real-time (RT) data acquisition and exchange. Communication from the digital seismic stations to the National Data Center in Bucharest is assured by 5 providers (GPRS, VPN, satellite communication, radio leased line, and internet), which also provide back-up communication lines. The processing center runs BRTT's AntelopeTM 4.10 data acquisition and processing software on 2 workstations for real-time processing and post-processing. The Antelope Real-Time System also provides automatic event detection, arrival picking, event location, and magnitude calculation, with graphical display and reporting in near-real-time after a local or regional event has occurred. A system to collect macroseismic information over the internet, from which macroseismic intensity maps are generated, has also been implemented at the data center. In the near future, SeisComp3 data acquisition and processing software will be installed on a workstation at the data center; it will run in parallel with the Antelope software as a back-up. The present network will also be expanded: in the first half of 2009, NIEP will install 8 additional broadband stations on Romanian territory, which will likewise transmit to the data center in real time. The Romanian Seismic Network permanently exchanges real-time waveform data with IRIS, ORFEUS, and different European countries through the internet. In Romania, the magnitude and location of an earthquake are now available within a few minutes after the earthquake occurs. One of the greatest challenges in the near future is to provide shaking intensity maps and other ground motion parameters within 5 minutes post-event, on the Internet and in GIS-based format, in order to improve emergency response, public information, preparedness, and hazard mitigation.
Evolving Metadata in NASA Earth Science Data Systems
NASA Astrophysics Data System (ADS)
Mitchell, A.; Cechini, M. F.; Walter, J.
2011-12-01
NASA's Earth Observing System (EOS) is a coordinated series of satellites for long-term global observations. NASA's Earth Observing System Data and Information System (EOSDIS) is a petabyte-scale archive of environmental data that supports global climate change research by providing end-to-end services from EOS instrument data collection to science data processing to full access to EOS and other earth science data. On a daily basis, the EOSDIS ingests, processes, archives and distributes over 3 terabytes of data from NASA's Earth Science missions, representing over 3,500 data products across various science disciplines. EOSDIS is currently comprised of 12 discipline-specific data centers that are collocated with centers of science discipline expertise. Metadata is used in all aspects of NASA's Earth Science data lifecycle, from the initial measurement gathering to the accessing of data products. Missions use metadata in their science data products when describing information such as the instrument/sensor, operational plan, and geographic region. Acting as the curator of the data products, data centers employ metadata for preservation, access and manipulation of data. EOSDIS provides a centralized metadata repository called the Earth Observing System (EOS) ClearingHouse (ECHO) for data discovery and access via a service-oriented architecture (SOA) between data centers and science data users. ECHO receives inventory metadata from data centers, which generate metadata files that comply with the ECHO Metadata Model. NASA's Earth Science Data and Information System (ESDIS) Project established a Tiger Team to study and make recommendations regarding the adoption of the international metadata standard ISO 19115 in EOSDIS. The result was a technical report recommending an evolution of NASA data systems towards a consistent application of ISO 19115 and related standards, including the creation of a NASA-specific convention for core ISO 19115 elements. Part of NASA's effort to continually evolve its data systems led ECHO to enhance the method by which it receives inventory metadata from the data centers, to allow for multiple metadata formats including ISO 19115. ECHO's metadata model will also be mapped to the NASA-specific convention for ingesting science metadata into the ECHO system. As NASA's new Earth Science missions and data centers migrate to the ISO 19115 standards, EOSDIS is developing metadata management resources to assist in the reading, writing and parsing of ISO 19115-compliant metadata. To foster interoperability with other agencies and international partners, NASA is working to ensure that a common ISO 19115 convention is developed, enhancing data sharing capabilities and other data analysis initiatives. NASA is also investigating the use of ISO 19115 standards to encode data quality, lineage and provenance with stored values. A common metadata standard across NASA's Earth Science data systems promotes interoperability, enhances data utilization and removes levels of uncertainty found in data products.
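For a concrete sense of what handling ISO 19115 records involves, the sketch below pulls a file identifier out of an ISO 19115 XML document using the standard gmd/gco namespaces; the sample record is invented for illustration and is not an ECHO or EOSDIS metadata file.

```python
# Sketch: reading a common field from ISO 19115 metadata (sample XML is invented).
from lxml import etree

NS = {"gmd": "http://www.isotc211.org/2005/gmd",
      "gco": "http://www.isotc211.org/2005/gco"}

sample = b"""<gmd:MD_Metadata xmlns:gmd="http://www.isotc211.org/2005/gmd"
                              xmlns:gco="http://www.isotc211.org/2005/gco">
  <gmd:fileIdentifier>
    <gco:CharacterString>EXAMPLE-GRANULE-001</gco:CharacterString>
  </gmd:fileIdentifier>
</gmd:MD_Metadata>"""

root = etree.fromstring(sample)
ident = root.findtext("gmd:fileIdentifier/gco:CharacterString", namespaces=NS)
print(ident)  # -> EXAMPLE-GRANULE-001
```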
Implications of acceleration environments on scaling materials processing in space to production
NASA Technical Reports Server (NTRS)
Demel, Ken
1990-01-01
Some considerations regarding materials processing in space are covered from a commercial perspective. Key areas include power, proprietary data, operational requirements (including logistics), and the location of the center of gravity and control of that location with respect to materials processing payloads.
Methods for Evaluating Practice Change Toward a Patient-Centered Medical Home
Jaén, Carlos Roberto; Crabtree, Benjamin F.; Palmer, Raymond F.; Ferrer, Robert L.; Nutting, Paul A.; Miller, William L.; Stewart, Elizabeth E.; Wood, Robert; Davila, Marivel; Stange, Kurt C.
2010-01-01
PURPOSE Understanding the transformation of primary care practices to patient-centered medical homes (PCMHs) requires making sense of the change process, multilevel outcomes, and context. We describe the methods used to evaluate the country’s first national demonstration project of the PCMH concept, with an emphasis on the quantitative measures and lessons for multimethod evaluation approaches. METHODS The National Demonstration Project (NDP) was a group-randomized clinical trial of facilitated and self-directed implementation strategies for the PCMH. An independent evaluation team developed an integrated package of quantitative and qualitative methods to evaluate the process and outcomes of the NDP for practices and patients. Data were collected by an ethnographic analyst and a research nurse who visited each practice, and from multiple data sources including a medical record audit, patient and staff surveys, direct observation, interviews, and text review. Analyses aimed to provide real-time feedback to the NDP implementation team and lessons that would be transferable to the larger practice, policy, education, and research communities. RESULTS Real-time analyses and feedback appeared to be helpful to the facilitators. Medical record audits provided data on process-of-care outcomes. Patient surveys contributed important information about patient-rated primary care attributes and patient-centered outcomes. Clinician and staff surveys provided important practice experience and organizational data. Ethnographic observations supplied insights about the process of practice development. Most practices were not able to provide detailed financial information. CONCLUSIONS A multimethod approach is challenging, but feasible and vital to understanding the process and outcome of a practice development process. Additional longitudinal follow-up of NDP practices and their patients is needed. PMID:20530398
NASA Astrophysics Data System (ADS)
Xi, Lei; Guo, Wei; Che, Yinchao; Zhang, Hao; Wang, Qiang; Ma, Xinming
To solve problems in detecting the origin of agricultural products, this paper presents an embedded data terminal, applies middleware concepts, and provides a reusable long-range two-way data exchange module between business equipment and data acquisition systems. The system is composed of data collection nodes and data center nodes. Data collection nodes take the embedded data terminal NetBoxII as their core and consist of a data acquisition interface layer, a control information layer and a data exchange layer; they read data from different front-end acquisition devices and pack the data into TCP messages to realize data exchange with the data center nodes over the available physical link (GPRS/CDMA/Ethernet). A data center node consists of a data exchange layer, a data persistence layer and a business interface layer, which make the collected data durable and provide standardized data to business systems based on a mapping between collected data and business data. Relying on public communication networks, the system establishes a flow of information between the certification scene at the place of origin and the management center, realizing real-time collection, storage and processing between origin-certification scene data and the certification organization's databases, and meeting the needs of long-range detection of agricultural origin.
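The paper's data exchange layer packs readings into TCP messages before forwarding them to the data center; the fragment below is a generic sketch of that pattern. The frame layout, host and port are invented for illustration, not taken from the paper.

```python
# Sketch of a collection node's exchange layer: pack one reading into a fixed
# binary frame and send it over TCP. Frame layout and endpoint are invented.
import socket
import struct
import time

def send_reading(node_id: int, value: float,
                 host: str = "127.0.0.1", port: int = 9000) -> None:
    # "!Idf" = network byte order: uint32 node id, float64 timestamp, float32 value.
    frame = struct.pack("!Idf", node_id, time.time(), value)
    with socket.create_connection((host, port)) as sock:
        sock.sendall(frame)

send_reading(node_id=17, value=23.4)  # e.g., one probe reading at the origin site
```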
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheppy, M.; Lobato, C.; Van Geet, O.
2011-12-01
This publication details the design, implementation strategies, and continuous performance monitoring of NREL's Research Support Facility data center. Data centers are energy-intensive spaces that facilitate the transmission, receipt, processing, and storage of digital data. These spaces require redundancies in power and storage, as well as infrastructure, to cool computing equipment and manage the resulting waste heat (Tschudi, Xu, Sartor, and Stein, 2003). Data center spaces can consume more than 100 times the energy of standard office spaces (Van Geet 2011). The U.S. Environmental Protection Agency (EPA) reported that data centers used 61 billion kilowatt-hours (kWh) in 2006, which was 1.5% of the total electricity consumption in the U.S. (U.S. EPA, 2007). Worldwide, data centers now consume more energy annually than Sweden (New York Times, 2009). Given their high energy consumption and conventional operation practices, there is a potential for huge energy savings in data centers. The National Renewable Energy Laboratory (NREL) is world renowned for its commitment to green building construction. In June 2010, the laboratory finished construction of a 220,000-square-foot (ft²), LEED Platinum Research Support Facility (RSF), which included a 1,900-ft² data center. The RSF will expand to 360,000 ft² with the opening of an additional wing in December 2011. The project's request for proposals (RFP) set a whole-building demand-side energy use requirement of a nominal 35 kBtu/ft² per year. On-site renewable energy generation will offset the annual energy consumption. To support the RSF's energy goals, NREL's new data center was designed to minimize its energy footprint without compromising service quality. Several implementation challenges emerged during the design, construction, and first 11 months of operation of the RSF data center. This document highlights these challenges and describes in detail how NREL successfully overcame them. The IT settings and strategies outlined in this document have been used to significantly reduce data center energy requirements in the RSF; however, these can also be used in existing buildings and retrofits.
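As a rough illustration of what the RFP's 35 kBtu/ft² per year target implies for the 220,000 ft² building, the snippet below converts it to annual kilowatt-hours; this is a plain unit conversion, not an NREL calculation.

```python
# Unit-conversion sketch for the RSF whole-building energy target.
AREA_FT2 = 220_000          # RSF floor area at June 2010 completion
TARGET_KBTU_PER_FT2 = 35    # nominal demand-side target from the RFP
BTU_PER_KWH = 3412.14       # standard conversion factor

annual_kbtu = AREA_FT2 * TARGET_KBTU_PER_FT2
annual_kwh = annual_kbtu * 1000 / BTU_PER_KWH
print(f"{annual_kwh:,.0f} kWh/year")  # about 2.3 million kWh per year
```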
A Process for Comparing Dynamics of Distributed Space Systems Simulations
NASA Technical Reports Server (NTRS)
Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.
2009-01-01
The paper describes a process that was developed for comparing the primary orbital dynamics behavior between distributed space systems simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.
A framework to monitor activities of satellite data processing in real-time
NASA Astrophysics Data System (ADS)
Nguyen, M. D.; Kryukov, A. P.
2018-01-01
Space Monitoring Data Center (SMDC) of SINP MSU is one of the several centers in the world that collect data on radiation conditions in near-Earth orbit from various Russian (Lomonosov, Electro-L1, Electro-L2, Meteor-M1, Meteor-M2, etc.) and foreign (GOES 13, GOES 15, ACE, SDO, etc.) satellites. The primary purposes of SMDC are: aggregating heterogeneous data from different sources; providing a unified interface for data retrieval, visualization, and analysis, as well as for development and testing of new space weather models; and controlling the correctness and completeness of data. Space weather models rely on data provided by SMDC to produce forecasts. Therefore, monitoring the whole data processing cycle is crucial for further success in modeling physical processes in near-Earth orbit based on the collected data. To solve this problem, we have developed a framework called Live Monitor at SMDC. Live Monitor allows watching all stages and program components involved in each data processing cycle. All activities of each stage are logged by Live Monitor and shown in real time on a web interface. When an error occurs, a notification message is sent to satellite operators via email and the Telegram messenger service so that they can take measures in time. Live Monitor's API can be used to create a customized monitoring service with minimum coding.
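A minimal version of the error-notification step described above might look like the following; the bot token and chat ID are placeholders, and this is a generic sketch of posting to the Telegram Bot API rather than SMDC's actual code.

```python
# Sketch: notify operators of a failed processing stage via Telegram.
# Token and chat ID are placeholders.
import requests

BOT_TOKEN = "<telegram-bot-token>"   # placeholder
CHAT_ID = "<operators-chat-id>"      # placeholder

def notify_failure(stage: str, error: str) -> None:
    text = f"[monitor] stage '{stage}' failed: {error}"
    requests.post(
        f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage",
        data={"chat_id": CHAT_ID, "text": text},
        timeout=10,
    )

notify_failure("ingest-electro-l2", "no new telemetry for 30 minutes")
```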
Ionospheric characteristics for archiving at the World Data Centers. Technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamache, R.R.; Reinisch, B.W.
1990-12-01
A database structure for archiving ionospheric characteristics at uneven data rates was developed at the July 1989 Ionospheric Informatics Working Group (IIWG) Lowell Workshop on Digital Ionogram Data Formats for World Data Center Archiving. This structure is proposed as a new URSI standard and is being employed by World Data Center A for solar-terrestrial physics for archiving characteristics. Here, the database has been slightly refined for the application, and programs have been written to generate these database files using as input Digisonde 256 ARTIST data post-processed by the ULCAR ADEP (ARTIST Data Editing Program) system. The characteristics program, as well as supplemental programs developed for this task, are described here. The new software will make it possible to archive the ionospheric characteristics from the Geophysics Laboratory high-latitude Digisonde network, the AWS DISS, the international Digisonde networks, and other ionospheric sounding networks.
The McDonald Observatory lunar laser ranging project
NASA Technical Reports Server (NTRS)
Silverberg, E. C.
1978-01-01
A summary of the activities of the McDonald lunar laser ranging station at Fort Davis for the FY 77-78 fiscal year is presented. The lunar laser experiment uses the observatory's 2.7-m reflecting telescope on a thrice-per-day, 21-day-per-lunation schedule. Data are recorded on magnetic tapes and sent to the University of Texas at Austin, where they are processed. After processing, the data are distributed to interested analysis centers and later to the National Space Science Data Center, where they are available for routine distribution. Detailed reports on McDonald operations are published after every fourth lunation, or approximately once every 115 days. These reports contain a day-by-day documentation of the ranging activity, detailed discussions of the equipment development efforts, and other information as needed to document and archive this important data type.
User Interface Models for Multidisciplinary Bibliographic Information Dissemination Centers.
ERIC Educational Resources Information Center
Zipperer, W. C.
Two information dissemination centers, at the University of California at Los Angeles and the University of Georgia, studied the interactions between computer-based search facilities and their users. The study, largely descriptive in nature, investigated the interaction processes between data base users and profile analysts or information specialists in…
Data Curation Education in Research Centers (DCERC)
NASA Astrophysics Data System (ADS)
Marlino, M. R.; Mayernik, M. S.; Kelly, K.; Allard, S.; Tenopir, C.; Palmer, C.; Varvel, V. E., Jr.
2012-12-01
Digital data both enable and constrain scientific research. Scientists are enabled by digital data to develop new research methods, utilize new data sources, and investigate new topics, but they also face new data collection, management, and preservation burdens. The current data workforce consists primarily of scientists who receive little formal training in data management and data managers who are typically educated through on-the-job training. The Data Curation Education in Research Centers (DCERC) program is investigating a new model for educating data professionals to contribute to scientific research. DCERC is a collaboration between the University of Illinois at Urbana-Champaign Graduate School of Library and Information Science, the University of Tennessee School of Information Sciences, and the National Center for Atmospheric Research. The program is organized around a foundations course in data curation and provides field experiences in research and data centers for both master's and doctoral students. This presentation will outline the aims and the structure of the DCERC program and discuss results and lessons learned from the first set of summer internships in 2012. Four master's students participated and worked with both data mentors and science mentors, gaining first-hand experience in the issues, methods, and challenges of scientific data curation. They engaged in a diverse set of topics, including climate model metadata, observational data management workflows, and data cleaning, documentation, and ingest processes within a data archive. The students learned current data management practices and challenges while developing expertise and conducting research. They also made important contributions to NCAR data and science teams by evaluating data management workflows and processes, preparing data sets to be archived, and developing recommendations for particular data management activities. The master's student interns will return in the summer of 2013, and two Ph.D. students will conduct data curation-related dissertation fieldwork during the 2013-2014 academic year.
Marbach-Ad, Gili; Hunt Rietschel, Carly
2016-01-01
In this study, we used a case study approach to obtain an in-depth understanding of the change process of two university instructors who were involved with redesigning a biology course. Given the hesitancy of many biology instructors to adopt evidence-based, learner-centered teaching methods, there is a critical need to understand how biology instructors transition from teacher-centered (i.e., lecture-based) instruction to teaching that focuses on the students. Using the innovation-decision model for change, we explored the motivation, decision-making, and reflective processes of the two instructors through two consecutive, large-enrollment biology course offerings. Our data reveal that the change process is somewhat unpredictable, requiring patience and persistence during inevitable challenges that arise for instructors and students. For example, the change process requires instructors to adopt a teacher-facilitator role as opposed to an expert role, to cover fewer course topics in greater depth, and to give students a degree of control over their own learning. Students must adjust to taking responsibility for their own learning, working collaboratively, and relinquishing the anonymity afforded by lecture-based teaching. We suggest implications for instructors wishing to change their teaching and administrators wishing to encourage adoption of learner-centered teaching at their institutions. PMID:27856550
Solid earth geophysics: Data services
NASA Astrophysics Data System (ADS)
1987-01-01
The National Oceanic and Atmospheric Administration (NOAA) collects, manages, and disseminates many kinds of scientific data that result from the inquiry into the environment. The National Geophysical Data Center (NGDC), one of the several data-management centers of NOAA, is responsible for data activities in the fields of seismology, gravity, topography, geomagnetism, geothermics, marine geology and geophysics, and solar-terrestrial physics. The pamphlet briefly describes the principal products and services NGDC provides through its Solid Earth (SEG) division. Among the most important activities of SEG are acquiring and archiving data, processing and formatting data into standard sets, developing useful data products for customers, and advertising and disseminating data to the scientific, academic, and industrial communities.
Pen-based computers: Computers without keys
NASA Technical Reports Server (NTRS)
Conklin, Cheryl L.
1994-01-01
The National Space Transportation System (NSTS) comprises many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as pen-based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.
Hawk, Ernest T; Habermann, Elizabeth B; Ford, Jean G; Wenzel, Jennifer A; Brahmer, Julie R; Chen, Moon S; Jones, Lovell A; Hurd, Thelma C; Rogers, Lisa M; Nguyen, Lynne H; Ahluwalia, Jasjit S; Fouad, Mona; Vickers, Selwyn M
2014-04-01
To ensure that National Institutes of Health-funded research is relevant to the population's needs, specific emphasis on proportional representation of minority/sex groups into National Cancer Institute (NCI) cancer centers' clinical research programs is reported to the NCI. EMPaCT investigators at 5 regionally diverse comprehensive cancer centers compared data reported to the NCI for their most recent Cancer Center Support Grant competitive renewal to assess and compare the centers' catchment area designations, data definitions, data elements, collection processes, reporting, and performance regarding proportional representation of race/ethnicity and sex subsets. Cancer centers' catchment area definitions differed widely in terms of their cancer patient versus general population specificity, levels of specificity, and geographic coverage. Racial/ethnic categories were similar, yet were defined differently, across institutions. Patients' socioeconomic status and insurance status were inconsistently captured across the 5 centers. Catchment area definitions and the collection of patient-level demographic factors varied widely across the 5 comprehensive cancer centers. This challenged the assessment of success by cancer centers in accruing representative populations into the cancer research enterprise. Accrual of minorities was less than desired for at least 1 racial/ethnic subcategory at 4 of the 5 centers. Institutions should clearly and consistently declare their primary catchment area and the rationale and should report how race/ethnicity and sex are defined, determined, collected, and reported. More standardized, frequent, consistent collection, reporting, and review of these data are recommended, as is a commitment to collecting socioeconomic data, given that socioeconomic status is a primary driver of cancer disparities in the United States. © 2014 American Cancer Society.
Toolsets for Airborne Data (TAD): Improving Machine Readability for ICARTT Data Files
NASA Technical Reports Server (NTRS)
Early, Amanda Benson; Beach, Aubrey; Northup, Emily; Wang, Dali; Kusterer, John; Quam, Brandi; Chen, Gao
2015-01-01
The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for the ingest, archive, and distribution of NASA Earth Science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The ASDC specializes in atmospheric data that is important to understanding the causes and processes of global climate change and the consequences of human activities on the climate. The ASDC currently supports more than 44 projects and has over 1,700 archived data sets, which increase daily. ASDC customers include scientists, researchers, federal, state, and local governments, academia, industry, and application users, the remote sensing community, and the general public.
2014-06-01
and Coastal Data Information Program (CDIP). This User's Guide includes step-by-step instructions for accessing the GLOS/GLCFS database via WaveNet...access, processing and analysis tool; part 3 – CDIP database. ERDC/CHL CHETN-xx-14. Vicksburg, MS: U.S. Army Engineer Research and Development Center
NASA Technical Reports Server (NTRS)
Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill
1992-01-01
The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.
WFIRST Science Operations at STScI
NASA Astrophysics Data System (ADS)
Gilbert, Karoline; STScI WFIRST Team
2018-06-01
With sensitivity and resolution comparable to those of the Hubble Space Telescope, and a field of view 100 times larger, the Wide Field Instrument (WFI) on WFIRST will be a powerful survey instrument. STScI will be the Science Operations Center (SOC) for the WFIRST mission, with additional science support provided by the Infrared Processing and Analysis Center (IPAC) and foreign partners. STScI will schedule and archive all WFIRST observations, calibrate and produce pipeline-reduced data products for imaging with the Wide Field Instrument, support the High Latitude Imaging and Supernova Survey Teams, and support the astronomical community in planning WFI imaging observations and analyzing the data. STScI has developed detailed concepts for WFIRST operations, including a data management system integrating data processing and the archive, which will include a novel, cloud-based framework for high-level data processing, providing a common environment accessible to all users (STScI operations, Survey Teams, General Observers, and archival investigators). To aid the astronomical community in examining the capabilities of WFIRST, STScI has built several simulation tools. We describe the functionality of each tool and give examples of its use.
Hajiebrahimi, Zahra; Mahmoodi, Ghahraman; Abedi, Ghasem
2017-01-01
Health-care service processes need to be assessed over time. We aimed to assess the breast cancer care process in the primary health system of Golestan Province, North Iran. To perform a descriptive cross-sectional study, information on breast cancer care processes in the primary health-care system was collected using a "collecting form" from 234 health houses, 29 health posts, 44 urban health centers, and 80 rural health centers in Golestan Province. Data registered at the centers and patients' journals were used in data collection. Moreover, we collected data on all women who were diagnosed with breast cancer in 2014 to determine the characteristics of the patients. Around 50% of health workers in rural or urban areas were trained on breast cancer, while 2% of women from the general population in rural areas and around 6% in urban areas had been trained on breast cancer. The mean age of women diagnosed with breast cancer was 48 ± 10 years, and 40.2% of them were affected at ages between 43 and 52 years. The results showed that 18.9% of women had received their information through self-study before the diagnosis of breast cancer, while 53.8% received their information from private clinics after diagnosis. The process of breast cancer care in Golestan Province needs to be improved at the primary health-care level. Both inter- and multi-disciplinary activities are needed.
Easy access to geophysical data sets at the IRIS Data Management Center
NASA Astrophysics Data System (ADS)
Trabant, C.; Ahern, T.; Suleiman, Y.; Karstens, R.; Weertman, B.
2012-04-01
At the IRIS Data Management Center (DMC) we primarily manage seismological data, but we also hold other geophysical data sets for related fields, including atmospheric pressure and gravity measurements, as well as higher-level data products derived from raw data. With a few exceptions, all data managed by the IRIS DMC are openly available, and we serve an international research audience. These data are available via a number of different mechanisms: batch requests submitted through email, web interfaces, near-real-time streams and, more recently, web services. Our initial suite of web services offers access to almost all of the raw data and associated metadata managed at the DMC. In addition, we offer services that apply processing to the data before they are sent to the user. Web service technologies are ubiquitous, with support available in nearly every programming language and operating system. By their nature web services are programmatic interfaces, but by choosing a simple subset of web service methods we make our data available to a very broad user base; these interfaces are usable by professional developers as well as non-programmers. Whenever possible we chose open and recognized standards. The data returned to the user come in a variety of formats depending on type, including FDSN SEED, QuakeML, StationXML, ASCII, PNG images and, in some cases where no appropriate standard could be found, a customized XML format. To promote easy access to seismological data for all researchers, we are coordinating with international partners to define web service interface standards. Additionally, we are working with key partners in Europe to complete the initial implementation of these services. Once a standard has been adopted and implemented at multiple data centers, researchers will be able to use the same request tools to access data across multiple data centers. The web services that apply on-demand processing to requested data include the capability to apply instrument corrections and format translations, which ultimately allows more researchers to use the data without knowledge of specific data and metadata formats. In addition to serving as a new platform on top of which research scientists will build advanced processing tools, we anticipate that they will result in more data being accessible by more users.
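Programmatic access of the kind these services enable is commonly scripted through FDSN web service clients; the sketch below uses ObsPy's FDSN client against the IRIS endpoint, with the station and time window chosen arbitrarily for illustration.

```python
# Sketch: request an hour of waveform data from IRIS FDSN web services and
# apply the instrument correction (station choice is arbitrary).
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t0 = UTCDateTime("2012-01-01T00:00:00")
st = client.get_waveforms(network="IU", station="ANMO", location="00",
                          channel="BHZ", starttime=t0, endtime=t0 + 3600,
                          attach_response=True)
st.remove_response(output="VEL")  # on-demand instrument correction
print(st)
```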
78 FR 32255 - HHS-Operated Risk Adjustment Data Validation Stakeholder Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-29
...-Operated Risk Adjustment Data Validation Stakeholder Meeting AGENCY: Centers for Medicare & Medicaid... Act HHS-operated risk adjustment data validation process. The purpose of this public meeting is to... interested parties about key HHS policy considerations pertaining to the HHS-operated risk adjustment data...
Impact assessment of GPS radio occultation data on Antarctic analysis and forecast using WRF 3DVAR
NASA Astrophysics Data System (ADS)
Zhang, H.; Wee, T. K.; Liu, Z.; Lin, H. C.; Kuo, Y. H.
2016-12-01
This study assesses the impact of Global Positioning System (GPS) Radio Occultation (RO) refractivity data on analyses and forecasts in the Antarctic region. The RO data are continuously assimilated into the Weather Research and Forecasting (WRF) Model using WRF 3DVAR, along with the other observations that were operationally available to the National Centers for Environmental Prediction (NCEP) during a one-month period, October 2010, including Advanced Microwave Sounding Unit (AMSU) radiance data. For the month-long data assimilation experiments, three RO datasets are used: 1) the actual operational dataset, which was produced by the near-real-time RO processing at the time and provided to weather forecasting centers; 2) a post-processed dataset with posterior clock and orbit estimates and improved RO processing algorithms; and 3) another post-processed dataset, produced with variational RO processing. The data impact is evaluated by comparing the forecasts and analyses to independent driftsonde observations made available through the Concordiasi field campaign, in addition to other traditional means of verification. A denial of RO data (while keeping all other observations) resulted in a remarkable degradation of analysis and forecast quality, indicating the high value of RO data over the Antarctic area. The post-processed RO data showed a significantly larger positive impact than the near-real-time data, due to extra RO data from the TerraSAR-X satellite (unavailable at the time of the near-real-time processing) as well as improved data quality resulting from the post-processing. This strongly suggests that the future polar constellation of COSMIC-2 is vital. The variational RO processing further reduced the systematic and random errors in both analyses and forecasts, for instance leading to a smaller background departure of AMSU radiance. This indicates that the variational RO processing provides an improved reference for the bias correction of satellite radiance, making the bias correction more effective. This study finds that advanced RO data processing algorithms may further enhance the high quality of RO data at high southern latitudes.
Global Change Data Center: Mission, Organization, Major Activities, and 2001 Highlights
NASA Technical Reports Server (NTRS)
Wharton, Stephen W. (Technical Monitor)
2002-01-01
Rapid, efficient access to Earth science data is fundamental to the Nation's efforts to understand the effects of global environmental changes and their implications for public policy. The challenge will only grow as data volumes increase further and missions with constellations of satellites start to appear: demands on data storage, data access, network throughput, processing power, and database and information management increase by orders of magnitude, while budgets remain constant or even shrink. The Global Change Data Center's (GCDC) mission is to provide systems, data products, and information management services to maximize the availability and utility of NASA's Earth science data. The specific objectives are (1) to support Earth science missions by developing and operating systems to generate, archive, and distribute data products and information; (2) to develop innovative information systems for processing, archiving, accessing, visualizing, and communicating Earth science data; and (3) to develop value-added products and services to promote broader utilization of NASA Earth Sciences Enterprise (ESE) data and information. The ultimate product of GCDC activities is access to data and information to support research, education, and public policy.
A proto-Data Processing Center for LISA
NASA Astrophysics Data System (ADS)
Cavet, Cécile; Petiteau, Antoine; Le Jeune, Maude; Plagnol, Eric; Marin-Martholaz, Etienne; Bayle, Jean-Baptiste
2017-05-01
The preparation of the LISA project requires studying and defining a new data analysis framework, capable of dealing with highly heterogeneous CPU needs and of exploiting emergent information technologies. In this context, a prototype of the mission's Data Processing Center (DPC) has been initiated. The DPC is designed to efficiently manage computing constraints and to offer a common infrastructure where the whole collaboration can contribute to development work. Several tools, such as continuous integration (CI), have already been delivered to the collaboration and are presently used for simulations and performance studies. This article presents the progress made on this collaborative environment and also discusses possible next steps towards an on-demand computing infrastructure. This activity is supported by CNES as part of the French contribution to LISA.
An AK-LDMeans algorithm based on image clustering
NASA Astrophysics Data System (ADS)
Chen, Huimin; Li, Xingwei; Zhang, Yongbin; Chen, Nan
2018-03-01
Clustering is an effective analytical technique for handling unlabeled data for value mining; its ultimate goal is to label unclassified data quickly and correctly. We use road maps from current image processing work as the experimental background. In this paper, we propose an AK-LDMeans algorithm that automatically locks the K value by designing a K-cost polyline, and then uses a long-distance, high-density method to select the cluster centers, replacing the traditional initial cluster center selection method and further improving the efficiency and accuracy of the traditional K-means algorithm. The experimental results are compared with those of current clustering algorithms. The algorithm can provide an effective reference in the fields of image processing, machine vision and data mining.
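The paper's K-cost curve for locking K is in the same spirit as the familiar elbow criterion; below is a generic sketch of that baseline idea using scikit-learn's KMeans, not an implementation of the authors' AK-LDMeans itself. The 15% threshold is an arbitrary illustrative choice.

```python
# Generic elbow-style selection of K (baseline illustration, not AK-LDMeans).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=500, centers=4, random_state=0)

ks = range(1, 10)
inertias = [KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_
            for k in ks]

# Stop at the K after which adding one more cluster improves inertia by < 15%.
rel_gain = -np.diff(inertias) / np.array(inertias[:-1])
k_best = next(k for k, g in zip(ks, rel_gain) if g < 0.15)
print("chosen K:", k_best)
```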
Distributed Processing with a Mainframe-Based Hospital Information System: A Generalized Solution
Kirby, J. David; Pickett, Michael P.; Boyarsky, M. William; Stead, William W.
1987-01-01
Over the last two years the Medical Center Information Systems Department at Duke University Medical Center has been developing a systematic approach to distributing the processing and data involved in computerized applications at DUMC. The resulting system has been named MAPS- the Micro-ADS Processing System. A key characteristic of MAPS is that it makes it easy to execute any existing mainframe ADS application with a request from a PC. This extends the functionality of the mainframe application set to the PC without compromising the maintainability of the PC or mainframe systems.
The Future is Hera: Analyzing Astronomical Data Over the Internet
NASA Astrophysics Data System (ADS)
Valencic, Lynne A.; Snowden, S.; Chai, P.; Shafer, R.
2009-01-01
Hera is the new data processing facility provided by the HEASARC at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the preinstalled software packages, local disk space, and computing resources needed to do general processing of FITS format data files residing on the user's local computer, and to do advanced research using the publicly available data from High Energy Astrophysics missions. Qualified students, educators, and researchers may freely use the Hera services over the internet for research and educational purposes.
NASA Technical Reports Server (NTRS)
1979-01-01
NASA computerized image processing techniques are an integral part of a cardiovascular data bank at Duke University Medical Center. Developed by Dr. C. F. Starmer and colleagues at Duke, the data bank documents the Center's clinical experience with more than 4,000 heart patients as an aid to diagnosis and treatment of heart disease. Data is stored in a computerized system that allows a physician to summon detailed records of former patients whose medical profiles are similar to those of a new patient. A video display (photo) and printed report shows prognostic information for the new patient based on similar past experience.
Design of a Mission Data Storage and Retrieval System for NASA Dryden Flight Research Center
NASA Technical Reports Server (NTRS)
Lux, Jessica; Downing, Bob; Sheldon, Jack
2007-01-01
The Western Aeronautical Test Range (WATR) at the NASA Dryden Flight Research Center (DFRC) employs the WATR Integrated Next Generation System (WINGS) for the processing and display of aeronautical flight data. This report discusses the post-mission segment of the WINGS architecture. A team designed and implemented a system for the near- and long-term storage and distribution of mission data for flight projects at DFRC, providing the user with intelligent access to data. Discussed are the legacy system, an industry survey, system operational concept, high-level system features, and initial design efforts.
Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A
2016-01-01
The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available in open source for use in combining portable batch scripting (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.
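Distributing jobs across a PBS-style grid, as DAX does, reduces at its core to generating a job script and submitting it with qsub; the sketch below shows that bare pattern with an invented job body. It is not the DAX package's actual interface.

```python
# Bare-bones PBS submission sketch (invented job body; not the DAX API).
import subprocess
import tempfile

job = """#!/bin/bash
#PBS -N xnat_preproc
#PBS -l nodes=1:ppn=2,walltime=01:00:00
python preprocess_scan.py --session "$SESSION_ID"
"""

with tempfile.NamedTemporaryFile("w", suffix=".pbs", delete=False) as f:
    f.write(job)
    script = f.name

# qsub prints the new job's ID on stdout.
job_id = subprocess.run(["qsub", "-v", "SESSION_ID=demo01", script],
                        capture_output=True, text=True, check=True).stdout.strip()
print("submitted", job_id)
```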
National Centers for Environmental Prediction
The Mesoscale Modeling Branch conducts a program of research and development in support of prediction. This research and development includes mesoscale four-dimensional data assimilation.
Effective trauma center partnerships to address firearm injury: a new paradigm.
Richmond, Therese S; Schwab, C William; Riely, Jeaneen; Branas, Charles C; Cheney, Rose; Dunfey, Maura
2004-06-01
Firearm violence is the second leading cause of injury-related death. This study examined the use of local trauma centers as lead organizations in their communities to address firearm injury. Three trauma centers in cities with populations less than 100,000 were linked with a university-based firearm injury research center. A trauma surgeon director and coordinator partnered with communities, recruited and directed advisory boards, established a local firearm injury surveillance system, and informed communities using community-specific profiles. Primary process and outcome measures included completeness of data, development of community-specific profiles, number of data-driven consumer media pieces, number of meetings to inform policy makers, and an analysis of problems encountered. Local trauma centers in smaller communities implemented a firearm injury surveillance system, produced community-specific injury profiles, and engaged community leaders and policy makers to address firearm injury. Community-specific profiles demonstrated consistent firearm suicide rates (6.58-6.82 per 100,000) but variation in firearm homicide rates (1.08-12.5 per 100,000) across sites. There were 63 data-driven media pieces and 18 forums to inform community leaders and policy makers. Completeness of data elements ranged from 57.1% to 100%. Problems experienced were disconnected data sources, multiple data owners, potential for political fallout, limited trauma center data, skills sets of medical professionals, and sustainability. Trauma centers, when provided resources and support, with the model described, can function as lead organizations in partnering with the community to acquire and use community-specific data for local firearm injury prevention.
TLALOCNet: A Continuous GPS-Met Array in Mexico for Seismotectonic and Atmospheric Research
NASA Astrophysics Data System (ADS)
Cabral-Cano, E.; Salazar-Tlaczani, L.; Galetzka, J.; DeMets, C.; Serra, Y. L.; Feaux, K.; Mattioli, G. S.; Miller, M. M.
2015-12-01
TLALOCNet is a network of continuous Global Positioning System (cGPS) and meteorological stations in Mexico for the interrogation of the earthquake cycle, tectonic processes, land subsidence, and atmospheric processes of Mexico. Once completed, TLALOCNet will span all of Mexico and will link existing GPS infrastructure in North America and the Caribbean, aiming towards creating a continuous, federated network of networks in the Americas. Phase 1 (2014-2015), funded by NSF and UNAM, is building and upgrading 30+ cGPS-Met sites to the high standard of the EarthScope Plate Boundary Observatory (PBO). Phase 2 (2016) will add ~25 more cGPS-Met stations to be funded through CONACyT. TLALOCNet provides open and freely available raw GPS data, GPS-PWV, surface meteorology measurements, time series of daily positions, as well as a station velocity field to support a broad range of geoscience investigations. This is accomplished through the development of the TLALOCNet data center (http://tlalocnet.udg.mx), which serves as a collection and distribution point. This data center is based on UNAVCO's Dataworks-GSAC software and can work as part of UNAVCO's seamless archive for discovery, sharing, and access to data. The TLALOCNet data center also contains contributed data from several regional networks in Mexico. By using the same protocols and structure as the UNAVCO and other COCONet regional data centers, the geodetic community has the capability of accessing data from a large number of scientific and academically operated Mexican GPS sites. This archive provides a fully queryable and scriptable GPS and meteorological data retrieval point. Additionally, real-time 1-Hz streams from selected TLALOCNet stations are available in BINEX, RTCM 2.3 and RTCM 3.1 formats via the Networked Transport of RTCM via Internet Protocol (NTRIP).
NASA Technical Reports Server (NTRS)
Kempler, Steve; Alcott, Gary; Lynnes, Chris; Leptoukh, Greg; Vollmer, Bruce; Berrick, Steve
2008-01-01
NASA Earth Sciences Division (ESD) has made great investments in the development and maintenance of data management systems and information technologies, to maximize the use of NASA generated Earth science data. With information management system infrastructure in place, mature and operational, very small delta costs are required to fully support data archival, processing, and data support services required by the recommended Decadal Study missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) and the reusability for these future missions. The GES DISC has developed a series of modular, reusable data management components currently in use. They include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive aka S4PA), data processing (S4 Processor for Measurements aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. Information management system components are based on atmospheric scientist inputs. Large development and maintenance cost savings can be realized through their reuse in future missions.
MABEL at IPAC: managing address books and email lists at the Infrared Processing and Analysis Center
NASA Astrophysics Data System (ADS)
Crane, Megan; Brinkworth, Carolyn; Gelino, Dawn; O'Leary, Ellen
2012-09-01
The Infrared Processing and Analysis Center (IPAC), located on the campus of the California Institute of Technology, is NASA's multi-mission data center for infrared astrophysics. Some of IPAC's services include administering data analysis funding awards to the astronomical community, organizing conferences and workshops, and soliciting and selecting fellowship and observing proposals. As most of these services are repeated annually or biannually, it becomes necessary to maintain multiple lists of email contacts associated with each service. MABEL is a PHP/MySQL web database application designed to facilitate this process. It serves as an address book containing up-to-date contact information for thousands of recipients. Recipients may be assigned to any number of email lists categorized by IPAC project and team. Lists may be public (viewable by all project members) or private (viewable only by team members). MABEL can also be used to send HTML or plain-text emails to multiple lists at once and prevents duplicate emails to a single recipient. This work was performed at the California Institute of Technology under contract to the National Aeronautics and Space Administration.
NASA Astrophysics Data System (ADS)
Caesarendra, W.; Kosasih, B.; Tjahjowidodo, T.; Ariyanto, M.; Daryl, LWQ; Pamungkas, D.
2018-04-01
Rapid and reliable information in slew bearing maintenance is not a trivial issue. This paper presents an online monitoring system to assist maintenance engineers in monitoring the condition of a low-speed slew bearing in a sheet metal company. The system passes the vibration information from the place where the bearing and accelerometer sensors are installed to the data center; from the data center, it can be accessed through the online monitoring website from any place and by any person. The online monitoring system is built using several programming languages, including C, MATLAB, PHP, HTML and CSS. Generally, the flow starts with automatic vibration data acquisition; features are then calculated from the acquired vibration data. These features are sent to the data center, and from the data center the vibration features can be viewed through the online monitoring website. This online monitoring system has been successfully applied in the School of Mechanical, Materials and Mechatronic Engineering, University of Wollongong.
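Typical condition-monitoring features of the kind computed before upload include simple time-domain statistics; the sketch below computes a few of them with NumPy/SciPy. The specific feature set is a common choice in bearing diagnostics, not necessarily the one used in the paper.

```python
# Sketch: time-domain features from a vibration record (feature choice is illustrative).
import numpy as np
from scipy import stats

def vibration_features(signal: np.ndarray) -> dict:
    rms = float(np.sqrt(np.mean(signal ** 2)))
    peak = float(np.max(np.abs(signal)))
    return {
        "rms": rms,
        "peak": peak,
        "crest_factor": peak / rms,
        "kurtosis": float(stats.kurtosis(signal)),  # sensitive to impulsive bearing faults
        "skewness": float(stats.skew(signal)),
    }

x = np.random.default_rng(0).normal(size=4096)  # stand-in for an accelerometer record
print(vibration_features(x))
```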
GDAL Enhancements for Interoperability with EOS Data (GEE)
NASA Astrophysics Data System (ADS)
Tisdale, B.
2015-12-01
Historically, Earth Observing Satellite (EOS) data products have been difficult to consume with GIS tools, whether commercial or open-source. This has resulted in reduced acceptance of these data products by the GIS and general user communities. Common problems and challenges experienced by these data users include difficulty when: consuming data products from NASA Distributed Active Archive Centers (DAACs) that pre-date modern application software with commercial and open-source geospatial tools; identifying an initial approach for developing a framework and plug-ins that interpret non-compliant data; defining a methodology that is extensible across NASA Earth Observing System Data and Information System (EOSDIS), scientific communities, and GIS communities by enabling other data centers to construct their own plug-ins and adjust specific data products; and promoting greater use of NASA data and new analysis utilizing GIS tools. To address these challenges and make EOS data products more accessible and interpretable by GIS applications, a collaborative approach has been taken that includes the NASA Langley Atmospheric Science Data Center (ASDC), Esri, George Mason University (GMU), and the Hierarchical Data Format (HDF) Group to create a framework and plugins to be applied to the Geospatial Data Abstraction Library (GDAL). This framework and its plugins offer advantages of extensibility within NASA EOSDIS, permitting other data centers to construct their own plugins necessary to adjust their data products. In this session, findings related to the framework and the development of GDAL plugins will be reviewed. Specifically, this session will offer a workshop to review documentation and training materials that have been generated for the purpose of guiding other NASA DAACs through the process of constructing plug-ins consistent with the framework, as well as a review of the certification process by which the plugins can be independently verified as properly converting the data to the format and content required for use in GIS software.
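Working with EOS products through GDAL's Python bindings typically starts from the subdatasets inside an HDF granule; the sketch below shows that generic pattern. The granule file name is a placeholder, and the plugin-specific behavior described above is not modeled here.

```python
# Sketch: enumerate subdatasets in an EOS HDF granule and reproject one with GDAL.
from osgeo import gdal

gdal.UseExceptions()
ds = gdal.Open("MOD11A1.A2015001.h10v05.006.hdf")  # placeholder granule name

for name, desc in ds.GetSubDatasets():
    print(name, "->", desc)

# Open the first science subdataset and warp it to geographic coordinates.
sub = gdal.Open(ds.GetSubDatasets()[0][0])
gdal.Warp("output_wgs84.tif", sub, dstSRS="EPSG:4326")
```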
Geocoded data structures and their applications to Earth science investigations
NASA Technical Reports Server (NTRS)
Goldberg, M.
1984-01-01
A geocoded data structure is a means for digitally representing a geographically referenced map or image. The characteristics of representative cellular, linked, and hybrid geocoded data structures are reviewed. The data processing requirements of Earth science projects at the Goddard Space Flight Center and the basic tools of geographic data processing are described. Specific ways that new geocoded data structures can be used to adapt these tools to scientists' needs are presented. These include: expanding analysis and modeling capabilities; simplifying the merging of data sets from diverse sources; and saving computer storage space.
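As one concrete reading of a cellular geocoded structure, the sketch below stores values in a regular lat/lon grid and maps coordinates to cells; the resolution and API are invented for illustration and do not reproduce the structures reviewed in the paper.

```python
# Sketch of a cellular geocoded structure: a regular lat/lon grid.
# Resolution and API are invented for illustration.
import numpy as np

class CellGrid:
    def __init__(self, cell_deg: float = 0.5):
        self.cell_deg = cell_deg
        rows, cols = int(180 / cell_deg), int(360 / cell_deg)
        self.data = np.full((rows, cols), np.nan)

    def index(self, lat: float, lon: float) -> tuple:
        # Row 0 is the northernmost band; clamp the edges into the last cell.
        row = min(int((90.0 - lat) / self.cell_deg), self.data.shape[0] - 1)
        col = min(int((lon + 180.0) / self.cell_deg), self.data.shape[1] - 1)
        return row, col

    def put(self, lat: float, lon: float, value: float) -> None:
        self.data[self.index(lat, lon)] = value

grid = CellGrid()
grid.put(38.99, -76.85, 21.5)  # one observation near Goddard Space Flight Center
```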
Surface-Water Quality-Assurance Plan for the USGS Wisconsin Water Science Center
Garn, H.S.
2007-01-01
This surface-water quality-assurance plan documents the standards, policies, and procedures used by the Wisconsin Water Science Center of the U.S. Geological Survey, Water Resources Discipline, for activities related to the collection, processing, storage, analysis, management, and publication of surface-water data. The roles and responsibilities of Water Science Center personnel in following these policies and procedures, including those related to safety and training, are presented.
NIMH Prototype Management Information System for Community Mental Health Centers
Wurster, Cecil R.; Goodman, John D.
1980-01-01
Various approaches to centralized support of computer applications in health care are described. The NIMH project to develop a prototype Management Information System (MIS) for community mental health centers is presented and discussed as a centralized development of an automated data processing system for multiple user organizations. The NIMH program is summarized, the prototype MIS is characterized, and steps taken to provide for the differing needs of the mental health centers are highlighted.
Development of Medical Technology for Contingency Response to Marrow Toxic Agents
1. Contingency Preparedness: Collect information from transplant centers, build awareness of the Transplant Center Contingency Planning Committee and...Matched Donors: Increase operational efficiencies that accelerate the search process and increase patient access are key to preparedness in a contingency ...Transplantation: Create a platform that facilitates multicenter collaboration and data management.
Runaways in Juvenile Courts. OJJDP Update on Statistics.
ERIC Educational Resources Information Center
Sickmund, Melissa
The National Center for Juvenile Justice (NCJJ) analyzed records in the Center's National Juvenile Court Data Archive to examine how the juvenile courts handled runaway cases. NCJJ examined 40,000 records of runaway cases processed between 1985 and 1986 in 611 jurisdictions from 12 states representing about one-quarter of the U.S. youth population…
Pre-Application Meeting for TCGA Expansion RFA - TCGA
The National Cancer Institute (NCI) hosted a pre-application meeting for TCGA funding of Genome Characterization Centers and Genome Data Analysis Centers. At this pre-application meeting, NCI staff presented on the goals and objectives for the TCGA Research Network, discussed the Request for Applications (RFA) peer review process and answered questions.
2003-10-27
KENNEDY SPACE CENTER, FLA. - In the Orbiter Processing Facility, Bill Prosser (left) and Eric Madaras, NASA-Langley Research Center, conduct impulse tests on the right wing leading edge (WLE) of Space Shuttle Endeavour. The tests monitor how sound impulses propagate through the WLE area. The data collected will be analyzed to explore the possibility of adding new instrumentation to the wing that could automatically detect debris or micrometeoroid impacts on the Shuttle while in flight. The study is part of the ongoing initiative at KSC and around the agency to return the orbiter fleet to flight status.
Data collection and evaluation for experimental computer science research
NASA Technical Reports Server (NTRS)
Zelkowitz, Marvin V.
1983-01-01
The Software Engineering Laboratory has been monitoring software development at NASA Goddard Space Flight Center since 1976. The data collection activities of the Laboratory and some of the difficulties of obtaining reliable data are described. In addition, the application of this data collection process to a current prototyping experiment is reviewed.
Cancer Reporting: Timeliness Analysis and Process Reengineering
ERIC Educational Resources Information Center
Jabour, Abdulrahman M.
2016-01-01
Introduction: Cancer registries collect tumor-related data to monitor incidence rates and support population-based research. A common concern with using population-based registry data for research is reporting timeliness. Data timeliness has been recognized as an important data characteristic by both the Centers for Disease Control and Prevention…
FJET Database Project: Extract, Transform, and Load
NASA Technical Reports Server (NTRS)
Samms, Kevin O.
2015-01-01
The Data Mining & Knowledge Management team at Kennedy Space Center is providing data management services to the Frangible Joint Empirical Test (FJET) project at Langley Research Center (LaRC). FJET is a project under the NASA Engineering and Safety Center (NESC). The purpose of FJET is to conduct an assessment of mild detonating fuse (MDF) frangible joints (FJs) for human spacecraft separation tasks in support of the NASA Commercial Crew Program. The Data Mining & Knowledge Management team has been tasked with creating and managing a database for the efficient storage and retrieval of FJET test data. This paper details the Extract, Transform, and Load (ETL) process as it relates to gathering FJET test data into a Microsoft SQL relational database and making that data available to data users. Lessons learned, procedures implemented, and programming code samples are discussed to help detail the learning experienced as the Data Mining & Knowledge Management team adapted to changing requirements and new technology while maintaining flexibility of design in various aspects of the data management project.
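The paper's actual code samples are not reproduced in the abstract; the sketch below shows a minimal ETL pass of the kind described, with the server, database, table, column, and file names all assumed for illustration (pyodbc against SQL Server):

    import csv
    import pyodbc

    # All names (server, database, table, columns, file) are illustrative only.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=fjet-db;DATABASE=FJET;Trusted_Connection=yes"
    )
    cursor = conn.cursor()

    # Extract: read raw test measurements exported from the instrumentation.
    with open("fjet_test_042.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: coerce types and discard obviously invalid records.
    cleaned = [
        (r["test_id"], float(r["time_s"]), float(r["strain"]))
        for r in rows
        if r["strain"] not in ("", "NaN")
    ]

    # Load: bulk-insert into the relational store.
    cursor.fast_executemany = True
    cursor.executemany(
        "INSERT INTO TestMeasurements (TestId, TimeSeconds, Strain) VALUES (?, ?, ?)",
        cleaned,
    )
    conn.commit()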
Research studies on advanced optical module/head designs for optical devices
NASA Technical Reports Server (NTRS)
Burke, James J.
1991-01-01
A summary is presented of research in optical data storage materials and of research at the center. The first section contains summary reports under the general headings of: (1) magnetooptic media: modeling, design, fabrication, characterization, and testing; (2) optical heads: holographic optical elements; and (3) optical heads: integrated optics. The second section consists of a proposal entitled "Signal Processing Techniques for Optical Data Storage." Section three presents various publications prepared by the center.
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.
2017-12-01
This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: (1) accepting the data package from the data providers and ensuring the full integrity of the data files; (2) identifying and addressing data quality issues; (3) assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; (4) setting up data access mechanisms; (5) setting up the data in data tools and services for improved data dissemination and user experience; (6) registering the dataset in online search and discovery catalogues; and (7) preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate the above process and realize efficiencies. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
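The workflow system itself is not shown in the abstract; purely as a sketch of the staged structure implied by the seven steps, with every function, field, URL, and DOI invented for illustration:

    # Hypothetical step functions mirroring the publication steps listed above.
    def verify_integrity(pkg):
        pkg["checksums_ok"] = True               # checksums on every file
        return pkg

    def check_quality(pkg):
        pkg["quality_issues"] = []               # identify/address quality issues
        return pkg

    def assemble_metadata(pkg):
        pkg["metadata"] = {"files": pkg["files"]}
        return pkg

    def setup_access(pkg):
        pkg["access_url"] = "https://daac.example/ds1"   # invented URL
        return pkg

    def register_in_catalogues(pkg):
        pkg["catalogued"] = True
        return pkg

    def mint_doi(pkg):
        pkg["doi"] = "10.9999/hypothetical"              # invented DOI
        return pkg

    def publish_dataset(pkg):
        for step in (verify_integrity, check_quality, assemble_metadata,
                     setup_access, register_in_catalogues, mint_doi):
            pkg = step(pkg)
        return pkg

    print(publish_dataset({"files": ["soil_moisture_2017.nc"]}))

A staged design like this makes it straightforward to track which step each dataset has reached, which is one of the workflow system's stated goals.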
Kalvelage, T.; Willems, Jennifer
2003-01-01
The design of the EOS Data and Information System (EOSDIS) to acquire, archive, manage, and distribute Earth observation data to the broadest possible user community is discussed. Several integrated retrieval, processing, and distribution capabilities are explained, the value of these functions to users is described, and potential future improvements are laid out. Users were interested in having the retrieval, processing, and archiving systems integrated so that they can get the data they want in the format and delivery mechanism of their choice.
Accessing technical data bases using STDS: A collection of scenarios
NASA Technical Reports Server (NTRS)
Hardgrave, W. T.
1975-01-01
A line-by-line description is given of sessions using the set-theoretic data system (STDS) to interact with technical data bases. The data bases contain data from actual applications at NASA Langley Research Center. The report is meant to be a tutorial document that accompanies "Set Processing in a Network Environment."
Measurements of the center-of-mass energies at BESIII via the di-muon process
NASA Astrophysics Data System (ADS)
Ablikim, M.; N. Achasov, M.; C. Ai, X.; Albayrak, O.; Albrecht, M.; J. Ambrose, D.; Amoroso, A.; An, F. F.; An, Q.; Bai, J. Z.; Baldini, Ferroli R.; Ban, Y.; Bennett, D. W.; Bennett, J. V.; Bertani, M.; Bettoni, D.; Bian, J. M.; Bianchi, F.; Boger, E.; Boyko, I.; Briere, R. A.; Cai, H.; Cai, X.; Cakir, O.; Calcaterra, A.; Cao, G. F.; Cetin, S. A.; Chang, J. F.; Chelkov, G.; Chen, G.; Chen, H. S.; Chen, H. Y.; Chen, J. C.; Chen, M. L.; Chen, S. J.; Chen, X.; Chen, X. R.; Chen, Y. B.; Cheng, H. P.; Chu, X. K.; Cibinetto, G.; Dai, H. L.; Dai, J. P.; Dbeyssi, A.; Dedovich, D.; Y. Deng, Z.; Denig, A.; Denysenko, I.; Destefanis, M.; De Mori, F.; Ding, Y.; Dong, C.; Dong, J.; Dong, L. Y.; Dong, M. Y.; Du, S. X.; Duan, P. F.; Fan, J. Z.; Fang, J.; Fang, S. S.; Fang, X.; Fang, Y.; Fava, L.; Feldbauer, F.; Felici, G.; Feng, C. Q.; Fioravanti, E.; Fritsch, M.; Fu, C. D.; Gao, Q.; Gao, X. L.; Gao, X. Y.; Gao, Y.; Gao, Z.; Garzia, I.; Goetzen, K.; Gong, W. X.; Gradl, W.; Greco, M.; Gu, M. H.; Gu, Y. T.; Guan, Y. H.; Guo, A. Q.; Guo, L. B.; Guo, Y.; Guo, Y. P.; Haddadi, Z.; Hafner, A.; Han, S.; Q. Hao, X. Q.; Harris, F. A.; He, K. L.; Held, T.; Heng, Y. K.; Hou, Z. L.; Hu, C.; Hu, H. M.; Hu, J. F.; Hu, T.; Hu, Y.; Huang, G. M.; Huang, G. S.; Huang, J. S.; Huang, X. T.; Huang Y.; Hussain, T.; Ji, Q.; Ji, Q. P.; Ji, X. B.; Ji, X. L.; Jiang, L. W.; Jiang, X. S.; Jiang, X. Y.; Jiao, J. B.; Jiao, Z.; Jin, D. P.; Jin, S.; Johansson, T.; Julin, A.; Kalantar-Nayestanaki, N.; Kang, X. L.; Kang, X. S.; Kavatsyuk, M.; Ke, B. C.; Kiese, P.; Kliemt, R.; Kloss, B.; Kolcu, O. B.; Kopf, B.; Kornicer, M.; Kühn, W.; Kupsc, A.; Lange, J. S.; Lara, M.; Larin, P.; Leng, C.; Li, C.; Cheng, Li; Li, D. M.; Li, F.; Li, F. Y.; Li, G.; Li, H. B.; Li, J. C.; Li, Jin; Li, K.; Li, K.; Li, Lei; Li, P. R.; Li, T.; Li, W. D.; Li, W. G.; Li, X. L.; Li, X. M.; Li, X. N.; Li, X. Q.; Li, Z. B.; Liang, H.; Liang, Y. F.; Liang, Y. T.; Liao, G. R.; Lin, X.; Liu, B. J.; Liu, C. X.; Liu, D.; Liu, F. H.; Fang, Liu; Feng, Liu; Liu, H. B.; Liu, H. H.; Liu, H. H.; Liu, H. M.; Liu, J.; Liu, J. B.; Liu, J. P.; Liu, J. Y.; Liu, K.; Liu, K. Y.; Liu, L. D.; Liu, P. L.; Liu, Q.; Liu, S. B.; Liu, X.; Liu, Y. B.; Liu, Z. A.; Liu, Zhiqing; Loehner, H.; Lou, X. C.; Lu, H. J; Lu, J. G.; Lu, Y.; Lu, Y. P.; Luo, C. L.; Luo, M. X.; Luo, T.; Luo, X. L.; Lyu, X. R.; Ma, F. C.; Ma, H. L.; Ma, L. L.; Ma, Q. M.; Ma, T.; Ma, X. N.; Ma, X. Y.; Maas, F. E.; Maggiora, M.; Mao, Y. Y.; Mao, Z. P.; Marcello, S.; Messchendorp, J. G.; Min, J.; Mitchell, R. E.; Mo, X. H.; Mo, Y. J.; Morales Morales, C.; Moriya, K.; Muchnoi, N. Yu.; Muramatsu, H.; Nefedov, Y.; Nerling, F.; Nikolaev, I. B.; Ning, Z.; Nisar, S.; Niu, S. L.; Niu, X. Y.; Olsen, S. L.; Ouyang, Q.; Pacetti, S.; Pan, Y.; Patteri, P.; Pelizaeus, M.; Peng, H. P.; Peters, K.; Pettersson, J.; Ping, J. L.; Ping, R. G.; Poling, R.; Prasad, V.; Qi, M.; Qian, S.; Qiao, C. F.; Qin, L. Q.; Qin, N.; Qin, X. S.; Qin, Z. H.; Qiu, J. F.; Rashid, K. H.; Redmer, C. F.; Ripka, M.; Rong, G.; Rosner, Ch.; Ruan, X. D.; Santoro, V.; Sarantsev, A. A.; Savrié, M.; Schoenning, B. K.; Schumann, S.; Shan, W.; Shao, M.; Shen, C. P.; Shen, P. X.; Shen, X. Y.; Sheng, H. Y.; Song, W. M.; Song, X. Y.; Sosio, S.; Spataro, S.; Sun, G. X.; Sun, J. F.; Sun, S. S.; Sun, Y. J.; Sun, Y. Z.; Sun, Z. J.; Sun, Z. T.; Tang, C. J.; Tang, X.; Tapan, I.; Thorndike, E. H.; Tiemens, M.; Ullrich, M.; Uman, I.; Varner, G. S.; Wang, B.; Wang, D.; Wang, D. Y.; Wang, K.; Wang, L. L.; Wang, L. S.; Wang, M.; Wang, P.; Wang, P. L.; Wang, S. G.; Wang, W.; Wang, W. 
P.; Wang, X. F.; Wang, Y. D.; Wang, Y. F.; Wang, Y. Q.; Wang, Z.; Wang, Z. G.; Wang, Z. H.; Wang, Z. Y.; Weber, T.; Wei, D. H.; Wei, J. B.; Weidenkaff, P.; Wen, S. P.; Wiedner, U.; Wolke, M.; Wu, L. H.; Wu, Z.; Xia, L.; Xia, L. G.; Xia, Y.; Xiao, D.; Xiao, H.; Xiao, Z. J.; Xie, Y. G.; Xiu, Q. L.; Xu, G. F.; Xu, L.; Xu, Q. J.; Xu, X. P.; Yan, L.; Yan, W. B.; Yan, W. C.; Yan, Y. H.; Yang, H. J.; Yang, H. X.; Yang, L.; Yang, Y.; Yang, Y. X.; Ye, M.; Ye, M. H.; Yin, J. H.; Yu, B. X.; Yu, C. X.; Yu, J. S.; Yuan, C. Z.; Yuan, W. L.; Yuan, Y.; Yuncu, A.; Zafar, A. A.; Zallo, A.; Zeng, A. Y.; Zeng, Z.; Zhang, B. X.; Zhang, B. Y.; Zhang, C.; Zhang, C. C.; Zhang, D. H.; Zhang, H. H.; Zhang, H. Y.; Zhang, J. J.; Zhang, J. L.; Zhang, J. Q.; Zhang, J. W.; Zhang, J. Y.; Zhang, J. Z.; Zhang, K.; Zhang, L.; Zhang, X. Y.; Zhang, Y.; Zhang, Y. N.; Zhang, Y. H.; Zhang, Y. T.; Zhang, Yu; Zhang, Z. H.; Zhang, Z. P.; Zhang, Z. Y.; Zhao, G.; Zhao, J. W.; Zhao, J. Y.; Zhao, J. Z.; Zhao, Lei; Zhao, Ling; Zhao, M. G.; Zhao, Q.; Zhao, Q. W.; Zhao, S. J.; Zhao, T. C.; Zhao, Y. B.; Zhao, Z. G.; Zhemchugov, A.; Zheng, B.; Zheng, J. P.; Zheng, W. J.; Zheng, Y. H.; Zhong, B.; Zhou, L.; Zhou, X.; Zhou, X. K.; Zhou, X. R.; Zhou, X. Y.; Zhu, K.; Zhu, K. J.; Zhu, S.; , S. H.; Zhu, X. L.; Zhu, Y. C.; Zhu, Y. S.; Zhu, Z. A.; Zhuang, J.; Zotti, L.; Zou, B. S.; Zou, J. H.; BESIII Collaboration
2016-06-01
From 2011 to 2014, the BESIII experiment collected about 5 fb-1 data at center-of-mass energies around 4 GeV for the studies of the charmonium-like and higher excited charmonium states. By analyzing the di-muon process e+e- → γISR/FSRμ+μ-, the center-of-mass energies of the data samples are measured with a precision of 0.8 MeV. The center-of-mass energy is found to be stable for most of the time during data taking. Supported by National Key Basic Research Program of China (2015CB856700), National Natural Science Foundation of China (11125525, 11235011, 11322544, 11335008, 11425524, Y61137005C), Chinese Academy of Sciences (CAS) Large-Scale Scientific Facility Program, CAS Center for Excellence in Particle Physics (CCEPP), Collaborative Innovation Center for Particles and Interactions (CICPI), Joint Large-Scale Scientific Facility Funds of NSFC and CAS (11179007, U1232201, U1332201), CAS (KJCX2-YW-N29, KJCX2-YW-N45), 100 Talents Program of CAS, National 1000 Talents Program of China, INPAC and Shanghai Key Laboratory for Particle Physics and Cosmology, German Research Foundation DFG (Collaborative Research Center CRC-1044), Istituto Nazionale di Fisica Nucleare, Italy, Ministry of Development of Turkey (DPT2006K-120470), Russian Foundation for Basic Research (14-07-91152), Swedish Research Council, U. S. Department of Energy (DE-FG02-04ER41291, DE-FG02-05ER41374, DE-FG02-94ER40823, DESC0010118), U.S. National Science Foundation, University of Groningen (RuG) and Helmholtzzentrum fuer Schwerionenforschung GmbH (GSI), Darmstadt, WCU Program of National Research Foundation of Korea (R32-2008-000-10155-0).
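The abstract does not spell out the kinematics; schematically (a sketch of the idea only, not the collaboration's full treatment, which models radiative effects in detail), the center-of-mass energy follows from the di-muon invariant mass:

    M_{\mu\mu} = \sqrt{\left(E_{\mu^+} + E_{\mu^-}\right)^2
                       - \left|\vec{p}_{\mu^+} + \vec{p}_{\mu^-}\right|^2},
    \qquad
    E_{\rm cm} \simeq M_{\mu\mu} + \Delta E_{\rm rad},

where \Delta E_{\rm rad} accounts for the energy carried away by ISR/FSR photons.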
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voyles, Jimmy
Individual datastreams from instrumentation at the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility fixed and mobile research observatories (sites) are collected and routed to the ARM Data Center (ADC). The Data Management Facility (DMF), a component of the ADC, executes datastream processing in near-real time. Processed data are then delivered approximately daily to the ARM Data Archive, also a component of the ADC, where they are made freely available to the research community. For each instrument, ARM calculates the ratio of the actual number of processed data records received daily at the ARM Data Archive to the expected number of data records. DOE requires national user facilities to report time-based operating data.
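The availability metric described is a simple ratio; a minimal illustration in Python (the numbers are hypothetical):

    def data_availability(received_records: int, expected_records: int) -> float:
        """Daily availability ratio of processed data records for one instrument."""
        if expected_records <= 0:
            raise ValueError("expected_records must be positive")
        return received_records / expected_records

    # Example: 1,380 records received out of 1,440 expected (one per minute).
    print(f"{data_availability(1380, 1440):.1%}")  # 95.8%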
ACES MWL data analysis center at SYRTE
NASA Astrophysics Data System (ADS)
Meynadier, F.; Delva, P.; le Poncin-Lafitte, C.; Guerlin, C.; Laurent, P.; Wolf, P.
2017-12-01
The ACES-PHARAO mission aims at operating a cold-atom caesium clock on board the International Space Station and performing two-way time transfer with ground terminals, in order to allow highly accurate and stable comparisons of its internal timescale with those kept by various metrology institutes. Scientific goals in fundamental physics include tests of the gravitational redshift with unprecedented accuracy and a search for violations of local Lorentz invariance. As launch comes closer, we are getting ready to process the data expected from the ACES Microwave Link (MWL) once it is on board the International Space Station. Several hurdles have been cleared in our software in the past months, as we managed to implement algorithms that reach the target accuracy for ground/space desynchronisation measurements. I will present the current status of data analysis preparation, as well as the activities that will take place at SYRTE to set up its data processing center.
[An expert system of aiding decision making in breast pathology connected to a clinical data base].
Brunet, M; Durrleman, S; Ferber, J; Ganascia, J G; Hacene, K; Hirt, F; Jouniaux, F; Meeus, L
1987-01-01
The René Huguenin Cancer Center holds a medical file for each patient, which is intended to store and process medical data. Computerization was introduced in 1970: a development plan was elaborated and, simultaneously, a statistical software package (Clotilde--GSI/CFRO) was selected. Thus, we now have access to a large database, structured according to medical rationale, that can be used with methods of artificial intelligence towards three objectives: improved data acquisition, decision making, and exploitation. The first application was to breast pathology, which represents one of the Center's primary activities. The structure of the data concerning patients is by all criteria part of the medical knowledge, and this information needs to be presented as well as processed with a suitable language. To this end, we chose an object-oriented language, Mering II, usable on Apple and IBM 4 micro-computers. This project has already allowed us to work out an operational model.
NASA Technical Reports Server (NTRS)
Miller, David N.
1989-01-01
The NASA Johnson Space Center's new Multiprogram Control Center (MPCC) addresses the control requirements of complex STS payloads as well as unmanned vehicles. An account is given of the relationship of the MPCC to the STS Mission Control Center, with attention to significant difficulties that may be encountered and the solutions thus far devised for generic problems. Examples of MPCC workstation applications encompass telemetry decommutation, engineering unit conversion, database management, trajectory processing, and flight design.
A Comprehensive Computer Package for Ambulatory Surgical Facilities
Kessler, Robert R.
1980-01-01
Ambulatory surgical centers are a cost effective alternative to hospital surgery. Their increasing popularity has contributed to heavy case loads, an accumulation of vast amounts of medical and financial data and economic pressures to maintain a tight control over “cash flow”. Computerization is now a necessity to aid ambulatory surgical centers to maintain their competitive edge. An on-line system is especially necessary as it allows interactive scheduling of surgical cases, immediate access to financial data and rapid gathering of medical and statistical information. This paper describes the significant features of the computer package in use at the Salt Lake Surgical Center, which processes 500 cases per month.
Virtual Network Configuration Management System for Data Center Operations and Management
NASA Astrophysics Data System (ADS)
Okita, Hideki; Yoshizawa, Masahiro; Uehara, Keitaro; Mizuno, Kazuhiko; Tarui, Toshiaki; Naono, Ken
Virtualization technologies are widely deployed in data centers to improve system utilization. However, they increase the workload for operators, who have to manage the structure of virtual networks in data centers. A virtual-network management system that automates the integration of virtual-network configurations is presented. The proposed system collects the configurations from server virtualization platforms and VLAN-supported switches and integrates these configurations according to a newly developed XML-based management information model for virtual-network configurations. Preliminary evaluations show that the proposed system helps operators by reducing the time to acquire the configurations from devices and to correct inconsistencies in the configuration management database by about 40 percent. Further, they show that the proposed system has excellent scalability: the system takes less than 20 minutes to acquire the virtual-network configurations from a large-scale network that includes 300 virtual machines. These results imply that the proposed system is effective for improving the configuration management process for virtual networks in data centers.
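The paper's XML management information model is not reproduced in the abstract; the sketch below shows, with invented element and attribute names, how such a virtual-network configuration record might be assembled with Python's standard library:

    import xml.etree.ElementTree as ET

    # Hypothetical schema: all element and attribute names are illustrative.
    def virtual_network_record(vlan_id, switch, vms):
        net = ET.Element("virtualNetwork", vlan=str(vlan_id))
        ET.SubElement(net, "switch", name=switch)
        for vm in vms:  # virtual machines attached to this VLAN
            ET.SubElement(net, "virtualMachine", name=vm)
        return net

    root = ET.Element("dataCenterConfig")
    root.append(virtual_network_record(101, "sw-edge-01", ["vm-web-1", "vm-web-2"]))
    print(ET.tostring(root, encoding="unicode"))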
User-centered design and the development of patient decision aids: protocol for a systematic review.
Witteman, Holly O; Dansokho, Selma Chipenda; Colquhoun, Heather; Coulter, Angela; Dugas, Michèle; Fagerlin, Angela; Giguere, Anik Mc; Glouberman, Sholom; Haslett, Lynne; Hoffman, Aubri; Ivers, Noah; Légaré, France; Légaré, Jean; Levin, Carrie; Lopez, Karli; Montori, Victor M; Provencher, Thierry; Renaud, Jean-Sébastien; Sparling, Kerri; Stacey, Dawn; Vaisson, Gratianne; Volk, Robert J; Witteman, William
2015-01-26
Providing patient-centered care requires that patients partner in their personal health-care decisions to the full extent desired. Patient decision aids facilitate processes of shared decision-making between patients and their clinicians by presenting relevant scientific information in balanced, understandable ways, helping clarify patients' goals, and guiding decision-making processes. Although international standards stipulate that patients and clinicians should be involved in decision aid development, little is known about how such involvement currently occurs, let alone best practices. This systematic review consisting of three interlinked subreviews seeks to describe current practices of user involvement in the development of patient decision aids, compare these to practices of user-centered design, and identify promising strategies. A research team that includes patient and clinician representatives, decision aid developers, and systematic review method experts will guide this review according to the Cochrane Handbook and PRISMA reporting guidelines. A medical librarian will hand search key references and use a peer-reviewed search strategy to search MEDLINE, EMBASE, PubMed, Web of Science, the Cochrane Library, the ACM library, IEEE Xplore, and Google Scholar. We will identify articles across all languages and years describing the development or evaluation of a patient decision aid, or the application of user-centered design or human-centered design to tools intended for patient use. Two independent reviewers will assess article eligibility and extract data into a matrix using a structured pilot-tested form based on a conceptual framework of user-centered design. We will synthesize evidence to describe how research teams have included users in their development process and compare these practices to user-centered design methods. If data permit, we will develop a measure of the user-centeredness of development processes and identify practices that are likely to be optimal. This systematic review will provide evidence of current practices to inform approaches for involving patients and other stakeholders in the development of patient decision aids. We anticipate that the results will help move towards the establishment of best practices for the development of patient-centered tools and, in turn, help improve the experiences of people who face difficult health decisions. PROSPERO CRD42014013241.
Redesign of occupational health service operations--strategic planning and evaluation.
Tobias, Beverley; Burnes-Line, Bernadette; Pellarin, Margaret
2008-10-01
This article describes the strategic planning process used by a major academic medical center to redesign the employee health service. The steps in the process are discussed and data demonstrating the success of the program redesign are presented.
Alternative Fuels Data Center: Ethanol Production
States is produced from starch-based crops by dry- or wet-mill processing. Nearly 90% of ethanol plants are dry mills due to lower capital costs. Dry-milling is a process that grinds corn into flour and
A Study of Flood Evacuation Center Using GIS and Remote Sensing Technique
NASA Astrophysics Data System (ADS)
Mustaffa, A. A.; Rosli, M. F.; Abustan, M. S.; Adib, R.; Rosli, M. I.; Masiri, K.; Saifullizan, B.
2016-07-01
This research demonstrated the use of Remote Sensing techniques and GIS to determine the suitability of evacuation centers. The study was conducted in the Batu Pahat areas that are frequently hit by floods. Digital Elevation Model (DEM) data obtained from the ASTER database were used to delineate and extract contour lines and elevation. A Landsat 8 image was used for classification purposes, such as producing a land use map. Remote Sensing incorporated with GIS techniques was used to determine suitable locations for evacuation centers from the contour map of flood-affected areas in Batu Pahat. GIS was used to calculate the elevation of each area together with information about the surrounding terrain, road access, and the percentage of the affected area. The flood-affected area map indicates the suitability of each flood evacuation center during several levels of flooding. The suitability of the evacuation centers was determined based on several criteria, and existing data on the evacuation centers were analysed. From the analysis, among the 16 evacuation centers listed, only 8 are suitable for use during an emergency situation. The suitability analysis was based on the location and road access of each evacuation center relative to the flood-affected area. Ten new locations meeting the suitability criteria are proposed in the study area to facilitate the process of rescuing and evacuating flood victims to safer and more suitable locations. The results of this study will help in decision-making processes and indirectly will help organizations such as firefighters and the Department of Social Welfare in their work. Thus, this study can contribute more towards society.
JSC earth resources data analysis capabilities available to EOD revision B
NASA Technical Reports Server (NTRS)
1974-01-01
A list and summary description of all Johnson Space Center electronic laboratory and photographic laboratory capabilities available to Earth Resources Division personnel for processing earth resources data are provided. The electronic capabilities pertain to those facilities and systems that use electronic and/or photographic products as output. The photographic capabilities pertain to equipment that uses photographic images as input and produces electronic and/or photographic products as output; a table summarizes the processing steps. A general hardware description is presented for each of the data processing systems, and the titles of computer programs are used to identify the capabilities and data flow.
Cost Analysis in a Multi-Mission Operations Environment
NASA Technical Reports Server (NTRS)
Felton, Larry; Newhouse, Marilyn; Bornas, Nick; Botts, Dennis; Ijames, Gayleen; Montgomery, Patty; Roth, Karl
2014-01-01
Spacecraft control centers have evolved from dedicated, single-mission or single mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights; to International Space Station (ISS) payloads; to small, short duration missions; and has included long duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and size and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology, but will be executed using future technology. Finally the cost analysis needed to consider how to validate the resulting cost models taking into account the non-homogeneous nature of the available cost data and the decreasing flight rate. This paper presents the issues encountered during the HOSC cost analysis process, and the associated lessons learned. These lessons can be used when planning for a new multi-mission operations center or in the transformation from a dedicated control center to multi-center operations, as an aid in defining processes that support future cost analysis and estimation. The lessons can also be used by mature service-oriented, multi-mission control centers to streamline or refine their cost analysis process.
The TESS science processing operations center
NASA Astrophysics Data System (ADS)
Jenkins, Jon M.; Twicken, Joseph D.; McCauliff, Sean; Campbell, Jennifer; Sanderfer, Dwight; Lung, David; Mansouri-Samani, Masoud; Girouard, Forrest; Tenenbaum, Peter; Klaus, Todd; Smith, Jeffrey C.; Caldwell, Douglas A.; Chacon, A. D.; Henze, Christopher; Heiges, Cory; Latham, David W.; Morgan, Edward; Swade, Daryl; Rinehart, Stephen; Vanderspek, Roland
2016-08-01
The Transiting Exoplanet Survey Satellite (TESS) will conduct a search for Earth's closest cousins starting in early 2018 and is expected to discover 1,000 small planets with Rp < 4 R⊕ and measure the masses of at least 50 of these small worlds. The Science Processing Operations Center (SPOC) is being developed at NASA Ames Research Center based on the Kepler science pipeline and will generate calibrated pixels and light curves on the NASA Advanced Supercomputing Division's Pleiades supercomputer. The SPOC will also search for periodic transit events and generate validation products for the transit-like features in the light curves. All TESS SPOC data products will be archived to the Mikulski Archive for Space Telescopes (MAST).
Hera: High Energy Astronomical Data Analysis via the Internet
NASA Astrophysics Data System (ADS)
Valencic, Lynne A.; Chai, P.; Pence, W.; Snowden, S.
2011-09-01
The HEASARC at NASA Goddard Space Flight Center has developed Hera, a data processing facility for analyzing high energy astronomical data over the internet. Hera provides all the software packages, disk space, and computing resources needed to do general processing of and advanced research on publicly available data from High Energy Astrophysics missions. The data and data products are kept on a server at GSFC and can be downloaded to a user's local machine. This service is provided for free to students, educators, and researchers for educational and research purposes.
NASA Technical Reports Server (NTRS)
Barrett, Charles A.
2003-01-01
The cyclic oxidation test results for some 1000 high-temperature commercial and experimental alloys have been collected in an EXCEL database. This database represents over thirty years of research at NASA Glenn Research Center in Cleveland, Ohio. The data is in the form of a series of runs of specific weight change versus time values for a set of samples tested at a given temperature, cycle time, and exposure time. Included in each run is a set of embedded plots of the critical data. The nature of the data is discussed, along with an analysis of the cyclic oxidation process. In addition, examples are given of how a set of results can be analyzed. The data is assembled on a read-only compact disk, which is available on request from the Materials Durability Branch, NASA Glenn Research Center, Cleveland, Ohio.
User's Guide for NODC's Data Processing Systems.
ERIC Educational Resources Information Center
Schuyler, Sonja, Comp.
The purpose of this Guide is to help those receiving data and data products from the National Oceanographic Data Center (NODC) to make better use of the material obtained. In addition, it should help data requesters to intelligently formulate inquiries based on a knowledge of the capabilities (and limitations) of the data base. Chapter I of the…
Status report: Data management program algorithm evaluation activity at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.
1977-01-01
An algorithm evaluation activity was initiated to study the problems associated with image processing by assessing the independent and interdependent effects of registration, compression, and classification techniques on LANDSAT data for several discipline applications. The objective of the activity was to make recommendations on selected applicable image processing algorithms in terms of accuracy, cost, and timeliness or to propose alternative ways of processing the data. As a means of accomplishing this objective, an Image Coding Panel was established. The conduct of the algorithm evaluation is described.
Evolution of International Space Station Program Safety Review Processes and Tools
NASA Technical Reports Server (NTRS)
Ratterman, Christian D.; Green, Collin; Guibert, Matt R.; McCracken, Kristle I.; Sang, Anthony C.; Sharpe, Matthew D.; Tollinger, Irene V.
2013-01-01
The International Space Station Program at NASA is constantly seeking to improve the processes and systems that support safe space operations. To that end, the ISS Program decided to upgrade its Safety and Hazard data systems with three goals: make safety and hazard data more accessible; better support the interconnection of different types of safety data; and increase the efficiency (and compliance) of safety-related processes. These goals are accomplished by moving data into a web-based structured data system that includes strong process support and supports integration with other information systems. Along with the data systems, ISS is evolving its submission requirements and safety process requirements to support the improved model. In contrast to existing operations (where paper processes and electronic file repositories are used for safety data management), the web-based solution provides the program with dramatically faster access to records, the ability to search for and reference specific data within records, reduced workload for hazard updates and approval, and process support including digital signatures and controlled record workflow. In addition, integration with other key data systems provides assistance with assessments of flight readiness, more efficient review and approval of operational controls, and better tracking of international safety certifications. This approach will also provide new opportunities to streamline the sharing of data with ISS international partners while maintaining compliance with applicable laws and respecting restrictions on proprietary data. One goal of this paper is to outline the approach taken by the ISS Program to determine requirements for the new system and to devise a practical and efficient implementation strategy. From conception through implementation, ISS and NASA partners utilized a user-centered software development approach focused on user research and iterative design methods, as employed by the Human Computer Interaction Group at NASA Ames Research Center. In particular, the approach emphasized the reduction of workload associated with document and data management activities so more resources can be allocated to the operational use of data in problem solving, safety analysis, and recurrence control. The methods and techniques used to understand existing processes and systems, to recognize opportunities for improvement, and to design and review improvements are described with the intent that similar techniques can be employed elsewhere in safety operations. A second goal of this paper is to provide an overview of the web-based data system implemented by ISS. The software selected for the ISS hazard system, the Mission Assurance System (MAS), is a NASA-customized variant of the open-source software project Bugzilla. The origin and history of MAS as a NASA software project and the rationale for (and advantages of) using open-source software are documented elsewhere (Green, et al., 2009).
Disaster recovery plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, D.E.
The BMS production implementation will be complete by October 1, 1998, and the server environment will comprise two types of platforms. The PassPort Supply and the PeopleSoft Financials will reside on UNIX servers, and the PeopleSoft Human Resources and Payroll will reside on Microsoft NT servers. Because of the wide scope and the requirements of the COTS products to run in various environments, backup and recovery responsibilities are divided between two groups in Technical Operations. The Central Computer Systems Management group provides support for the UNIX/NT Backup Data Center, and the Network Infrastructure Systems group provides support for the NT Application Server Backup outside the Data Center. The disaster recovery process is dependent on a good backup and recovery process. Information and integrated system data for determining the disaster recovery process is identified from the Fluor Daniel Hanford (FDH) Risk Assessment Plan, Contingency Plan, Backup and Recovery Plan, and Backup Form for HANDI 2000 BMS.
NASA Astrophysics Data System (ADS)
Pesaresi, Damiano; Sleeman, Reinoud
2010-05-01
Many medium- to large-sized seismic data centers around the world are facing the same question: which software should be used to acquire seismic data in real time? A home-made package or a commercial one? Both choices have pros and cons. The in-house development of software usually requires an increased investment in human resources rather than a financial investment; however, the advantage of fully meeting your own needs can be put in danger when the software engineer quits the job! Commercial software offers the advantage of being maintained, but it may require both a considerable financial investment and training. The main seismic data acquisition software suites available nowadays are the public-domain SeisComP and EarthWorm packages and the commercial package Antelope. Nanometrics, Guralp and RefTek also provide seismic data acquisition software, but these are mainly intended for single station/network acquisition. Antelope is a software package for real-time acquisition and processing of seismic network data, with its roots in the academic seismological community. The software is developed by Boulder Real Time Technology (BRTT) and commercialized by Kinemetrics. It is used by IRIS affiliates for off-line data processing and it is the main acquisition tool for the USArray program and data centers in Europe like the ORFEUS Data Center, OGS (Italy), ZAMG (Austria), ARSO (Slovenia) and GFU (Czech Republic). SeisComP was originally developed for the GEOFON global network to provide a system for data acquisition, data exchange (SeedLink protocol) and automatic processing. It has evolved into a widely distributed, networked seismographic system for data acquisition and real-time data exchange over the Internet and is supported by ORFEUS as the standard seismic data acquisition tool in Europe. SeisComP3 is the next generation of the software and was developed for the German Indonesian Tsunami Early Warning System (GITEWS). SeisComP is licensed by GFZ (free of charge) and maintained by a private company (GEMPA). EarthWorm was originally developed by the United States Geological Survey (USGS) to exchange data with Canadian seismologists. It is now used by several institutions around the world and is maintained and developed by a commercial software house, ISTI.
The NOAA Real-Time Solar-Wind (RTSW) System using ACE Data
NASA Astrophysics Data System (ADS)
Zwickl, R. D.; Doggett, K. A.; Sahm, S.; Barrett, W. P.; Grubb, R. N.; Detman, T. R.; Raben, V. J.; Smith, C. W.; Riley, P.; Gold, R. E.; Mewaldt, R. A.; Maruyama, T.
1998-07-01
The Advanced Composition Explorer (ACE) RTSW system continuously monitors the solar wind and produces warnings of impending major geomagnetic activity, up to one hour in advance. Warnings and alerts issued by NOAA allow those with systems sensitive to such activity to take preventative action. The RTSW system gathers solar wind and energetic particle data at high time resolution from four ACE instruments (MAG, SWEPAM, EPAM, and SIS), packs the data into a low-rate bit stream, and broadcasts the data continuously. NASA sends real-time data to NOAA each day when downloading science data. With a combination of dedicated ground stations (CRL in Japan and RAL in Great Britain) and time on existing ground tracking networks (NASA's DSN and the USAF's AFSCN), the RTSW system can receive data 24 hours per day throughout the year. The raw data are immediately sent from the ground station to the Space Environment Center in Boulder, Colorado, processed, and then delivered to its Space Weather Operations center, where they are used in daily operations; the data are also delivered to the CRL Regional Warning Center at Hiraiso, Japan, and to the USAF 55th Space Weather Squadron, and placed on the World Wide Web. The data are downloaded, processed, and dispersed within 5 minutes from the time they leave ACE. The RTSW system also uses the low-energy energetic particles to warn of approaching interplanetary shocks and to help monitor the flux of high-energy particles that can produce radiation damage in satellite systems.
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.
2017-12-01
Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public-facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near-real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public-facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from the use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE), will also be discussed.
Anatomy of a Security Operations Center
NASA Technical Reports Server (NTRS)
Wang, John
2010-01-01
Many agencies and corporations are either contemplating or in the process of building a cyber Security Operations Center (SOC). Those agencies that have established SOCs are most likely working on major revisions or enhancements to existing capabilities. As principal developers of the NASA SOC, the presenters' goal is to provide the GFIRST community with examples of some of the key building blocks of an agency-scale cyber Security Operations Center. This presentation will include the inputs and outputs, the facilities or shell, as well as the internal components and the processes necessary to maintain the SOC's subsistence; in other words, the anatomy of a SOC. Details to be presented include the SOC architecture and its key components: Tier 1 call center, data entry, and incident triage; Tier 2 monitoring, incident handling, and tracking; Tier 3 computer forensics, malware analysis, and reverse engineering; Incident Management System; Threat Management System; SOC Portal; Log Aggregation and Security Incident Management (SIM) systems; flow monitoring; IDS; etc. Specific processes and methodologies discussed include Incident States and associated Work Elements; the Incident Management Workflow Process; Cyber Threat Risk Assessment methodology; and Incident Taxonomy. The evolution of the cyber Security Operations Center will be discussed, from a reactive posture to an increasingly proactive one. Finally, the resources necessary to establish an agency-scale SOC, as well as the lessons learned in the process of standing one up, will be presented.
TESS Data Processing and Quick-look Pipeline
NASA Astrophysics Data System (ADS)
Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office
2018-01-01
We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.
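Neither the SPOC nor the QLP codebase is shown in the abstract; as a generic illustration of the transit-search step, the snippet below injects a box-shaped transit into synthetic 30-minute-cadence data and recovers its period with astropy's box least squares implementation (all numbers are invented):

    import numpy as np
    from astropy.timeseries import BoxLeastSquares

    rng = np.random.default_rng(42)
    t = np.arange(0, 27.4, 1 / 48)       # ~one TESS sector at 30-min cadence (days)
    flux = 1 + 1e-4 * rng.standard_normal(t.size)
    flux[(t % 3.7) < 0.1] -= 0.002       # injected 3.7-day planet, ~2.4 h, 2000 ppm

    bls = BoxLeastSquares(t, flux)
    results = bls.autopower(0.1)         # grid search with a 0.1-day trial duration
    best = np.argmax(results.power)
    print(f"Recovered period: {results.period[best]:.3f} d")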
Gold, Rachel; Cottrell, Erika; Bunce, Arwen; Middendorf, Mary; Hollombe, Celine; Cowburn, Stuart; Mahr, Peter; Melgar, Gerardo
2017-01-01
"Social determinants of heath" (SDHs) are nonclinical factors that profoundly affect health. Helping community health centers (CHCs) document patients' SDH data in electronic health records (EHRs) could yield substantial health benefits, but little has been reported about CHCs' development of EHR-based tools for SDH data collection and presentation. We worked with 27 diverse CHC stakeholders to develop strategies for optimizing SDH data collection and presentation in their EHR, and approaches for integrating SDH data collection and the use of those data (eg, through referrals to community resources) into CHC workflows. We iteratively developed a set of EHR-based SDH data collection, summary, and referral tools for CHCs. We describe considerations that arose while developing the tools and present some preliminary lessons learned. Standardizing SDH data collection and presentation in EHRs could lead to improved patient and population health outcomes in CHCs and other care settings. We know of no previous reports of processes used to develop similar tools. This article provides an example of 1 such process. Lessons from our process may be useful to health care organizations interested in using EHRs to collect and act on SDH data. Research is needed to empirically test the generalizability of these lessons. © Copyright 2017 by the American Board of Family Medicine.
National Water Model: Providing the Nation with Actionable Water Intelligence
NASA Astrophysics Data System (ADS)
Aggett, G. R.; Bates, B.
2017-12-01
The National Water Model (NWM) provides national, street-level detail of water movement through time and space. Operating hourly, this flood of information offers enormous benefits in the form of water resource management, natural disaster preparedness, and the protection of life and property. The Geo-Intelligence Division at the NOAA National Water Center supplies forecasters and decision-makers with timely, actionable water intelligence through the processing of billions of NWM data points every hour. These datasets include current streamflow estimates, short- and medium-range streamflow forecasts, and many other ancillary datasets. The sheer amount of NWM data produced yields a dataset too large to allow for direct human comprehension. As such, it is necessary to undergo model data post-processing, filtering, and data ingestion by visualization web apps that make use of cartographic techniques to bring attention to the areas of highest urgency. This poster illustrates NWM output post-processing and cartographic visualization techniques being developed and employed by the Geo-Intelligence Division at the NOAA National Water Center to provide national actionable water intelligence.
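The division's production pipelines are not shown in the abstract; the fragment below is a hedged sketch of the kind of filtering step described, with the file name, variable names, and threshold all assumed rather than taken from National Water Center code:

    import xarray as xr

    # Hypothetical NWM short-range channel-routing output file.
    ds = xr.open_dataset("nwm.t00z.short_range.channel_rt.f001.conus.nc")

    flow = ds["streamflow"]      # assumed discharge variable, one value per reach
    THRESHOLD = 500.0            # illustrative action threshold (m^3/s)
    flagged = ds["feature_id"].where(flow > THRESHOLD, drop=True)

    print(f"{flagged.size} reaches exceed {THRESHOLD} m^3/s this hour")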
Estimate benefits of crowdsourced data from social media.
DOT National Transportation Integrated Search
2014-12-01
Traffic Management Centers (TMCs) acquire, process, and integrate data in a variety of ways to support real-time operations. Crowdsourcing has been identified as one of the top trends and technologies that traffic management agencies can adapt and ta...
Role and interest of new technologies in data processing for space control centers
NASA Astrophysics Data System (ADS)
Denier, Jean-Paul; Caspar, Raoul; Borillo, Mario; Soubie, Jean-Luc
1990-10-01
The ways in which a multidisciplinary approach will improve space control centers are discussed. Electronic documentation, ergonomics of human-computer interfaces, natural language, intelligent tutoring systems, and artificial intelligence systems are considered and applied in the study of the Hermes flight control center. It is concluded that such technologies are best integrated into a classical operational environment rather than taking a revolutionary approach which would involve a global modification of the system.
Expanding the Role of an Earth Science Data System: The GHRC Innovations Lab
NASA Astrophysics Data System (ADS)
Conover, H.; Ramachandran, R.; Smith, T.; Kulkarni, A.; Maskey, M.; He, M.; Keiser, K.; Graves, S. J.
2013-12-01
The Global Hydrology Resource Center (GHRC) is a NASA Earth Science Distributed Active Archive Center (DAAC), managed in partnership by the Earth Science Department at NASA's Marshall Space Flight Center and the University of Alabama in Huntsville's Information Technology and Systems Center (ITSC). Established in 1991, the GHRC processes, archives, and distributes global lightning data from space, airborne and ground-based observations from hurricane science field campaigns and Global Precipitation Mission (GPM) ground validation experiments, and satellite passive microwave products. GHRC's close association with the University provides a path for technology infusion from the research center into the data center. The ITSC has a long history of designing and operating science data and information systems. In addition to the GHRC and related data management projects, the ITSC also conducts multidisciplinary research in many facets of information technology. The coupling of ITSC research with the operational GHRC Data Center has enabled the development of new technologies that directly impact the ability of researchers worldwide to apply Earth science data to their specific domains of interest. The GHRC Innovations Lab will provide a showcase for emerging geoinformatics technologies resulting from NASA-sponsored research at the ITSC. Research products to be deployed in the Innovations Lab include: * Data Albums - curated collections of information related to a specific science topic or event with links to relevant data files from different sources. * Data Prospecting - combines automated data mining techniques with user interaction to provide for quick exploration of large volumes of data. * Provenance Browser - provides for graphical exploration of data lineage and related contextual information. In the Innovations Lab, these technologies can be targeted to GHRC data sets and tuned to address GHRC user interests. As technologies are tested and matured in the Innovations Lab, the most promising will be selected for incorporation into the GHRC's online tool suite.
Early Citability of Data vs Peer-Review like Data Publishing Procedures
NASA Astrophysics Data System (ADS)
Stockhause, Martina; Höck, Heinke; Toussaint, Frank; Lautenschlager, Michael
2014-05-01
The World Data Center for Climate (WDCC), hosted at the German Climate Computing Center (DKRZ), was one of the first data centers to establish a peer-review-like data publication procedure resulting in DataCite DOIs. Data in the long-term archive (LTA) is diligently reviewed by data managers and data authors to ensure high quality and wide reusability of the published data. This traditional data publication procedure for LTA data bearing DOIs is very time consuming, especially for WDCC's high data volumes of climate model data on the order of multiple TBytes. Data is shared with project members and selected scientists months before the data is long-term archived. The scientific community analyses, and thus reviews, the data, leading to data quality improvements. Scientists wish to cite these unstable data in scientific publications before the long-term archiving and the thorough data review process are finalized. A concept for early preprint DOIs for shared but not yet long-term archived data is presented. Requirements on data documentation, persistence, and quality are discussed, along with use cases for preprint DOIs within the data life-cycle, questions of how to document the differences between the two DOI types and how to relate them to each other, and the recommendation to use LTA DOIs in citations. WDCC wants to offer an additional user service for early citation of data of basic quality without compromising the LTA DOIs, i.e., WDCC's standard DOIs, as a trustworthy indicator of high-quality data. Referencing Links: World Data Center for Climate (WDCC): http://www.wdc-climate.de German Climate Computing Center (DKRZ): http://www.dkrz.de DataCite: http://datacite.org
A novel framework for objective detection and tracking of TC center from noisy satellite imagery
NASA Astrophysics Data System (ADS)
Johnson, Bibin; Thomas, Sachin; Rani, J. Sheeba
2018-07-01
This paper proposes a novel framework for automatically determining and tracking the center of a tropical cyclone (TC) during its entire life-cycle from the thermal infrared (TIR) channel data of a geostationary satellite. The proposed method handles meteorological images with noise and missing or partial information due to seasonal variability and a lack of significant spatial or vortex features. To retrieve the cyclone center under these circumstances, a synergistic approach based on objective measures and a Numerical Weather Prediction (NWP) model is proposed. The method employs a spatial gradient scheme to process missing and noisy frames, or a spatio-temporal gradient scheme for image sequences that are continuous and contain less noise. The initial estimate of the TC center from imagery with missing data is corrected by exploiting an NWP-model-based post-processing scheme. The validity of the framework is tested on infrared images of different cyclones obtained from various geostationary satellites such as Meteosat-7, INSAT-3D, and Kalpana-1. The computed track is compared with the actual track data obtained from the Joint Typhoon Warning Center (JTWC), and it shows a reduction of mean track error by 11% compared to other state-of-the-art methods in the presence of missing and noisy frames. The proposed method is also successfully tested for simultaneous retrieval of TC centers from images containing multiple non-overlapping cyclones.
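The NWP-aided retrieval itself is beyond an abstract; the toy example below only illustrates the spatial-gradient idea on a synthetic brightness-temperature field (every name and number is invented, and real imagery would need the paper's noise and missing-data handling):

    import numpy as np

    def estimate_tc_center(tb):
        """Crude TC-center estimate: cold eyewall cloud tops surround a warmer
        eye, so look for the warm pixel embedded in the region of strongest
        brightness-temperature gradients."""
        gy, gx = np.gradient(tb.astype(float))
        grad_mag = np.hypot(gx, gy)
        # Restrict the search to the high-gradient (eyewall-like) region.
        rows, cols = np.nonzero(grad_mag > np.percentile(grad_mag, 95))
        # The warmest pixel inside that region approximates the eye.
        idx = np.argmax(tb[rows, cols])
        return rows[idx], cols[idx]

    # Synthetic scene: warm eye (280 K) inside a cold cloud shield (200 K).
    y, x = np.mgrid[0:128, 0:128]
    r = np.hypot(x - 64, y - 64)
    tb = np.where(r < 5, 280.0, np.where(r < 30, 200.0, 260.0))
    print(estimate_tc_center(tb))  # close to the true center (64, 64)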
NASA's Long-Term Archive (LTA) of ICESat Data at the National Snow and Ice Data Center (NSIDC)
NASA Astrophysics Data System (ADS)
Fowler, D. K.; Moses, J. F.; Dimarzio, J. P.; Webster, D.
2011-12-01
Data stewardship, preservation, and reproducibility are becoming principal parts of a data manager's work. In an era of distributed data and information systems, where the host location ought to be transparent to the internet user, it is of vital importance that organizations make a commitment to both the current and long-term goals of data management and the preservation of scientific data. NASA's EOS Data and Information System (EOSDIS) is a distributed system of discipline-specific archives and mission-specific science data processing facilities. Satellite missions and instruments go through a lifecycle that involves pre-launch calibration, on-orbit data acquisition and product generation, and final reprocessing. Data products and descriptions flow to the archives for distribution on a regular basis during the active part of the mission. However, additional information from the product generation and science teams is needed to ensure the observations will be useful for long-term climate studies. Examples include ancillary input datasets, product generation software, and the production history developed by the team during the course of product generation. These data and information will need to be archived after product data processing is completed. Using inputs from the USGCRP Workshop on Long Term Archive Requirements (1998), discussions with EOS instrument teams, and input from the 2011 ESIP Federation meeting, NASA is developing a set of Earth science data and information content requirements for long-term preservation that will ultimately be used for all the EOS missions as they come to completion. Since the ICESat/GLAS mission is one of the first to come to an end, NASA and NSIDC are preparing for long-term support of the ICESat mission data now. For a long-term archive, it is imperative that there is sufficient information about how products were prepared in order to convince future researchers that the scientific results are accurate, understandable, usable, and reproducible. Our experience suggests data centers know what to preserve in most cases, i.e., the processing algorithms along with the Level 0 or Level 1a input and ancillary products used to create the higher-level products will be archived and made available to users. In other cases the data centers must seek guidance from the science team, e.g., for pre-launch, calibration/validation, and test data. All these data are an important part of product provenance, contributing to and helping establish the integrity of the scientific observations for long-term climate studies. In this presentation we will describe the application of information content requirements, guidance from the ICESat/GLAS Science Team, and the flow of additional information from the ICESat Science Team and Science Investigator-led Processing System to the Distributed Active Archive Center.
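A hedged sketch of the kind of per-product provenance record such an archive might retain, reflecting the content categories named above (processing software, Level 0/1a inputs, ancillary data, production history). The field names and example values are illustrative, not NSIDC's actual schema.

```python
# Illustrative provenance record for long-term archiving; field names and
# example values are hypothetical, not NSIDC's schema.
from dataclasses import dataclass, field

@dataclass
class ProvenanceRecord:
    product_id: str
    algorithm_name: str
    algorithm_version: str
    level0_inputs: list[str] = field(default_factory=list)
    ancillary_inputs: list[str] = field(default_factory=list)
    prelaunch_calibration: list[str] = field(default_factory=list)
    production_history: str = ""   # e.g., reprocessing campaign notes

record = ProvenanceRecord(
    product_id="GLA06_r33_example",          # hypothetical product
    algorithm_name="GLAS elevation",
    algorithm_version="v33",
    level0_inputs=["GLA00_example.dat"],
)
```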
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.
1996-01-01
As part of a continuing effort to re-engineer the wind tunnel testing process, a comprehensive data quality assurance program is being established at NASA Langley Research Center (LaRC). The ultimate goal of the program is routine provision of tunnel-to-tunnel reproducibility with total uncertainty levels acceptable for test and evaluation of civilian transports. The operational elements for reaching such levels of reproducibility are: (1) statistical control, which provides long-term measurement uncertainty predictability and a base for continuous improvement, (2) measurement uncertainty prediction, which provides test designs that can meet data quality expectations with the system's predictable variation, and (3) national standards, which provide a means for resolving tunnel-to-tunnel differences. The paper presents the LaRC design for the program and discusses the process of implementation.
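The statistical-control element can be illustrated with a classic Shewhart chart: check-standard measurements are flagged when they fall outside 3-sigma limits derived from in-control history. This is a minimal sketch with made-up numbers, not LaRC's actual procedure.

```python
# Minimal Shewhart-style control check: a new measurement is "in control" if it
# falls inside the 3-sigma band derived from historical in-control data.
# The numbers are fabricated for illustration.
import numpy as np

def control_limits(history: np.ndarray) -> tuple[float, float, float]:
    """Lower limit, center line, and upper limit from in-control history."""
    center = history.mean()
    sigma = history.std(ddof=1)
    return center - 3 * sigma, center, center + 3 * sigma

history = np.array([0.512, 0.508, 0.511, 0.509, 0.510, 0.513])  # made-up values
lo, center, hi = control_limits(history)
new_point = 0.530
in_control = lo <= new_point <= hi   # False here: investigate before testing
```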
Harrison Ford Tapes Climate Change Show at Ames (Reporter Package)
2014-04-11
Hollywood legend Harrison Ford made a special visit to NASA's Ames Research Center to shoot an episode for a new documentary series about climate change called 'Years of Living Dangerously.' After being greeted by Center Director Pete Worden, Ford was filmed meeting with NASA climate scientists and discussing global temperature prediction data processed using one of the world's fastest supercomputers at Ames. Later he flew in the co-pilot seat of a jet used to gather data for NASA air quality studies.
NASA Astrophysics Data System (ADS)
Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.
2016-12-01
Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. It will be deployed on the technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, and their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. Additionally, export of data processing results through WMS and WFS services will be used to provide their interoperability. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.
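Interoperability via WMS means any standard client can pull rendered maps from the DRC. The sketch below issues a standard OGC WMS 1.3.0 GetMap request; the endpoint URL, layer name, and bounding box are placeholders.

```python
# Fetching a rendered map via a standard OGC WMS 1.3.0 GetMap request.
# The endpoint and layer name are hypothetical placeholders.
import requests

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "example:temperature_anomaly",   # hypothetical layer
    "styles": "",
    "crs": "EPSG:4326",
    # WMS 1.3.0 + EPSG:4326 uses lat,lon axis order: minlat,minlon,maxlat,maxlon
    "bbox": "50,60,80,180",
    "width": 800,
    "height": 400,
    "format": "image/png",
}
resp = requests.get("https://drc.example.org/wms", params=params, timeout=60)
resp.raise_for_status()
with open("anomaly.png", "wb") as f:
    f.write(resp.content)
```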
NASA Technical Reports Server (NTRS)
1976-01-01
Babcock & Wilcox Co., under a partnership with Marshall Space Flight Center, produced composite materials, originally developed for the shuttle program, for improving golf clubs. The company used Marshall Space Flight Center's data summary file summarizing typical processing techniques and mechanical and physical properties of graphite- and boron-reinforced composite materials. Reinforced composites provide a combination of shaft rigidity and flexibility that provides maximum distance.
ERIC Educational Resources Information Center
Zimmerman, Heather Toomey; McClain, Lucy Richardson
2014-01-01
Using a sociocultural framework to approach intergenerational learning, this inquiry examines learning processes used by families during visits to one nature center. Data were collected from videotaped observations of families participating in an environmental education program and a follow-up task to draw the habitat of raptors. Based on a…
ERIC Educational Resources Information Center
Round, Jennifer E.; Campbell, A. Malcolm
2013-01-01
The ability to interpret experimental data is essential to understanding and participating in the process of scientific discovery. Reading primary research articles can be a frustrating experience for undergraduate biology students because they have very little experience interpreting data. To enhance their data interpretation skills, students…
Real Time Data for Seismology at the IRIS Data Management Center, AN Nsf-Sponsored Facility
NASA Astrophysics Data System (ADS)
Benson, R. B.; Ahern, T. K.; Trabant, C.; Weertman, B. R.; Casey, R.; Stromme, S.; Karstens, R.
2012-12-01
When IRIS was incorporated in 1984, it committed to providing long-term support for the science of seismology. It first upgraded analog networks by installing observatory-grade digital seismic recording equipment (by constructing the Global Seismic Network to upgrade the World Wide Standardized Seismographic Network) that became the backbone of the International Federation of Digital Seismic Networks (FDSN), and in 1990 constructed a state-of-the-art data center that would allow free and open access to data for everyone. For the first decade, IRIS leveraged a complicated system of telemetry which laid the foundation for delivering (relatively) high-rate and continuous seismic time series data to the IRIS Data Management Center, which was designed to accept data that arrived with highly variable latencies and on many media formats. This meant that science often had to wait until data became complete, which at the time was primarily related to studying earthquakes or similar events. During the 1990s, numerous incremental but small improvements were made to get data into the hands of users with less latency, leveraging dialup, satellite telemetry, and a variety of Internet protocols. But beginning in 2000, the IRIS Data Management Center began the process of accumulating data comprehensively in real time. It was first justified because it eliminated the time-consuming transcription and manual data handling on various media formats, like magnetic tapes, CDs, and DVDs. However, the switch to real-time telemetry proved to be a major improvement technologically because it not only simplified data transfer, it opened access to a large volume of previously inaccessible data (local resource limitations), and many networks began willingly providing their geophysical data to the broad research community. It also gave researchers the ability to process data in different and streamlined ways, by incorporating data directly into workflows and processing packages. Any network on the Internet, small or large, can now share data, and today, the IRIS DMC receives nearly all of its seismic data from regional and international networks in real time. We will show that this evolution to managing real-time data has provided the framework for accomplishing many important benefits that illustrate that open, real-time data should be the goal of every observatory operation and can provide: - faster (and therefore cost- and data-saving) quality control; - data products that highlight source properties and provide teachable moments; - data delivery to regional or national networks around the globe for immediate monitoring access; - use in teaching the public, providing streaming data to museums, schools, etc.
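The kind of direct, real-time access described here can be illustrated with ObsPy's FDSN web-service client, which pulls recent waveform data from the IRIS DMC straight into a script; the station and channel choices below are just examples.

```python
# Pull the last ten minutes of waveform data from the IRIS DMC into a script
# via ObsPy's FDSN client. Station/channel choices (IU.ANMO.00.BHZ) are
# examples of an openly shared GSN station.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t_end = UTCDateTime()                 # now
t_start = t_end - 10 * 60             # ten minutes ago
st = client.get_waveforms(network="IU", station="ANMO", location="00",
                          channel="BHZ", starttime=t_start, endtime=t_end)
st.plot()                             # immediate look at the near-real-time trace
```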
Test/score/report: Simulation techniques for automating the test process
NASA Technical Reports Server (NTRS)
Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.
1994-01-01
A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, the tasks of acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and is currently being used to test some POCC/MOC software deliveries. When this capability is fully operational, it should greatly reduce the time necessary to test a POCC/MOC software delivery, as well as improve the quality of the test process.
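The plug-in, press-start, score-and-report ideal reduces to a simple loop: run each step of a test procedure, accumulate pass/fail results, and emit a score with a go/no-go recommendation. The sketch below is illustrative only; the step names and threshold are hypothetical, not TASS's.

```python
# Illustrative Test/Score/Report loop: execute each test step, accumulate
# results, score, and recommend go/no-go. Step names and threshold are
# hypothetical.
from typing import Callable

def test_score_report(steps: dict[str, Callable[[], bool]],
                      go_threshold: float = 0.95) -> dict:
    results = {name: step() for name, step in steps.items()}
    score = sum(results.values()) / len(results)
    return {
        "results": results,
        "score": score,
        "recommendation": "GO" if score >= go_threshold else "NO-GO",
    }

report = test_score_report({
    "telemetry_decommutation": lambda: True,
    "command_processing": lambda: True,
    "memory_load_dump": lambda: False,   # one failing area forces NO-GO here
})
```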
Quality Measures for Hospice and Palliative Care: Piloting the PEACE Measures
Rokoske, Franziska S.; Durham, Danielle; Cagle, John G.; Hanson, Laura C.
2014-01-01
Abstract Background: The Carolinas Center for Medical Excellence launched the PEACE project in 2006, under contract with the Centers for Medicare & Medicaid Services (CMS), to identify, develop, and pilot test quality measures for hospice and palliative care programs. Objectives: The project collected pilot data to test the usability and feasibility of potential quality measures and data collection processes for hospice and palliative care programs. Settings/subjects: Twenty-two hospices participating in a national Quality Improvement Collaborative (QIC) submitted data from 367 chart reviews for pain care and 45 chart reviews for nausea care. Fourteen additional hospices completed a one-time data submission of 126 chart reviews on 60 potential patient-level quality measures across eight domains of care and an organizational assessment evaluating structure and processes of care. Design: Usability was assessed by examining the range, variability and size of the populations targeted by each quality measure. Feasibility was assessed during the second pilot study by surveying data abstractors about the abstraction process and examining the rates of missing data. The impact of data collection processes was assessed by comparing results obtained using different processes. Results: Measures shown to be both usable and feasible included: screening for physical symptoms on admission and documentation of treatment preferences. Methods of data collection and measure construction appear to influence observed rates of quality of care. Conclusions: We successfully identified quality measures with potential for use in hospices and palliative care programs. Future research is needed to understand whether these measures are sensitive to quality improvement interventions. PMID:24921162
Data systems for science integration within the Atmospheric Radiation Measurement Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gracio, D.K.; Hatfield, L.D.; Yates, K.R.
The Atmospheric Radiation Measurement (ARM) Program was developed by the US Department of Energy to support the goals and mission of the US Global Change Research Program. The purpose of the ARM program is to improve the predictive capabilities of General Circulation Models (GCMs) in their treatment of clouds and radiative transfer effects. Three experimental testbeds were designed for the deployment of instruments to collect atmospheric data used to drive the GCMs. Each site, known as a Cloud and Radiation Testbed (CART), consists of a highly available, redundant data system for the collection of data from a variety of instrumentation. The first CART site was deployed in April 1992 in the Southern Great Plains (SGP), Lamont, Oklahoma, with the other two sites to follow in early 1996 in the Tropical Western Pacific (TWP) and in 1997 on the North Slope of Alaska (NSA). Approximately 1.5 GB of data are transferred per day via the Internet from the CART sites and external data sources to the ARM Experiment Center (EC) at Pacific Northwest Laboratory in Richland, Washington. The Experiment Center is central to the ARM data path and provides for the collection, processing, analysis, and delivery of ARM data. Data from the CART sites, from a variety of instrumentation and observational systems, and from external data sources are transferred to the Experiment Center. The EC processes these data streams on a continuous basis to provide derived data products to the ARM Science Team in near real time while maintaining a three-month running archive of data.
ERIC Educational Resources Information Center
Allan, Blaine W., Comp.
The procedures, forms, and philosophy of the computerized modular scheduling program developed at Virgin Valley High School are outlined. The modular concept is developed as a new approach to course structure, with explanations, examples, and worksheets included. Examples of courses of study, input information for the data processing center, output…
Artificial intelligence issues related to automated computing operations
NASA Technical Reports Server (NTRS)
Hornfeck, William A.
1989-01-01
Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and issues related to applying AI to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.
Rehabilitation centers in change: participatory methods for managing redesign and renovation.
Lahtinen, Marjaana; Nenonen, Suvi; Rasila, Heidi; Lehtelä, Jouni; Ruohomäki, Virpi; Reijula, Kari
2014-01-01
The aim of this article is to describe a set of participatory methods that we have either developed or modified for developing future work and service environments to better suit renewable rehabilitation processes. We discuss the methods in the larger framework of a change process model and participatory design. Rehabilitation organizations are currently in transition; the customer groups, financing, services, and processes of rehabilitation centers are changing. The pressure for change challenges the centers to develop both their processes and facilities. There is a need for methods that support change management. Four participatory methods were developed: a future workshop, a change survey, a multi-method assessment tool, and participatory design generator cards. They were tested and evaluated in three rehabilitation centers at different phases of their change processes. The developed methods were considered useful in creating a mutual understanding of the change goals between different stakeholders, providing a good picture of the work community's attitudes toward the change, forming an integrated overview of the built and perceived environment, inspiring new solutions, and supporting the management in steering the change process. The change process model described in this article serves as a practical framework that combines the viewpoints of organizational and facility development. However, participatory design continues to face challenges concerning communication between different stakeholders, and further development of the methods and processes is still needed. Intervention studies could provide data on the success factors that enhance transformations in the rehabilitation sector. Keywords: design process, methodology, organizational transformation, planning, renovation.
2017-04-04
NASA Armstrong’s Mission Control Center, or MCC, is where the culmination of all data gathering occurs. Engineers, flight controllers, and researchers monitor flights and missions as they are carried out. Data and video run through the MCC and are recorded, displayed, and archived. Data is then processed and prepared for post-flight analysis.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... in excess of the micro-purchase threshold to the Federal Procurement Data Center which collects, processes, and disseminates official statistical data on Federal contracting. Contracting officers insert...) Number, in solicitations they expect will result in contracts in excess of the micro-purchase threshold...
Interhospital transfer handoff practices among US tertiary care centers: A descriptive survey.
Herrigel, Dana J; Carroll, Madeline; Fanning, Christine; Steinberg, Michael B; Parikh, Amay; Usher, Michael
2016-06-01
Interhospital transfer is an understudied area within transitions of care. The process by which hospitals accept and transfer patients is not well described. National trends and best practices are unclear. To describe the demographics of large transfer centers, to identify common handoff practices, and to describe challenges and notable innovations involving the interhospital transfer handoff process. A convenience sample of 32 tertiary care centers in the United States was studied. Respondents were typically transfer center directors surveyed by phone. Data regarding transfer center demographics, handoff communication practices, electronic infrastructure, and data sharing were obtained. The median number of patients transferred each month per receiving institution was 700 (range, 250-2500); on average, 28% of these patients were transferred to an intensive care unit. Transfer protocols and practices varied by institution. Transfer center coordinators typically had a medical background (78%), and critical care-trained registered nurse was the most prevalent (38%). Common practices included: mandatory recorded 3-way physician-to-physician conversation (84%) and mandatory clinical status updates prior to patient arrival (81%). However, the timeline of clinical status updates was variable. Less frequent transfer practices included: electronic medical record (EMR) cross-talk availability and utilization (23%), real-time transfer center documentation accessibility in the EMR (32%), and referring center clinical documentation available prior to transport (29%). A number of innovative strategies to address challenges involving interhospital handoffs are reported. Interhospital transfer practices vary widely amongst tertiary care centers. Practices that lead to improved patient handoffs and reduced medical errors need additional prospective evaluation. Journal of Hospital Medicine 2016;11:413-417. © 2016 Society of Hospital Medicine.
The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center
NASA Astrophysics Data System (ADS)
Zeigler, R. A.; Blumenfeld, E. H.; Srinivasan, P.; McCubbin, F. M.; Evans, C. A.
2018-04-01
The Astromaterials Curation Office has recently begun incorporating X-ray CT data into the curation processes for lunar and meteorite samples, and long-term curation of that data and serving it to the public represent significant technical challenges.
Supporting flight data analysis for Space Shuttle Orbiter Experiments at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Green, M. J.; Budnick, M. P.; Yang, L.; Chiasson, M. P.
1983-01-01
The Space Shuttle Orbiter Experiments program is responsible for collecting flight data to extend the research and technology base for future aerospace vehicle design. The Infrared Imagery of Shuttle (IRIS), Catalytic Surface Effects, and Tile Gap Heating experiments sponsored by Ames Research Center are part of this program. The paper describes the software required to process the flight data which support these experiments. In addition, data analysis techniques, developed in support of the IRIS experiment, are discussed. Using the flight data base, the techniques have provided information useful in analyzing and correcting problems with the experiment, and in interpreting the IRIS image obtained during the entry of the third Shuttle mission.
Aircraft scanner data availability via the version 0 Information Management System
NASA Technical Reports Server (NTRS)
Mah, G. R.
1995-01-01
As part of the Earth Observing System Data and Information System (EOSDIS) development, NASA and other government agencies have developed an operational prototype of the Information Management System (IMS). The IMS provides access to the data archived at the Distributed Active Archive Centers (DAACs) and allows users to search through metadata describing the (image) data. Criteria based on sensor name or type, date and time, and geographic location are used to search the archive. Graphical representations of coverage and browse images are available to further refine a user's selection. Previously, the EROS Data Center (EDC) DAAC had identified the Advanced Solid-state Array Spectrometer (ASAS), Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), NS-001, and Thermal Infrared Multispectral Scanner (TIMS) as precursor data sets similar to those the DAAC will handle in the Earth Observing System era. Currently, the EDC DAAC staff, in cooperation with NASA, has transcribed TIMS, NS-001, and Thematic Mapper Simulator (TMS) data from Ames Research Center and also TIMS data from Stennis Space Center. During the transcription process, the IMS metadata and browse images were created to populate the inventory at the EDC DAAC. These data sets are now available in the IMS and may be requested from any of the DAACs via the IMS.
SeaWiFS Science Algorithm Flow Chart
NASA Technical Reports Server (NTRS)
Darzi, Michael
1998-01-01
This flow chart describes the baseline science algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Data Processing System (SDPS). As such, it includes only processing steps used in the generation of the operational products that are archived by NASA's Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC). It is meant to provide the reader with a basic understanding of the scientific algorithm steps applied to SeaWiFS data. It does not include non-science steps, such as format conversions, and places the greatest emphasis on the geophysical calculations of the level-2 processing. Finally, the flow chart reflects the logic sequences and the conditional tests of the software so that it may be used to evaluate the fidelity of the implementation of the scientific algorithm. In many cases however, the chart may deviate from the details of the software implementation so as to simplify the presentation.
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, technicians monitor equipment during testing of the Ares I-X Roll Control System, or RoCS. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, elements of the Ares I-X Roll Control System, or RoCS, will undergo testing. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, technicians monitor equipment during testing of the Ares I-X Roll Control System, or RoCS. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, a technician adjusts equipment during testing of the Ares I-X Roll Control System, or RoCS. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, a technician (right) adjusts equipment during testing of the Ares I-X Roll Control System, or RoCS. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, technicians monitor equipment during testing of the Ares I-X Roll Control System, or RoCS. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, a technician monitors equipment during testing of the Ares I-X Roll Control System, or RoCS. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, a technician adjusts equipment during testing of the Ares I-X Roll Control System, or RoCS. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
2008-01-24
KENNEDY SPACE CENTER, FLA. -- In the hypergolic maintenance facility at NASA's Kennedy Space Center, technicians get ready to begin testing elements of the Ares I-X Roll Control System, or RoCS. The RoCS Servicing Simulation Test is to gather data that will be used to help certify the ground support equipment design and validate the servicing requirements and processes. The RoCS is part of the Interstage structure, the lowest axial segment of the Upper Stage Simulator. In an effort to reduce costs and meet the schedule, most of the ground support equipment that will be used for the RoCS servicing is of space shuttle heritage. This high-fidelity servicing simulation will provide confidence that servicing requirements can be met with the heritage system. At the same time, the test will gather process data that will be used to modify or refine the equipment and processes to be used for the actual flight element. Photo credit: NASA/Kim Shiflett
Welding process modelling and control
NASA Technical Reports Server (NTRS)
Romine, Peter L.; Adenwala, Jinen A.
1993-01-01
The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not intended that this system be used for welding process control.
Doppler Radar Profiler for Launch Winds at the Kennedy Space Center (Phase 1a)
NASA Technical Reports Server (NTRS)
Murri, Daniel G.
2011-01-01
The NASA Engineering and Safety Center (NESC) received a request from the NASA Technical Fellow for Flight Mechanics at Langley Research Center (LaRC) to develop a database from multiple Doppler radar wind profiler (DRWP) sources and to develop data processing algorithms to construct high-temporal-resolution DRWP wind profiles for day-of-launch (DOL) vehicle assessment. This document contains the outcome of Phase 1a of the assessment, including Findings, Observations, NESC Recommendations, and Lessons Learned.
Monitoring volcanic threats using ASTER satellite data
Duda, K.A.; Wessels, R.; Ramsey, M.; Dehn, J.
2008-01-01
This document summarizes ongoing activities associated with a research project funded by the National Aeronautics and Space Administration (NASA) focusing on volcanic change detection through the use of satellite imagery. This work includes systems development as well as improvements in data analysis methods. Participating organizations include the NASA Land Processes Distributed Active Archive Center (LP DAAC) at the U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS), the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Science Team, the Alaska Volcano Observatory (AVO) at the USGS Alaska Science Center, the Jet Propulsion Laboratory/California Institute of Technology (JPL/CalTech), the University of Pittsburgh, and the University of Alaska Fairbanks. © 2007 IEEE.
NASA Technical Reports Server (NTRS)
1993-01-01
MAST is a decision support system to help in the management of dairy herds. Data is collected on dairy herds around the country and processed at regional centers. One center is Cornell University, where Dr. Lawrence Jones and his team developed MAST. The system draws conclusions from the data and summarizes it graphically. CLIPS, which is embedded in MAST, gives the system the ability to make decisions without user interaction. With this technique, dairy managers can identify herd problems quickly, resulting in improved animal health and higher milk quality. CLIPS (C Language Integrated Production System) was developed by NASA's Johnson Space Center. It is a shell for developing expert systems designed to permit research, development and delivery on conventional computers.
The Application of Security Concepts to the Personnel Database for the Indonesian Navy.
1983-09-01
Postgraduate School, Monterey, California, June 1982. Since 1977, the Indonesian Navy Data Center (DISPULAHTAL) has collected and processed personnel data to support personnel data processing in the Indonesian Navy. [The remainder of this excerpt is OCR-garbled; it describes the present personnel database system, including concurrent multi-user, multi-level processing with user security levels of secret, classified, and unclassified.]
Data Integration Tool: Permafrost Data Debugging
NASA Astrophysics Data System (ADS)
Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Pulsifer, P. L.; Strawhacker, C.; Yarmey, L.; Basak, R.
2017-12-01
We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the Global Terrestrial Network-Permafrost (GTN-P). The United States National Science Foundation funded this project through the National Snow and Ice Data Center (NSIDC) with the GTN-P to improve permafrost data access and discovery. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps, or operations, called widgets (https://github.com/PermaData/DIT). Each widget does a specific operation, such as read, multiply by a constant, sort, plot, or write data. DIT allows the user to select and order the widgets as desired to meet their specific needs, incrementally interact with and evolve the widget workflows, and save those workflows for reproducibility. Taking ideas from visual programming found in the art and design domain, debugging and iterative design principles from software engineering, and the scientific data processing and analysis power of Fortran and Python, it was written for interactive, iterative data manipulation, quality control, processing, and analysis of inconsistent data in an easily installable application. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (~1,000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
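A minimal sketch of the widget idea under stated assumptions: each widget is a single operation, and a workflow is an ordered list of widgets applied to the data. This mirrors the description above rather than DIT's actual API (see the GitHub repository for the real implementation).

```python
# Sketch of a widget-style workflow: each widget is one operation; a saved
# workflow is just the ordered list of widgets. Mirrors the description above,
# not DIT's real API.
from typing import Callable, Iterable

Widget = Callable[[list[float]], list[float]]

def multiply_by(c: float) -> Widget:
    """Widget factory: scale every value by a constant."""
    return lambda data: [x * c for x in data]

def sort_widget(data: list[float]) -> list[float]:
    """Widget: sort values for quality-control inspection."""
    return sorted(data)

def run_workflow(data: list[float], widgets: Iterable[Widget]) -> list[float]:
    for widget in widgets:          # widgets run in user-chosen order
        data = widget(data)
    return data

# e.g., convert depths from cm to m, then sort
clean = run_workflow([132.0, 25.0, 48.0], [multiply_by(0.01), sort_widget])
```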
NASA Technical Reports Server (NTRS)
2008-01-01
This audit was initiated in response to a hotline complaint regarding the review, approval, and release of scientific and technical information (STI) at Johnson Space Center. The complainant alleged that Johnson personnel conducting export control reviews of STI were not fully qualified to conduct those reviews and that the reviews often did not occur until after the STI had been publicly released. NASA guidance requires that STI, defined as the results of basic and applied scientific, technical, and related engineering research and development, undergo certain reviews prior to being released outside of NASA or to audiences that include foreign nationals. The process includes technical, national security, export control, copyright, and trade secret (e.g., proprietary data) reviews. The review process was designed to preclude the inappropriate dissemination of sensitive information while ensuring that NASA complies with a requirement of the National Aeronautics and Space Act of 1958 (the Space Act) to provide for the widest practicable and appropriate dissemination of information resulting from NASA research activities. We focused our audit on evaluating the STI review process: specifically, determining whether the roles and responsibilities for the review, approval, and release of STI were adequately defined and documented in NASA and Center-level guidance and whether that guidance was effectively implemented at Goddard Space Flight Center, Johnson Space Center, Langley Research Center, and Marshall Space Flight Center. Johnson was included in the review because it was the source of the initial complaint, and Goddard, Langley, and Marshall were included because those Centers consistently produce significant amounts of STI.
NASA Astrophysics Data System (ADS)
Zachary, Wayne; Eggleston, Robert; Donmoyer, Jason; Schremmer, Serge
2003-09-01
Decision-making is strongly shaped and influenced by the work context in which decisions are embedded. This suggests that decision support needs to be anchored by a model (implicit or explicit) of the work process, in contrast to traditional approaches that anchor decision support to either context-free decision models (e.g., utility theory) or detailed models of the external (e.g., battlespace) environment. An architecture for cognitively based, work-centered decision support, called the Work-centered Informediary Layer (WIL), is presented. WIL separates decision support into three overall processes: building and dynamically maintaining an explicit context model, using the context model to identify opportunities for decision support, and tailoring generic decision-support strategies to the current context and offering them to the system user/decision-maker. The generic decision-support strategies include such things as activity/attention aiding, decision-process structuring, work performance support (selective, contextual automation), explanation/elaboration, infosphere data retrieval, and what-if/action-projection and visualization. A WIL-based application is a work-centered decision-support layer that provides active support without intent inferencing, and that is cognitively based without requiring classical cognitive task analyses. Example WIL applications are detailed and discussed.
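The three-process split can be sketched schematically: a context model updated from work events, an opportunity detector driven by it, and a tailoring step that specializes a generic strategy before offering it. All class and field names below are illustrative, not the WIL codebase.

```python
# Schematic sketch (not the WIL codebase) of the three processes: maintain a
# context model, detect support opportunities from it, tailor a generic
# strategy to the current context. Names and thresholds are illustrative.
class ContextModel:
    def __init__(self):
        self.state: dict = {}

    def update(self, event: dict) -> None:
        self.state.update(event)        # dynamically maintained work context

def find_opportunities(ctx: ContextModel) -> list[str]:
    ops = []
    if ctx.state.get("workload", 0) > 0.8:
        ops.append("activity/attention aiding")
    if ctx.state.get("pending_decision"):
        ops.append("decision process structuring")
    return ops

def tailor_strategy(strategy: str, ctx: ContextModel) -> str:
    # Generic strategy specialized by current context before being offered.
    return f"{strategy} for task '{ctx.state.get('task', 'unknown')}'"

ctx = ContextModel()
ctx.update({"task": "route planning", "workload": 0.9, "pending_decision": True})
offers = [tailor_strategy(s, ctx) for s in find_opportunities(ctx)]
```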
ObspyDMT: a Python toolbox for retrieving and processing large seismological data sets
NASA Astrophysics Data System (ADS)
Hosseini, Kasra; Sigloch, Karin
2017-10-01
We present obspyDMT, a free, open-source software toolbox for the query, retrieval, processing and management of seismological data sets, including very large, heterogeneous and/or dynamically growing ones. ObspyDMT simplifies and speeds up user interaction with data centers, in more versatile ways than existing tools. The user is shielded from the complexities of interacting with different data centers and data exchange protocols and is provided with powerful diagnostic and plotting tools to check the retrieved data and metadata. While primarily a productivity tool for research seismologists and observatories, easy-to-use syntax and plotting functionality also make obspyDMT an effective teaching aid. Written in the Python programming language, it can be used as a stand-alone command-line tool (requiring no knowledge of Python) or can be integrated as a module with other Python codes. It facilitates data archiving, preprocessing, instrument correction and quality control - routine but nontrivial tasks that can consume much user time. We describe obspyDMT's functionality, design and technical implementation, accompanied by an overview of its use cases. As an example of a typical problem encountered in seismogram preprocessing, we show how to check for inconsistencies in response files of two example stations. We also demonstrate the fully automated request, remote computation and retrieval of synthetic seismograms from the Synthetics Engine (Syngine) web service of the Data Management Center (DMC) at the Incorporated Research Institutions for Seismology (IRIS).
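obspyDMT automates the request-retrieve-correct cycle in bulk; a raw-ObsPy sketch of that cycle for a single station looks like the following (the event and station choices are examples, not a prescribed workflow).

```python
# The request-retrieve-correct cycle obspyDMT automates, shown manually with
# raw ObsPy for one station around the 2011 Tohoku earthquake (example only).
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t0 = UTCDateTime("2011-03-11T05:46:24")          # Tohoku origin time
st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)
inv = client.get_stations(network="IU", station="ANMO", location="00",
                          channel="BHZ", starttime=t0, endtime=t0 + 3600,
                          level="response")
st.remove_response(inventory=inv, output="VEL")  # instrument correction
st.write("ANMO_corrected.mseed", format="MSEED")
```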
Internet-based monitoring and benchmarking in ambulatory surgery centers.
Bovbjerg, V E; Olchanski, V; Zimberg, S E; Green, J S; Rossiter, L F
2000-08-01
Each year the number of surgical procedures performed on an outpatient basis increases, yet relatively little is known about assessing and improving quality of care in ambulatory surgery. Conventional methods for evaluating outcomes, which are based on assessment of inpatient services, are inadequate in the rapidly changing, geographically dispersed field of ambulatory surgery. Internet-based systems for improving outcomes and establishing benchmarks may be feasible and timely. Eleven freestanding ambulatory surgery centers (ASCs) reported process and outcome data for 3,966 outpatient surgical procedures to an outcomes monitoring system (OMS) during a demonstration period from April 1997 to April 1999. ASCs downloaded software and protocol manuals from the OMS Web site. Centers securely submitted clinical information on perioperative process and outcome measures and postoperative patient telephone interviews. Feedback to centers ranged from current and historical rates of surgical and postsurgical complications to patient satisfaction and the adequacy of postsurgical pain relief. ASCs were able to successfully implement the data collection protocols and transmit data to the OMS. Data security efforts were successful in preventing the transmission of patient identifiers. Feedback reports to ASCs were used to institute changes in ASC staffing, patient care, and patient education, as well as for accreditation and marketing. The demonstration also pointed out shortcomings in the OMS, such as the need to simplify hardware and software installation as well as data collection and transfer methods, which have been addressed in subsequent OMS versions. Internet-based benchmarking for geographically dispersed outpatient health care facilities, such as ASCs, is feasible and likely to play a major role in this effort.
Passive fire building protection system evaluation (case study: millennium ict centre)
NASA Astrophysics Data System (ADS)
Rahman, Vinky; Stephanie
2018-03-01
A passive fire protection system is a system embodied in the building design, in terms of both architecture and structure. This system usually consists of structural protection that protects the structure of the building, prevents the spread of fire, and facilitates the evacuation process in case of fire. Millennium ICT Center is the largest electronics shopping center in Medan, Indonesia. As a public building that accommodates crowds, this building needs a fire protection system that complies with the standards. Therefore, the purpose of this study is to evaluate the passive fire protection system of the Millennium ICT Center building. The study was conducted by describing the facts of the building as well as through direct observation at the research location. The collected data is then processed using the AHP (Analytical Hierarchy Process) method in its weighting stage to obtain the reliability value of the passive fire protection system. The results showed that some components of a passive fire protection system are present in the building, but some are still unqualified.
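The AHP weighting step can be made concrete: weights come from the principal eigenvector of a pairwise-comparison matrix, with a consistency ratio guarding against incoherent judgments. The 3x3 matrix below is fabricated for illustration and does not reproduce the study's criteria or values.

```python
# Worked AHP weighting sketch: criterion weights from the principal
# eigenvector of a pairwise-comparison matrix, plus Saaty's consistency
# ratio. The matrix values are made up for illustration.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # e.g., structure vs. compartmentation vs. egress
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                    # criterion weights, sum to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index for size n
cr = ci / ri                             # judgments acceptably consistent if CR < 0.10
```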
Diabetes and Hypertension Quality Measurement in Four Safety-Net Sites
Benkert, R.; Dennehy, P.; White, J.; Hamilton, A.; Tanner, C.
2014-01-01
Summary Background In this new era after the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, the literature on lessons learned with electronic health record (EHR) implementation needs to be revisited. Objectives Our objective was to describe what implementation of a commercially available EHR with built-in quality query algorithms showed us about our care for diabetes and hypertension populations in four safety-net clinics, specifically the feasibility of data retrieval, measurements over time, quality of data, and how our teams used these data. Methods A cross-sectional study was conducted from October 2008 to October 2012 in four safety-net clinics located in the Midwest and Western United States. A data warehouse that stores data from across the U.S. was utilized for data extraction from patients with diabetes or hypertension diagnoses and at least two office visits per year. Standard quality measures were collected over a period of two to four years. All sites were engaged in a partnership model with the IT staff and a shared learning process to enhance the use of the quality metrics. Results While use of the algorithms was feasible across sites, challenges occurred when attempting to use the query results for research purposes. There was wide variation in both process and outcome results across individual centers. Composite calculations balanced out the differences seen in the individual measures. Despite consistent quality definitions, the differences across centers had an impact on numerators and denominators. All sites agreed to a partnership model of EHR implementation, and each center utilized the available resources of the partnership for center-specific quality initiatives. Conclusions Utilizing a shared EHR, a Regional Extension Center-like partnership model, and similar quality query algorithms allowed safety-net clinics to benchmark and improve the quality of care across differing patient populations and health care delivery models. PMID:25298815
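The numerator/denominator sensitivity noted in the results can be illustrated with a toy patient-level query: a measure rate is patients meeting a criterion over eligible patients, so shifting either definition shifts the rate. Field names and the A1c threshold below are assumptions, not the study's measures.

```python
# Toy quality-measure query: rate = numerator / denominator. Eligibility and
# criterion definitions are hypothetical, not the study's actual measures.
def measure_rate(patients: list[dict]) -> float:
    denominator = [p for p in patients
                   if p["diagnosis"] == "diabetes" and p["visits"] >= 2]
    numerator = [p for p in denominator if p.get("last_a1c", 99.0) < 9.0]
    return len(numerator) / len(denominator) if denominator else 0.0

rate = measure_rate([
    {"diagnosis": "diabetes", "visits": 3, "last_a1c": 7.2},
    {"diagnosis": "diabetes", "visits": 2, "last_a1c": 10.1},
    {"diagnosis": "hypertension", "visits": 4},
])  # 0.5: one of two eligible diabetes patients meets the control criterion
```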
Establishing a Secure Data Center with Remote Access: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gonder, J.; Burton, E.; Murakami, E.
2012-04-01
Access to existing travel data is critical for many analysis efforts that lack the time or resources to support detailed data collection. High-resolution data sets provide particular value, but also present a challenge for preserving the anonymity of the original survey participants. To address this dilemma of providing data access while preserving privacy, the National Renewable Energy Laboratory and the U.S. Department of Transportation have launched the Transportation Secure Data Center (TSDC). TSDC data sets include those from regional travel surveys and studies that increasingly use global positioning system devices. Data provided by different collecting agencies varies with respect to formatting, elements included, and level of processing conducted in support of the original purpose. The TSDC relies on a number of geospatial and other analysis tools to ensure data quality and to generate useful information outputs. TSDC users can access the processed data in two different ways. The first is by downloading summary results and second-by-second vehicle speed profiles (with latitude/longitude information removed) from a publicly accessible website. The second method involves applying for a remote connection account to a controlled-access environment where spatial analysis can be conducted, but raw data cannot be removed.
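The public-download pathway implies a de-identification step of roughly this shape: drop the latitude/longitude columns from a second-by-second drive profile so only time and speed leave the controlled environment. Column names below are assumed, not TSDC's actual schema.

```python
# De-identification sketch for the public-download path: strip spatial
# identifiers, keep the speed trace. Column names are assumptions.
import pandas as pd

def anonymize_profile(df: pd.DataFrame) -> pd.DataFrame:
    """Remove latitude/longitude, keep time and speed columns."""
    return df.drop(columns=["latitude", "longitude"], errors="ignore")

raw = pd.DataFrame({
    "t_s": [0, 1, 2],
    "speed_mph": [0.0, 3.1, 7.4],
    "latitude": [39.74, 39.74, 39.75],      # fabricated coordinates
    "longitude": [-105.17, -105.17, -105.18],
})
public = anonymize_profile(raw)             # columns left: t_s, speed_mph
```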
Space Weather Services and Products of RWC Russia in 2007
NASA Astrophysics Data System (ADS)
Burov, Viatcheslav; Avdyushin, Sergei; Denisova, Valentina
RWC Russia (Institute of Applied Geophysics) is the forecasting center that unites the activities of the National Heliogeophysical Service of Russia and the Regional Warning Center of ISES. The Center has been operating since 1974. Several services carry out the gathering, processing, and dissemination of the total information data flow (including both Russian and foreign exchange data) and of forecasts. The results of forecasting activities are issued in the form of special messages, the major part of which corresponds to standard codes. The current data and forecasts are presented on our Web page: www.geospace.ru. At present, both a weekly 7-day geomagnetic forecast and information on the actual disturbance activity for the previous week are available on the Web page, as are the data of some ionosphere and magnetic stations. Various types of our forecasts, alerts, and routine observations are considered in the report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belousov, Yu. M., E-mail: theorphys@phystech.edu
The formation of an ionized acceptor center by a negative muon in crystals with the diamond structure is considered. The negative muon entering a target is captured by a nucleus, forming a muonic atom (μA) coupled to the lattice. The resulting radiation-induced defect has a significant electric dipole moment because of the violation of the local symmetry of the lattice, and it changes the phonon spectrum of the crystal. The ionized acceptor center is formed owing to the capture of an electron interacting with the electric dipole moment of the defect and with the radiation of a deformation-induced local-mode phonon. Upper and lower bounds of the formation rate of the ionized acceptor center in diamond, silicon, and germanium crystals are estimated. It is shown that the kinetics of the formation of the acceptor center should be taken into account when processing μSR experimental data.
Cluster analysis for determining distribution center location
NASA Astrophysics Data System (ADS)
Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian
2017-12-01
Determination of distribution facilities is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk, which raises new problems: how many facilities should be provided, and where. This study examines a fast-food restaurant brand located in Greater Jakarta that ranks among the top five fast-food restaurant chains by retail sales. The study proceeded in three stages: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider locations for an additional distribution center, and the network analysis results show a more efficient process, i.e., a shorter distance in the distribution process.
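The abstract does not give the clustering details; as a hedged sketch of the general approach, the example below clusters hypothetical outlet coordinates with scikit-learn's KMeans to propose candidate distribution-center locations. Note that Euclidean distance on raw latitude/longitude is only a rough proxy over a city-scale area.

```python
# Minimal sketch: cluster retail outlet coordinates to propose candidate
# distribution-center locations. The outlet coordinates are hypothetical;
# the study's actual spatial data and cluster settings are not given.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical (lat, lon) pairs for outlets in Greater Jakarta
outlets = np.array([
    [-6.21, 106.85], [-6.17, 106.82], [-6.26, 106.80],
    [-6.24, 106.99], [-6.29, 106.89], [-6.35, 106.83],
])

# Try k = 2 distribution centers; in practice k would be chosen by
# comparing solutions (e.g., elbow method) and by the network analysis.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(outlets)

for i, center in enumerate(km.cluster_centers_):
    members = (km.labels_ == i).sum()
    print(f"Candidate DC {i}: {center.round(3)} serving {members} outlets")
```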
The TESS Science Processing Operations Center
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Twicken, Joseph D.; McCauliff, Sean; Campbell, Jennifer; Sanderfer, Dwight; Lung, David; Mansouri-Samani, Masoud; Girouard, Forrest; Tenenbaum, Peter; Klaus, Todd;
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will conduct a search for Earth's closest cousins starting in early 2018 and is expected to discover approximately 1,000 small planets with Rp < 4 Earth radii and to measure the masses of at least 50 of these small worlds. The Science Processing Operations Center (SPOC) is being developed at NASA Ames Research Center based on the Kepler science pipeline and will generate calibrated pixels and light curves on the NASA Advanced Supercomputing Division's Pleiades supercomputer. The SPOC will also search for periodic transit events and generate validation products for the transit-like features in the light curves. All TESS SPOC data products will be archived to the Mikulski Archive for Space Telescopes (MAST).
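The abstract does not spell out the SPOC's detection algorithm; purely as an illustration of searching a light curve for periodic transit events, the sketch below applies astropy's BoxLeastSquares periodogram to a synthetic light curve. The cadence, period, and depth are invented for the example and are not SPOC parameters.

```python
# Illustrative transit search on a synthetic light curve using the
# box least squares (BLS) periodogram. This is not the SPOC pipeline;
# it only demonstrates the kind of periodic-transit detection described.
import numpy as np
from astropy.timeseries import BoxLeastSquares

rng = np.random.default_rng(42)
t = np.arange(0, 27.4, 2 / 60 / 24)          # ~27 days of 2-min cadence
flux = 1 + 1e-4 * rng.standard_normal(t.size)
in_transit = (t % 3.7) < 0.1                 # 3.7-day period, ~2.4-h transits
flux[in_transit] -= 1e-3                     # 1000 ppm transit depth

bls = BoxLeastSquares(t, flux)
result = bls.autopower(0.1)                  # search with a 0.1-day duration
best = np.argmax(result.power)
print(f"Detected period: {result.period[best]:.3f} d (true: 3.700 d)")
```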
Fassbender, Amelie; Rahmioglu, Nilufer; Vitonis, Allison F.; Viganò, Paola; Giudice, Linda C.; D’Hooghe, Thomas M.; Hummelshoj, Lone; Adamson, G. David; Becker, Christian M.; Missmer, Stacey A.; Zondervan, Krina T.; Adamson, G.D.; Allaire, C.; Anchan, R.; Becker, C.M.; Bedaiwy, M.A.; Buck Louis, G.M.; Calhaz-Jorge, C.; Chwalisz, K.; D'Hooghe, T.M.; Fassbender, A.; Faustmann, T.; Fazleabas, A.T.; Flores, I.; Forman, A.; Fraser, I.; Giudice, L.C.; Gotte, M.; Gregersen, P.; Guo, S.-W.; Harada, T.; Hartwell, D.; Horne, A.W.; Hull, M.L.; Hummelshoj, L.; Ibrahim, M.G.; Kiesel, L.; Laufer, M.R.; Machens, K.; Mechsner, S.; Missmer, S.A.; Montgomery, G.W.; Nap, A.; Nyegaard, M.; Osteen, K.G.; Petta, C.A.; Rahmioglu, N.; Renner, S.P.; Riedlinger, J.; Roehrich, S.; Rogers, P.A.; Rombauts, L.; Salumets, A.; Saridogan, E.; Seckin, T.; Stratton, P.; Sharpe-Timms, K.L.; Tworoger, S.; Vigano, P.; Vincent, K.; Vitonis, A.F.; Wienhues-Thelen, U.-H.; Yeung, P.P.; Yong, P.; Zondervan, K.T.
2014-01-01
Objective To harmonize standard operating procedures (SOPs) and standardize the recording of associated data for collection, processing, and storage of human tissues relevant to endometriosis. Design An international collaboration involving 34 clinical/academic centers and three industry collaborators from 16 countries on five continents. Setting In 2013, two workshops were conducted followed by global consultation, bringing together 54 leaders in endometriosis research and sample processing from around the world. Patient(s) None. Intervention(s) Consensus SOPs were based on: 1) systematic comparison of SOPs from 24 global centers collecting tissue samples from women with and without endometriosis on a medium or large scale (publication on >100 cases); 2) literature evidence where available, or consultation with laboratory experts otherwise; and 3) several global consultation rounds. Main Outcome Measure(s) Standard recommended and minimum required SOPs for tissue collection, processing, and storage in endometriosis research. Result(s) We developed “recommended standard” and “minimum required” SOPs for the collection, processing, and storage of ectopic and eutopic endometrium, peritoneum, and myometrium, and a biospecimen data collection form necessary for interpretation of sample-derived results. Conclusion(s) The EPHect SOPs allow endometriosis research centers to decrease variability in tissue-based results, facilitating between-center comparisons and collaborations. The procedures are also relevant to research into other gynecologic conditions involving endometrium, myometrium, and peritoneum. The consensus SOPs are based on the best available evidence; areas with limited evidence are identified as requiring further pilot studies. The SOPs will be reviewed based on investigator feedback and through systematic triannual follow-up. Updated versions will be made available at: http://endometriosisfoundation.org/ephect. PMID:25256928
NASA Technical Reports Server (NTRS)
Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.
2010-01-01
A general overview is presented of the development of a data acquisition and processing system for a pulsed, 2-micron coherent Doppler lidar system located at NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system and its control software are reviewed, and their requirements and unique features are discussed.
Observation VLBI Session RAPL02. the Results of the Data Processing
NASA Astrophysics Data System (ADS)
Chuprikov, A. A.
Results of processing the data of a VLBI experiment titled RAPL02 are presented. The observations were made in February 2011 with 5 antennas. All 3 antennas of the Institute of Applied Astronomy (IAA) in St. Petersburg were used in this session: the antennas in Svetloe, Zelenchukskaya, and Badary. Additionally, the 22-m antenna in Pushchino and the 32-m antenna in Medicina (Italy) were included in the observations. The raw data correlation was performed with the software correlator of the Astro Space Center. The secondary data processing was carried out for 3 quasars: 3C273, 3C279, and 3C286.
Motivational Forces in a Growth-Centered Model of Teacher Evaluation
ERIC Educational Resources Information Center
Bruski, Nicholas Aron
2012-01-01
This paper presents the results of a study that explored the effects of using an action research process to examine and develop a system of teacher evaluation that leads to real changes in teacher behaviors. The study explored motivational forces and psychological processes related to the change process in adult behaviors. Data were collected by…
Looking Back at 25 Years With NASA's EOSDIS Distributed Active Archive Centers
NASA Astrophysics Data System (ADS)
Behnke, J.; Kittel, D.
2017-12-01
NASA's Earth Observing System Data and Information System (EOSDIS) has been a central component of the NASA Earth observation program since the 1990s. The data collected by NASA's remote sensing instruments represent a significant public investment in research, and EOSDIS provides free and open access to these data to a worldwide public research community. EOSDIS manages a wide range of Earth science discipline data, including cryosphere, land cover change, polar processes, field campaigns, ocean surface, digital elevation, atmosphere dynamics and composition, and interdisciplinary research, among many others. From the very beginning, EOSDIS was conceived as a system built on partnerships between NASA Centers, US agencies, and academia. As originally conceived, EOSDIS comprised organizations that process and disseminate remote sensing and in situ data and provide services to a wide variety of users. These organizations are known as the Distributed Active Archive Centers (DAACs). Because of their active role in NASA mission science and with the science community, the DAACs represent a distinct departure from the run-of-the-mill data center. The purpose of this paper is to highlight this distinction and to describe the experiences, strategies, and lessons learned from the operation of the DAACs. Today, there are 12 DAACs geographically distributed across the US that serve over 3 million users and have distributed over 1.5 billion Earth science data products. Managed by NASA's Earth Science Data and Information System (ESDIS) Project at Goddard Space Flight Center, the DAACs each support different Earth science disciplines, allowing for customized support to user communities. The ESDIS Project provides the infrastructure support for the entire EOSDIS system, which has grown to 23 petabytes. The DAACs have improved performance as they have grown over the years, while costs are tightly controlled. We offer several recommendations about curation, level of service, automation, and return on investment resulting from our 25 years of practice managing the DAACs. By sharing new ideas and innovation in science data management, EOSDIS has been able to evolve to meet demand. However, many challenges remain for the future.
Tracking and data relay satellite system - NASA's new spacecraft data acquisition system
NASA Technical Reports Server (NTRS)
Schneider, W. C.; Garman, A. A.
1979-01-01
This paper describes NASA's new spacecraft data acquisition system provided by the Tracking and Data Relay Satellite System (TDRSS). Four satellites in geostationary orbit and a ground terminal will provide complete tracking, telemetry, and command service for all of NASA's orbital satellites below a 12,000-km altitude. Western Union will lease the system, operate the ground terminal, and provide operational satellite control. NASA's network control center will be the focal point for scheduling user services and controlling the interface between TDRSS and the NASA communications network, project control centers, and data processing. TDRSS single-access user spacecraft data systems will be designed for time-shared data relay support, and a reimbursement policy and rate structure for non-NASA users are being developed.
The Kepler Data Processing Handbook: A Field Guide to Prospecting for Habitable Worlds
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.
2017-01-01
The Kepler telescope hurtled into orbit in March 2009, initiating NASA's first mission to discover Earth-size planets orbiting Sun-like stars. Kepler simultaneously collected data for approximately 165,000 target stars at a time over its four-year mission, identifying over 4700 planet candidates, over 2300 confirmed or validated planets, and over 2100 eclipsing binaries. While Kepler was designed to discover exoplanets, the long-term, ultrahigh photometric precision measurements it achieved made it a premier observational facility for stellar astrophysics, especially in the field of asteroseismology, and for variable stars, such as RR Lyrae. The Kepler Science Operations Center (SOC) was developed at NASA Ames Research Center to process the data acquired by Kepler, from pixel-level calibrations all the way to identifying transiting planet signatures and subjecting them to a suite of diagnostic tests to establish or break confidence in their planetary nature. Detecting small, rocky planets transiting Sun-like stars presents a variety of daunting challenges, including achieving an unprecedented photometric precision of 20 ppm on 6.5-hour timescales, and supporting the science operations, management, processing, and repeated reprocessing of the accumulating data stream. A newly revised and expanded version of the Kepler Data Processing Handbook (KDPH) has been released to support the legacy archival products. The KDPH details the theory, design, and performance of the algorithms supporting each data processing step. This paper presents an overview of the KDPH and features illustrations of several key algorithms in the Kepler Science Data Processing Pipeline. Kepler was selected as the 10th mission of the Discovery Program. Funding for this mission was provided by NASA's Science Mission Directorate.
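As a minimal illustration of one elementary operation used when vetting transit signatures of the kind the pipeline identifies, the sketch below phase-folds a synthetic light curve at a trial period so that repeated transits stack up. It is illustrative only and not SOC pipeline code; the period and depth are invented.

```python
# Minimal illustration of phase-folding, an elementary step in
# examining candidate transit signatures; not SOC code.
import numpy as np

def phase_fold(time, flux, period, epoch=0.0):
    """Return phases in [-0.5, 0.5) and flux sorted by phase."""
    phase = ((time - epoch) / period + 0.5) % 1.0 - 0.5
    order = np.argsort(phase)
    return phase[order], flux[order]

t = np.linspace(0, 90, 5000)                 # 90 days of observations
flux = 1 - 2e-4 * ((t % 10.5) < 0.15)        # transits at a 10.5-day period
phase, folded = phase_fold(t, flux, period=10.5)
print(f"Minimum folded flux near phase {phase[np.argmin(folded)]:.3f}")
```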
High Energy Astronomical Data Processing and Analysis via the Internet
NASA Astrophysics Data System (ADS)
Valencic, Lynne A.; Snowden, S.; Pence, W.
2012-01-01
The HEASARC at NASA Goddard Space Flight Center and the US XMM-Newton GOF have developed Hera, a facility for processing and analyzing high energy astronomical data over the internet. Hera provides all the disk space and computing resources needed for general processing of, and advanced research on, publicly available data from High Energy Astrophysics missions. The data and data products are kept on a server at GSFC and can be downloaded to a user's local machine. Further, the XMM-GOF has developed scripts to streamline XMM data reduction. These are available through Hera and can also be downloaded to a user's local machine. These are free services provided to students, educators, and researchers for educational and research purposes.
New Developments in NOAA's Comprehensive Large Array-Data Stewardship System
NASA Astrophysics Data System (ADS)
Ritchey, N. A.; Morris, J. S.; Carter, D. J.
2012-12-01
The Comprehensive Large Array-data Stewardship System (CLASS) is part of the NOAA strategic goal of Climate Adaptation and Mitigation, which gives focus to the building and sustaining of key observational assets and data archives critical to maintaining the global climate record. Since 2002, CLASS has been NOAA's enterprise solution for ingesting, storing, and providing access to a host of near-real-time remote sensing streams such as the Polar and Geostationary Operational Environmental Satellites (POES and GOES) and the Defense Meteorological Satellite Program (DMSP). Since October 2011, CLASS has also been the dedicated Archive Data Segment (ADS) of the Suomi National Polar-orbiting Partnership (S-NPP). As the ADS, CLASS receives raw and processed S-NPP records for archival and distribution to the broad user community. Moving beyond remote sensing and model data, NOAA has endorsed a plan to migrate all archive holdings from NOAA's National Data Centers into CLASS while retiring the disparate legacy data storage systems residing at the National Climatic Data Center (NCDC), National Geophysical Data Center (NGDC), and National Oceanographic Data Center (NODC). In parallel with this data migration, CLASS is evolving to a service-oriented architecture utilizing cloud technologies for dissemination, in addition to clearly defined interfaces that allow better collaboration with partners. This evolution will require implementation of standard access protocols and metadata, which will lead to cost-effective data and information preservation.
NASA Technical Reports Server (NTRS)
1973-01-01
Applications of new remote sensing techniques for Earth resources surveys and environmental monitoring are reported. Applications discussed include vegetation systems, environmental monitoring, and plant protection. Data processing systems are also described.
Information and College Decisions: Evidence from the Texas GO Center Project
ERIC Educational Resources Information Center
Cunha, Jesse M.; Miller, Trey; Weisburst, Emily
2018-01-01
We study GO Centers, a college information program that is run by student peers and provides information about all aspects of the college-going process to academically prepared students on the margin of attending college. We use the semi-random rollout of the program along with detailed panel data on the universe of Texas public school students to…
ERIC Educational Resources Information Center
Rubenstein, Lisa V.; And Others
1996-01-01
A study evaluated the impact of the reorganization of the academic Sepulveda (California) Veterans' Administration medical center toward primary and ambulatory care. Surveys of several thousand patients were linked to computerized utilization and mortality data and related to the center's strategic plan and goals. Substantial improvement in…
2003-10-27
KENNEDY SPACE CENTER, FLA. - In the Orbiter Processing Facility, Eric Madaras (left), NASA-Langley Research Center, and Jim McGee, The Boeing Company, Huntington Beach, Calif., conduct impulse tests on the right wing leading edge (WLE) of Space Shuttle Endeavour. The tests monitor how sound impulses propagate through the WLE area. The data collected will be analyzed to explore the possibility of adding new instrumentation to the wing that could automatically detect debris or micrometeoroid impacts on the Shuttle while in flight. The study is part of the ongoing initiative at KSC and around the agency to return the orbiter fleet to flight status.
Calibration development strategies for the Daniel K. Inouye Solar Telescope (DKIST) data center
NASA Astrophysics Data System (ADS)
Watson, Fraser T.; Berukoff, Steven J.; Hays, Tony; Reardon, Kevin; Speiss, Daniel J.; Wiant, Scott
2016-07-01
The Daniel K. Inouye Solar Telescope (DKIST), currently under construction on Haleakalā, Maui, Hawai'i, will be the largest solar telescope in the world and will use adaptive optics to provide the highest resolution view of the Sun to date. It is expected that DKIST data will enable significant and transformative discoveries that will dramatically increase our understanding of the Sun and its effects on the Sun-Earth environment. As a result, it is a priority of the DKIST Data Center team at the National Solar Observatory (NSO) to deliver timely and accurately calibrated data to the astronomical community for further analysis. This requires a process that allows the Data Center to develop calibration pipelines for all of the facility instruments, taking advantage of similarities between them as well as similarities to current-generation instruments. There are also challenges, addressed in this article, such as the large expected data volume and the importance of supporting both manual and automated calibrations. This paper details the calibration development strategies being used by the Data Center team at the National Solar Observatory to manage this calibration effort, to ensure routine delivery of high-quality scientific data to users.
Collection, processing and dissemination of data for the national solar demonstration program
NASA Technical Reports Server (NTRS)
Day, R. E.; Murphy, L. J.; Smok, J. T.
1978-01-01
A national solar data system developed for the DOE by IBM provides for automatic gathering, conversion, transfer, and analysis of demonstration site data. NASA requirements for this system include providing solar site hardware, engineering, data collection, and analysis. The specific tasks include: (1) solar energy system design/integration; (2) developing a site data acquisition subsystem; (3) developing a central data processing system; (4) operating the test facility at Marshall Space Flight Center; (5) collecting and analyzing data. The systematic analysis and evaluation of the data from the National Solar Data System is reflected in a monthly performance report and a solar energy system performance evaluation report.
Study of the Effectiveness of OCR for Decentralized Data Capture and Conversion. Final Report.
ERIC Educational Resources Information Center
Liston, David M.; And Others
The ERIC network conversion to an OCR (Optical Character Recognition) mode of data entry was studied to analyze the potential effectiveness of OCR data entry for future EPCs (Editorial Processing Centers). Study results are also applicable to any other system involving decentralized bibliographic data capture and conversion functions. The report…
Using CloudSat and the A-Train to Estimate Tropical Cyclone Intensity in the Western North Pacific
2014-09-01
[Extraction residue from the document's front matter: a "CloudSat System Data Flow" figure caption (from the Cooperative Institute for Research in the Atmosphere, 2008) and fragments of an acronym list, including Department of Defense Data Processing Center, European Centre for Medium-Range Weather Forecasts, Earth Observing System Data and Information System, Earth Science Systems Pathfinder, Hierarchical Data Format, and Moderate Resolution Imaging Spectroradiometer.]
Telemetry downlink interfaces and level-zero processing
NASA Technical Reports Server (NTRS)
Horan, S.; Pfeiffer, J.; Taylor, J.
1991-01-01
The technical areas being investigated are as follows: (1) processing of space-to-ground data frames; (2) parallel-architecture performance studies; and (3) parallel programming techniques. Additionally, university administrative details and the technical liaison between New Mexico State University and Goddard Space Flight Center are addressed.
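As context for item (1), the sketch below unpacks the primary header of a CCSDS TM transfer frame, a standard space-to-ground data frame format. The field layout assumed here follows the CCSDS TM Space Data Link Protocol as commonly documented; the frame contents are synthetic, and this is an illustration rather than the study's code.

```python
# Hedged sketch: unpacking a CCSDS TM transfer frame primary header,
# the kind of space-to-ground data frame the study concerns.
# Assumed layout: version (2 bits) | spacecraft ID (10) | VC ID (3) |
# OCF flag (1), then MC and VC frame counts and the data field status.
import struct

def parse_tm_primary_header(frame: bytes) -> dict:
    first2, mc_count, vc_count, status = struct.unpack(">HBBH", frame[:6])
    return {
        "version": (first2 >> 14) & 0x3,
        "spacecraft_id": (first2 >> 4) & 0x3FF,
        "virtual_channel": (first2 >> 1) & 0x7,
        "ocf_flag": first2 & 0x1,
        "mc_frame_count": mc_count,
        "vc_frame_count": vc_count,
        "data_field_status": status,
    }

# Synthetic frame: version 0, spacecraft 0x0AB, VC 3, OCF flag set
header = (0 << 14) | (0x0AB << 4) | (3 << 1) | 1
frame = struct.pack(">HBBH", header, 42, 17, 0x1800) + b"\x00" * 100
print(parse_tm_primary_header(frame))
```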
42 CFR 410.142 - CMS process for approving national accreditation organizations.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... (5) A description of the organization's data management and analysis system for its accreditation... organizations. 410.142 Section 410.142 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF... Diabetes Self-Management Training and Diabetes Outcome Measurements § 410.142 CMS process for approving...
42 CFR 410.142 - CMS process for approving national accreditation organizations.
Code of Federal Regulations, 2011 CFR
2011-10-01
.... (5) A description of the organization's data management and analysis system for its accreditation... organizations. 410.142 Section 410.142 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF... Diabetes Self-Management Training and Diabetes Outcome Measurements § 410.142 CMS process for approving...
Operational Control Procedures for the Activated Sludge Process: Appendix.
ERIC Educational Resources Information Center
West, Alfred W.
This document is the appendix for a series of documents developed by the National Training and Operational Technology Center describing operational control procedures for the activated sludge process used in wastewater treatment. Categories discussed include: control test data, trend charts, moving averages, semi-logarithmic plots, probability…
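As a small illustration of the moving-average trend calculation named in that list, the sketch below computes a seven-point rolling mean over a hypothetical mixed-liquor suspended-solids series; the window length and values are invented, not taken from the document.

```python
# Sketch of a moving-average trend line for an operational control chart.
# The 7-point window and MLSS values are hypothetical examples.
import pandas as pd

mlss = pd.Series(
    [2100, 2180, 2250, 2200, 2300, 2280, 2350, 2400, 2380, 2450],
    name="MLSS_mg_per_L",  # mixed liquor suspended solids, mg/L
)
trend = mlss.rolling(window=7).mean()   # 7-point moving average
print(trend.dropna().round(1).to_string())
```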
Monitoring sodium levels in commercially processed and restaurant foods - dataset and webpages.
USDA-ARS?s Scientific Manuscript database
Nutrient Data Laboratory (NDL), Agriculture Research Service (ARS) in collaboration with Food Surveys Research Group, ARS, and the Centers for Disease Control and Prevention has been monitoring commercially processed and restaurant foods in the United States since 2010. About 125 highly consumed, s...
Materials, Processes, and Environmental Engineering Network
NASA Technical Reports Server (NTRS)
White, Margo M.
1993-01-01
Attention is given to the Materials, Processes, and Environmental Engineering Network (MPEEN), which was developed as a central holding facility for materials testing information generated by the Materials and Processes Laboratory of NASA-Marshall. It contains information from other NASA centers and outside agencies, and also includes data from the NASA Environmental Information System (NEIS) and the Failure Analysis Information System (FAIS). The NEIS database is accessible through MPEEN. Environmental concerns are addressed for materials identified by the NASA Operational Environment Team (NOET) as hazardous to the environment. The database also contains the usage and performance characteristics of these materials.
Identifying emerging research collaborations and networks: method development.
Dozier, Ann M; Martina, Camille A; O'Dell, Nicole L; Fogg, Thomas T; Lurie, Stephen J; Rubinstein, Eric P; Pearson, Thomas A
2014-03-01
Clinical and translational research is a multidisciplinary, collaborative team process. To evaluate this process, we developed a method to document emerging research networks and collaborations in our medical center and to describe their productivity and viability over time. Using an e-mail survey sent to 1,620 clinical and basic science full- and part-time faculty members, respondents identified their research collaborators. Initial analyses, using Pajek software, assessed the feasibility of using social network analysis (SNA) methods with these data. Nearly 400 respondents identified 1,594 collaborators across 28 medical center departments, resulting in 309 networks with 5 or more collaborators. This low-burden approach yielded a rich data set useful for evaluation using SNA to: (a) assess networks at several levels of the organization, including intrapersonal (individuals), interpersonal (social), organizational/institutional leadership (tenure and promotion), and physical/environmental (spatial proximity); and (b) link with other data to assess the evolution of these networks.
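As an illustration of the network-construction step, the sketch below builds a collaboration graph from survey responses and counts connected components with five or more members. The study used Pajek; networkx is substituted here for illustration, and the respondents and collaborator lists are hypothetical.

```python
# Hedged sketch: build a collaboration graph from survey responses and
# keep components with >= 5 members. Names are hypothetical.
import networkx as nx

responses = {
    "faculty_a": ["faculty_b", "faculty_c"],
    "faculty_b": ["faculty_c", "faculty_d"],
    "faculty_e": ["faculty_f"],
    "faculty_g": ["faculty_a", "faculty_d"],
}

G = nx.Graph()
for respondent, collaborators in responses.items():
    G.add_edges_from((respondent, c) for c in collaborators)

networks = [c for c in nx.connected_components(G) if len(c) >= 5]
print(f"{len(networks)} network(s) with 5 or more collaborators")
```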
Data Access Based on a Guide Map of the Underwater Wireless Sensor Network
Wei, Zhengxian; Song, Min; Yin, Guisheng; Wang, Hongbin; Cheng, Albert M. K.
2017-01-01
Underwater wireless sensor networks (UWSNs) represent an area of increasing research interest, as data storage, discovery, and query of UWSNs are always challenging issues. In this paper, a data access based on a guide map (DAGM) method is proposed for UWSNs. In DAGM, the metadata describes the abstracts of data content and the storage location. The center ring is composed of nodes according to the shortest average data query path in the network in order to store the metadata, and the data guide map organizes, diffuses and synchronizes the metadata in the center ring, providing the most time-saving and energy-efficient data query service for the user. For this method, firstly the data is stored in the UWSN. The storage node is determined, the data is transmitted from the sensor node (data generation source) to the storage node, and the metadata is generated for it. Then, the metadata is sent to the center ring node that is the nearest to the storage node and the data guide map organizes the metadata, diffusing and synchronizing it to the other center ring nodes. Finally, when there is query data in any user node, the data guide map will select a center ring node nearest to the user to process the query sentence, and based on the shortest transmission delay and lowest energy consumption, data transmission routing is generated according to the storage location abstract in the metadata. Hence, specific application data transmission from the storage node to the user is completed. The simulation results demonstrate that DAGM has advantages with respect to data access time and network energy consumption. PMID:29039757
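A minimal sketch of DAGM's query dispatch step follows: the query is handled by the center-ring node nearest to the user, which resolves the storage location from the synchronized metadata. The node coordinates, distance metric, and metadata layout below are simplified assumptions rather than the paper's exact formulation.

```python
# Simplified sketch of DAGM query dispatch: nearest center-ring node
# handles the query and looks up the storage node in the synchronized
# metadata. Coordinates, metric, and metadata layout are assumptions.
import math

center_ring = {"ring1": (0, 0, 50), "ring2": (400, 0, 60), "ring3": (200, 300, 55)}
metadata = {"temp_grid_17": "storage_node_9"}   # data abstract -> storage node

def nearest_ring_node(user_pos):
    return min(center_ring, key=lambda n: math.dist(center_ring[n], user_pos))

def query(user_pos, data_id):
    handler = nearest_ring_node(user_pos)       # nearest metadata holder
    storage = metadata[data_id]                 # storage location lookup
    return handler, storage

handler, storage = query((380, 20, 10), "temp_grid_17")
print(f"Query handled by {handler}; data served from {storage}")
```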
Space Radar Image of Mammoth Mountain, California
1999-05-01
This false-color composite radar image of the Mammoth Mountain area in the Sierra Nevada Mountains, California, was acquired by the Spaceborne Imaging Radar-C and X-band Synthetic Aperture Radar aboard the space shuttle Endeavour on its 67th orbit on October 3, 1994. The image is centered at 37.6 degrees north latitude and 119.0 degrees west longitude. The area is about 39 kilometers by 51 kilometers (24 miles by 31 miles). North is toward the bottom, about 45 degrees to the right. In this image, red was created using L-band (horizontally transmitted/vertically received) polarization data; green was created using C-band (horizontally transmitted/vertically received) polarization data; and blue was created using C-band (horizontally transmitted and received) polarization data. Crawley Lake appears dark at the center left of the image, just above or south of Long Valley. The Mammoth Mountain ski area is visible at the top right of the scene. The red areas correspond to forests, the dark blue areas are bare surfaces and the green areas are short vegetation, mainly brush. The purple areas at the higher elevations in the upper part of the scene are discontinuous patches of snow cover from a September 28 storm. New, very thin snow was falling before and during the second space shuttle pass. In parallel with the operational SIR-C data processing, an experimental effort is being conducted to test SAR data processing using the Jet Propulsion Laboratory's massively parallel supercomputing facility, centered around the Cray Research T3D. These experiments will assess the abilities of large supercomputers to produce high throughput Synthetic Aperture Radar processing in preparation for upcoming data-intensive SAR missions. The image released here was produced as part of this experimental effort. http://photojournal.jpl.nasa.gov/catalog/PIA01746
A Fast Implementation of the ISOCLUS Algorithm
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess; Mount, David M.; Netanyahu, Nathan S.; LeMoigne, Jacqueline
2003-01-01
Unsupervised clustering is a fundamental building block in numerous image processing applications. One of the most popular and widely used clustering schemes for remote sensing applications is the ISOCLUS algorithm, which is based on the ISODATA method. The algorithm is given a set of n data points in d-dimensional space, an integer k indicating the initial number of clusters, and a number of additional parameters. The general goal is to compute the coordinates of a set of cluster centers in d-space, such that those centers minimize the mean squared distance from each data point to its nearest center. This clustering algorithm is similar to another well-known clustering method, called k-means. One significant feature of ISOCLUS over k-means is that the actual number of clusters reported might be fewer or more than the number supplied as part of the input. The algorithm uses different heuristics to determine whether to merge or split clusters. As ISOCLUS can run very slowly, particularly on large data sets, there has been a growing interest in the remote sensing community in computing it efficiently. We have developed a faster implementation of the ISOCLUS algorithm. Our improvement is based on a recent acceleration to the k-means algorithm of Kanungo, et al. They showed that, by using a kd-tree data structure for storing the data, it is possible to reduce the running time of k-means. We have adapted this method for the ISOCLUS algorithm, and we show that it is possible to achieve essentially the same results as ISOCLUS on large data sets, but with significantly lower running times. This adaptation involves computing a number of cluster statistics that are needed for ISOCLUS but not for k-means. Both the k-means and ISOCLUS algorithms are based on iterative schemes, in which nearest neighbors are calculated until some convergence criterion is satisfied. Each iteration requires that the nearest center for each data point be computed. Naively, this requires O(kn) time, where k denotes the current number of centers. Traditional techniques for accelerating nearest neighbor searching involve storing the k centers in a data structure. However, because of the iterative nature of the algorithm, this data structure would need to be rebuilt with each new iteration. Our approach is to store the data points in a kd-tree data structure. The assignment of points to nearest neighbors is carried out by a filtering process, which successively eliminates centers that cannot possibly be the nearest neighbor for a given region of space. This algorithm is significantly faster, because large groups of data points can be assigned to their nearest center in a single operation. Preliminary results on a number of real Landsat datasets show that our revised ISOCLUS-like scheme runs about twice as fast.
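To convey the flavor of the speedup, the sketch below accelerates the nearest-center assignment with a kd-tree. Note an inversion relative to the paper: the filtering algorithm described above stores the data points in the kd-tree and prunes candidate centers per region, whereas this simpler stand-in indexes the centers; both replace the naive O(kn) scan per iteration.

```python
# Sketch of accelerated nearest-center assignment. The paper's filtering
# algorithm stores the *data points* in a kd-tree and prunes candidate
# centers; this simpler illustration builds the kd-tree on the *centers*
# instead, which conveys the same idea of avoiding the naive O(kn) scan.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
points = rng.random((100_000, 4))    # n data points in d = 4 dimensions
centers = rng.random((16, 4))        # k current cluster centers

tree = cKDTree(centers)
dist, label = tree.query(points)     # nearest center for every point
print(np.bincount(label, minlength=len(centers)))  # cluster sizes
```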
The Brazilian Science Data Center (BSDC)
NASA Astrophysics Data System (ADS)
de Almeida, Ulisses Barres; Bodmann, Benno; Giommi, Paolo; Brandt, Carlos H.
Astrophysics and Space Science are becoming increasingly characterised by what is now known as “big data”, the bottlenecks for progress partly shifting from data acquisition to “data mining”. The truth is that the amount and rate of data accumulation in many fields already surpasses the local capabilities for its processing and exploitation, and the efficient conversion of scientific data into knowledge is everywhere a challenge. The result is that, to a large extent, isolated data archives risk being progressively likened to “data graveyards”, where the information stored is not reused for scientific work. Responsible and efficient use of these large data-sets means democratising access and extracting the most science possible from them, which in turn signifies improving data accessibility and integration. Improving data processing capabilities is another important issue specific to researchers and computer scientists of each field. The project presented here wishes to exploit the enormous potential opened up by information technology in our age to advance a model for a science data center in astronomy that aims to expand data accessibility and integration to the largest possible extent and with the greatest efficiency for scientific and educational use. Greater access to data means more people producing and benefiting from information, whereas larger integration of related data from different origins means a greater research potential and increased scientific impact. The BSDC project is primarily concerned with providing tools and solutions for the Brazilian astronomical community. It nevertheless capitalizes on extensive international experience, and is developed in full cooperation with the ASI Science Data Center (ASDC) of the Italian Space Agency, granting it an essential ingredient of internationalisation. The BSDC is Virtual Observatory-compliant and part of the “Open Universe”, a global initiative built under the auspices of the United Nations.
Solutions for Mining Distributed Scientific Data
NASA Astrophysics Data System (ADS)
Lynnes, C.; Pham, L.; Graves, S.; Ramachandran, R.; Maskey, M.; Keiser, K.
2007-12-01
Researchers at the University of Alabama in Huntsville (UAH) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) are working on approaches and methodologies facilitating the analysis of large amounts of distributed scientific data. Despite the existence of full-featured analysis tools, such as the Algorithm Development and Mining (ADaM) toolkit from UAH, and data repositories, such as the GES DISC, that provide online access to large amounts of data, there remain obstacles to getting the analysis tools and the data together in a workable environment. Does one bring the data to the tools or deploy the tools close to the data? The large size of many current Earth science datasets incurs significant overhead in network transfer for analysis workflows, even with the advanced networking capabilities available between many educational and government facilities. The UAH and GES DISC teams are developing a capability to define analysis workflows using distributed services and online data resources. We are developing two solutions for this problem that address different analysis scenarios. The first is a Data Center Deployment of the analysis services for large data selections, orchestrated by a remotely defined analysis workflow. The second is a Data Mining Center approach that provides a cohesive analysis solution for smaller subsets of data. The two approaches can be complementary and thus provide flexibility for researchers to exploit the best solution for their data requirements. The Data Center Deployment of the analysis services has been implemented by deploying ADaM web services at the GES DISC so they can access the data directly, without the need for network transfers. Using the Mining Workflow Composer, a user can define an analysis workflow that is then submitted through a Web Services interface to the GES DISC for execution by a processing engine. The workflow definition is composed, maintained, and executed at a distributed location, but most of the actual services comprising the workflow are local to the GES DISC data repository. Additional refinements will ultimately provide a package that is easily implemented and configured at additional data centers for analysis of additional science data sets. Enhancements to the ADaM toolkit allow the staging of distributed data wherever the services are deployed, to support a Data Mining Center that can provide additional computational resources, large storage of output, easier addition and updates of available services, and access to data from multiple repositories. The Data Mining Center case provides researchers with more flexibility to quickly try different workflow configurations and refine the process, using smaller amounts of data that may likely be transferred from distributed online repositories. This environment is sufficient for some analyses, but can also be used as an initial sandbox to test and refine a solution before staging the execution at a Data Center Deployment. The detection of airborne dust over both water and land in MODIS imagery, using the mining services of both solutions, will be presented. The dust detection is just one possible example of the mining and analysis capabilities the proposed mining services solutions will provide to the science community. More information about the available services and the current status of this project is available at http://www.itsc.uah.edu/mws/
Challenges in Defining Tsunami Wave Height
NASA Astrophysics Data System (ADS)
Stroker, K. J.; Dunbar, P. K.; Mungov, G.; Sweeney, A.; Arcos, N. P.
2017-12-01
The NOAA National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height, NCEI will consider adding an additional field for the maximum peak measurement.
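The two definitions compared above can be made concrete on a de-tided water-level residual: the maximum peak is the largest positive excursion, while the maximum amplitude is half the peak-to-trough range (simplified below to half the overall range of the record). The series in this sketch is synthetic, not Illapel gauge data.

```python
# Sketch of the two wave-height definitions on a de-tided residual:
# "maximum peak" = largest positive excursion; "maximum amplitude" =
# half the peak-to-trough range (simplified to half the overall range).
import numpy as np

t = np.linspace(0, 6, 2000)                               # hours after arrival
eta = 0.8 * np.exp(-t / 3) * np.sin(2 * np.pi * t / 0.5)  # residual (m), synthetic

max_peak = eta.max()
max_amplitude = (eta.max() - eta.min()) / 2               # half peak-to-trough
print(f"max peak: {max_peak:.2f} m, max amplitude: {max_amplitude:.2f} m")
```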
Putnam, James E.; Hansen, Cristi V.
2014-01-01
As the Nation’s principal earth-science information agency, the U.S. Geological Survey (USGS) is depended on to collect data of the highest quality. This document is a quality-assurance plan for groundwater activities (GWQAP) of the Kansas Water Science Center. The purpose of this GWQAP is to establish a minimum set of guidelines and practices to be used by the Kansas Water Science Center to ensure quality in groundwater activities. Included within these practices are the assignment of responsibilities for implementing quality-assurance activities in the Kansas Water Science Center and the establishment of review procedures needed to ensure the technical quality and reliability of the groundwater products. In addition, this GWQAP is intended to complement quality-assurance plans for surface-water and water-quality activities, similar plans for the Kansas Water Science Center, and general project activities throughout the USGS. This document provides the framework for collecting, analyzing, and reporting groundwater data that are quality assured and quality controlled. This GWQAP presents policies directing the collection, processing, analysis, storage, review, and publication of groundwater data. In addition, policies related to organizational responsibilities, training, project planning, and safety are presented. These policies and practices pertain to all groundwater activities conducted by the Kansas Water Science Center, including data-collection programs and interpretive and research projects. This report also includes the data management plan that describes the progression of data management from data collection to archiving and publication.
GHRC: NASA's Hazardous Weather Distributed Active Archive Center
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Bugbee, Kaylin
2016-01-01
The Global Hydrology Resource Center (GHRC; ghrc.nsstc.nasa.gov) is one of NASA's twelve Distributed Active Archive Centers responsible for providing access to NASA's Earth science data to users worldwide. Each of NASA's twelve DAACs focuses on a specific science discipline within Earth science, provides data stewardship services and supports its research community's needs. Established in 1991 as the Marshall Space Flight Center DAAC and renamed GHRC in 1997, the data center's original mission focused on the global hydrologic cycle. However, over the years, data holdings, tools and expertise of GHRC have gradually shifted. In 2014, a User Working Group (UWG) was established to review GHRC capabilities and provide recommendations to make GHRC more responsive to the research community's evolving needs. The UWG recommended an update to the GHRC mission, as well as a strategic plan to move in the new direction. After a careful and detailed analysis of GHRC's capabilities, research community needs and the existing data landscape, a new mission statement for GHRC has been crafted: to provide a comprehensive active archive of both data and knowledge augmentation services with a focus on hazardous weather, its governing dynamical and physical processes, and associated applications. Within this broad mandate, GHRC will focus on lightning, tropical cyclones and storm-induced hazards through integrated collections of satellite, airborne, and in-situ data sets. The new mission was adopted at the recent 2015 UWG meeting. GHRC will retain its current name until such time as it has built substantial data holdings aligned with the new mission.
Future Concepts for Realtime Data Interfaces for Control Centers
NASA Technical Reports Server (NTRS)
Kearney, Mike W., III
2004-01-01
Existing methods of exchanging realtime data between the major control centers in the International Space Station program have resulted in a patchwork of local formats being imposed on each Mission Control Center. This puts the burden on a data customer to comply with the proprietary data formats of each data supplier, which has increased the cost and complexity for each participant, limited access to mission data, and hampered the development of efficient and flexible operations concepts. Ideally, a universal format should be promoted in the industry to prevent the unnecessary burden of each center processing a different data format standard for every external interface with another center. With the broad acceptance of XML and other conventions used in other industries, it is now time for the aerospace industry to fully engage and establish such a standard. This paper will briefly consider the components that would be required by such a standard (XML schema, data dictionaries, etc.) in order to accomplish the goal of a universal low-cost interface and acquire broad industry acceptance. We will then examine current approaches being developed by standards bodies and other groups. The current state of CCSDS panel work will be reviewed, with a survey of the degree of industry acceptance. Other widely accepted commercial approaches will be considered, sometimes complementary to the standards work, but sometimes not. The question is whether de facto industry standards are in concert or in conflict with the direction of the standards bodies. Given that state of affairs, the author will consider whether a new program establishing its Mission Control Center should implement a data interface based on those standards. The author proposes that broad industry support to unify the various efforts will enable collaboration between control centers and space programs to a wider degree than is currently available. This will reduce the cost for programs to provide realtime access to their data, hence reducing the cost of access to space and benefiting the industry as a whole.
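Purely as a hypothetical illustration of the kind of universal, self-describing exchange format the paper argues for, the sketch below builds and parses a small XML telemetry message. The element names, attributes, and dictionary reference are invented for illustration and are not drawn from any CCSDS standard or program schema.

```python
# Hypothetical XML telemetry exchange: one center emits a message, and
# a receiving center parses it without knowing the sender's internal
# (proprietary) format. All names and values here are invented.
import xml.etree.ElementTree as ET

sample = ET.Element("telemetry", center="MCC-H", dictionary="ISS-TLM-v2")
m = ET.SubElement(sample, "measurement", id="CABIN_PRESS",
                  time="2004-03-01T12:00:00Z", units="kPa")
m.text = "101.3"

xml_text = ET.tostring(sample, encoding="unicode")
print(xml_text)

# The receiving side needs only the shared schema conventions:
root = ET.fromstring(xml_text)
for meas in root.iter("measurement"):
    print(meas.get("id"), meas.text, meas.get("units"))
```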
Recent Cycle Time Reduction at Langley Research Center
NASA Technical Reports Server (NTRS)
Kegelman, Jerome T.
2000-01-01
The NASA Langley Research Center (LaRC) has been engaged in an effort to reduce wind tunnel test cycle time in support of Agency goals and to satisfy the wind tunnel testing needs of the commercial and military aerospace communities. LaRC has established the Wind Tunnel Enterprise (WTE), with goals of reducing wind tunnel test cycle time by an order of magnitude by 2002, and by two orders of magnitude by 2010. The WTE also plans to meet customer expectations for schedule integrity, as well as data accuracy and quality assurance. The WTE has made progress towards these goals over the last year with a focused effort on technological developments balanced by attention to process improvements. This paper presents a summary of several of the WTE activities over the last year that are related to test cycle time reductions at the Center. Reducing wind tunnel test cycle time, defined here as the time between the freezing of loft lines and delivery of test data, requires that the relationship between high productivity and data quality assurance be considered. The efforts have focused on all of the drivers for test cycle time reduction, including process centered improvements, facility upgrades, technological improvements to enhance facility readiness and productivity, as well as advanced measurement techniques. The application of internet tools and computer modeling of facilities to allow a virtual presence of the customer team is also presented.
Spacelab data analysis and interactive control study
NASA Technical Reports Server (NTRS)
Tarbell, T. D.; Drake, J. F.
1980-01-01
The study consisted of two main tasks, a series of interviews of Spacelab users and a survey of data processing and display equipment. Findings from the user interviews on questions of interactive control, downlink data formats, and Spacelab computer software development are presented. Equipment for quick look processing and display of scientific data in the Spacelab Payload Operations Control Center (POCC) was surveyed. Results of this survey effort are discussed in detail, along with recommendations for NASA development of several specific display systems which meet common requirements of many Spacelab experiments.
Integration of medical imaging into a multi-institutional hospital information system structure.
Dayhoff, R E
1995-01-01
The Department of Veterans Affairs (VA) is providing integrated text and image data to its clinical users at its Washington and Baltimore medical centers and, soon, at nine other medical centers. The DHCP Imaging System records clinically significant diagnostic images selected by medical specialists in a variety of departments, including cardiology, gastroenterology, pathology, dermatology, surgery, radiology, podiatry, dentistry, and emergency medicine. These images, which include color and gray scale images, and electrocardiogram waveforms, are displayed on workstations located throughout the medical centers. Integration of clinical images with the VA's electronic mail system allows transfer of data from one medical center to another. The ability to incorporate transmitted text and image data into on-line patient records at the collaborating sites is an important aspect of professional consultation. In order to achieve the maximum benefits from an integrated patient record system, a critical mass of information must be available for clinicians. When there is also seamless support for administration, it becomes possible to re-engineer the processes involved in providing medical care.
NASA Astrophysics Data System (ADS)
Hwang, James Ho-Jin; Duran, Adam
2016-08-01
Most of the time, pyrotechnic shock design and test requirements for space systems are provided as a Shock Response Spectrum (SRS) without the input time history. Since the SRS does not describe the input or the environment, a decomposition method is used to obtain the source time history. The main objective of this paper is to develop a decomposition method producing input time histories that can satisfy the SRS requirement, based on pyrotechnic shock test data measured from a mechanical impact test apparatus. At the heart of this decomposition method is the statistical representation of pyrotechnic shock test data measured from the MIT Lincoln Laboratory (LL)-designed Universal Pyrotechnic Shock Simulator (UPSS). Each pyrotechnic shock test record measured at the interface of a test unit has been analyzed to produce the temporal peak acceleration, root mean square (RMS) acceleration, and phase lag at each band center frequency. The maximum SRS of each filtered time history has been calculated to produce a relationship between the input and the response. Two new definitions are proposed as a result. The Peak Ratio (PR) is defined as the ratio between the maximum SRS and the temporal peak acceleration at each band center frequency. The ratio between the maximum SRS and the RMS acceleration is defined as the Energy Ratio (ER) at each band center frequency. The phase lag is estimated from the time delay between the temporal peak acceleration at each band center frequency and the peak acceleration at the lowest band center frequency. This stochastic process has been applied to more than one hundred pyrotechnic shock test records to produce probabilistic definitions of the PR, ER, and phase lag. The SRS is decomposed at each band center frequency using damped sinusoids, with the PR and decays obtained by matching the ER of the damped sinusoids to the ER of the test data. The final step in this stochastic SRS decomposition process is a Monte Carlo (MC) simulation, which identifies combinations of the PR and decays that can meet the SRS requirement at each band center frequency. Decomposed input time histories are produced by summing the converged damped sinusoids with the MC simulation of the phase lag distribution.
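A minimal sketch of the synthesis step follows: an input time history assembled as a sum of damped sinusoids, one per band center frequency. The amplitudes, decay rates, and phase lags below are placeholders; in the paper they would come from the statistical PR/ER distributions and the Monte Carlo search against the SRS requirement.

```python
# Minimal sketch: synthesize an input time history as a sum of damped
# sinusoids at band center frequencies. Amplitudes, decays, and phases
# are placeholders, not the paper's statistically derived values.
import numpy as np

fs = 50_000                               # sample rate (Hz)
t = np.arange(0, 0.05, 1 / fs)            # 50 ms record

def damped_sinusoid(amp, freq, decay, phase, t):
    return amp * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t + phase)

bands = [500, 630, 800, 1000, 1250]       # band center frequencies (Hz)
rng = np.random.default_rng(1)
history = sum(
    damped_sinusoid(amp=100 * (f / 500), freq=f,
                    decay=0.05 * 2 * np.pi * f,   # ~5% damping assumption
                    phase=rng.uniform(0, 2 * np.pi), t=t)
    for f in bands
)
print(f"peak acceleration of synthesized input: {np.abs(history).max():.1f} g")
```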
Requirements Specification Document
DOT National Transportation Integrated Search
1996-04-26
The System Definition Document identifies the top-level processes, data flows, and system controls for the Gary-Chicago-Milwaukee (GCM) Corridor Transportation Information Center (C-TIC). This Requirements Specification establishes the requirements...
An Investigation of Data Overload in Team-Based Distributed Cognition Systems
ERIC Educational Resources Information Center
Hellar, David Benjamin
2009-01-01
The modern military command center is a hybrid system of computer automated surveillance and human oriented decision making. In these distributed cognition systems, data overload refers simultaneously to the glut of raw data processed by information technology systems and the dearth of actionable knowledge useful to human decision makers.…
Brown, Roger S; Belton, A Matthew; Martin, Judith M; Simmons, Dee Dee; Taylor, Gloria J; Willard, Ellie
2009-09-01
One of the goals of the Organ Center of the Organ Procurement and Transplantation Network/United Network for Organ Sharing is to increase the efficiency of equitable organ allocation in the United States. Recognizing the ever-growing need for organ donors and transplants, leaders at the Organ Center increased its commitment to quality improvement initiatives through the development of a quality management team in 2001. The Organ Center began to focus on ways to capture data on processes and pinpoint areas for improvement. As the collection and analysis of data evolved, the Organ Center embraced formal quality standards, such as improvement cycles. Using these cycles, the Organ Center has seen significant improvement. One initiative involving lifesaving heart, lung, and liver placement showed success by doubling the Organ Center's organ placement rate. Another project involving the validation of donor information demonstrated that the accuracy of organ allocation can be improved by 5% on a consistent basis. As stewards for the gift of life and leaders in organ allocation, the Organ Center uses continuous quality improvement to achieve the goal of increasing the efficiency of equitable organ allocation.
Validation results of the IAG Dancer project for distributed GPS analysis
NASA Astrophysics Data System (ADS)
Boomkamp, H.
2012-12-01
The number of permanent GPS stations in the world has grown far too large to allow processing of all this data at analysis centers. The majority of these GPS sites do not even make their observation data available to the analysis centers, for various valid reasons. The current ITRF solution is still based on centralized analysis by the IGS, and subsequent densification of the reference frame via regional network solutions. Minor inconsistencies in analysis methods, software systems and data quality imply that this centralized approach is unlikely to ever reach the ambitious accuracy objectives of GGOS. The dependence on published data also makes it clear that a centralized approach will never provide a true global ITRF solution for all GNSS receivers in the world. If the data does not come to the analysis, the only alternative is to bring the analysis to the data. The IAG Dancer project has implemented a distributed GNSS analysis system on the internet in which each receiver can have its own analysis center in the form of a freely distributed JAVA peer-to-peer application. Global parameters for satellite orbits, clocks and polar motion are solved via a distributed least squares solution among all participating receivers. A Dancer instance can run on any computer that has simultaneous access to the receiver data and to the public internet. In the future, such a process may be embedded in the receiver firmware directly. GPS network operators can join the Dancer ITRF realization without having to publish their observation data or estimation products. GPS users can run a Dancer process without contributing to the global solution, to have direct access to the ITRF in near real-time. The Dancer software has been tested on-line since late 2011. A global network of processes has gradually evolved to allow stabilization and tuning of the software in order to reach a fully operational system. This presentation reports on the current performance of the Dancer system, and demonstrates the obvious benefits of distributed analysis of geodetic data in general.
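The peer-to-peer protocol itself is not reproduced here; the toy sketch below only illustrates the distributed least-squares idea the abstract describes: each receiver forms normal equations for the shared global parameters from its own observations, and only those contributions (never the raw data) are combined. All names and the simple summation are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
x_true = np.array([1.0, -2.0, 0.5])  # shared global parameters (toy stand-ins
                                     # for orbit/clock/polar-motion unknowns)

def local_normals(n_obs):
    """One receiver's contribution: N = A^T A, b = A^T y from local data."""
    A = rng.normal(size=(n_obs, x_true.size))
    y = A @ x_true + 0.01 * rng.normal(size=n_obs)
    return A.T @ A, A.T @ y

# Each peer shares only (N_i, b_i); its raw observations stay local.
contributions = [local_normals(50) for _ in range(10)]
N = sum(c[0] for c in contributions)
b = sum(c[1] for c in contributions)
x_hat = np.linalg.solve(N, b)  # global solution from combined normals
```

This is why operators can join the ITRF realization without publishing their observation data: the normal-equation contributions are sufficient for the global solve.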
Twomey, Michèle; Šijački, Ana; Krummrey, Gert; Welzel, Tyson; Exadaktylos, Aristomenis K; Ercegovac, Marko
2018-03-12
Emergency center visits are mostly unscheduled, undifferentiated, and unpredictable. A standardized triage process is an opportunity to obtain real-time data that paints a picture of the variation in acuity found in emergency centers. This is particularly pertinent as the influx of people seeking asylum or in transit mostly present with emergency care needs or first seek help at an emergency center. Triage not only reduces the risk of missing or losing a patient that may be deteriorating in the waiting room but also enables a time-critical response in the emergency care service provision. As part of a joint emergency care system strengthening and patient safety initiative, the Serbian Ministry of Health in collaboration with the Centre of Excellence in Emergency Medicine (CEEM) introduced a standardized triage process at the Clinical Centre of Serbia (CCS). This paper describes four crucial stages that were considered for the integration of a standardized triage process into acute care pathways.
Global Temperature and Salinity Pilot Project
NASA Technical Reports Server (NTRS)
Searle, Ben
1992-01-01
Data exchange and data management programs have been evolving over many years. Within the international community there are two main programs to support the exchange, management, and processing of real-time and delayed-mode data. The Intergovernmental Oceanographic Commission (IOC) operates the International Oceanographic Data and Information Exchange (IODE) program, which coordinates the exchange of delayed-mode data between national oceanographic data centers, World Data Centers, and the user community. The Integrated Global Ocean Services System (IGOSS) is a joint IOC/World Meteorological Organization (WMO) program for the exchange and management of real-time data. These two programs are complemented by mechanisms that have been established within scientific programs to exchange and manage project data sets. In particular, TOGA and WOCE have identified a data management requirement and established the appropriate infrastructure to achieve this. Where GTSPP fits into this existing framework is discussed.
The Schema.org Datasets Schema: Experiences at the National Snow and Ice Data Center
NASA Astrophysics Data System (ADS)
Duerr, R.; Billingsley, B. W.; Harper, D.; Kovarik, J.
2014-12-01
Data discovery is still a major challenge for many users. Relevant data may be located anywhere, and there are currently no universal data registries. Often users start with a simple query through their web browser. But how do you get your data to actually show up near the top of the results? One relatively new way to accomplish this is to use schema.org dataset markup in your data pages. In theory, this provides web crawlers the additional information needed so that a query for data will preferentially return those pages that were marked up accordingly. The National Snow and Ice Data Center recently implemented an initial set of markup in the data set pages returned by its catalog. The Datasets data model, our process, the challenges encountered, and the results will be described.
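For readers unfamiliar with the markup being described, here is a minimal example of a schema.org Dataset record, expressed as the JSON-LD payload (built here as a Python dict). The field values are invented placeholders, not NSIDC's actual markup.

```python
import json

dataset_jsonld = {
    "@context": "https://schema.org",
    "@type": "Dataset",
    "name": "Example Sea Ice Extent Product",          # placeholder title
    "description": "Illustrative dataset description for crawlers.",
    "url": "https://example.org/data/sea-ice",         # placeholder URL
    "temporalCoverage": "2000-01-01/2014-12-31",
    "spatialCoverage": {
        "@type": "Place",
        "geo": {"@type": "GeoShape", "box": "60 -180 90 180"},
    },
}

# The serialized form is embedded in a data set landing page inside a
# <script type="application/ld+json"> ... </script> element.
print(json.dumps(dataset_jsonld, indent=2))
```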
Jayapandian, Catherine P; Chen, Chien-Hung; Bozorgi, Alireza; Lhatoo, Samden D; Zhang, Guo-Qiang; Sahoo, Satya S
2013-01-01
Epilepsy is the most common serious neurological disorder, affecting 50-60 million persons worldwide. Multi-modal electrophysiological data, such as electroencephalography (EEG) and electrocardiography (EKG), are central to effective patient care and clinical research in epilepsy. Electrophysiological data is an example of clinical "big data" consisting of more than 100 multi-channel signals, with recordings from each patient generating 5-10 GB of data. Current approaches to store and analyze signal data using standalone tools, such as Nihon Kohden neurology software, are inadequate to meet the growing volume of data and the need for supporting multi-center collaborative studies with real-time and interactive access. We introduce the Cloudwave platform in this paper, which features a Web-based intuitive signal analysis interface integrated with a Hadoop-based data processing module implemented on clinical data stored in a "private cloud". Cloudwave has been developed as part of the National Institute of Neurological Disorders and Stroke (NINDS) funded multi-center Prevention and Risk Identification of SUDEP Mortality (PRISM) project. The Cloudwave visualization interface provides real-time rendering of multi-modal signals with "montages" for EEG feature characterization over 2 TB of patient data generated at the Case University Hospital Epilepsy Monitoring Unit. Results from performance evaluation of the Cloudwave Hadoop data processing module demonstrate an order of magnitude improvement in performance over 77 GB of patient data. (Cloudwave project: http://prism.case.edu/prism/index.php/Cloudwave)
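Cloudwave's Hadoop module is not reproduced here; the sketch below shows only the generic map/reduce-style pattern underlying it, computing a per-channel feature over chunks of multi-channel signal data in parallel. The channel names, sizes, and the RMS "feature" are stand-ins, not the PRISM/Cloudwave code.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def channel_feature(item):
    """Map step: per-channel RMS amplitude (a stand-in signal feature)."""
    name, samples = item
    return name, float(np.sqrt(np.mean(samples ** 2)))

# Invented 4-channel segment; real EEG/EKG recordings have 100+ channels.
rng = np.random.default_rng(2)
channels = {f"ch{i}": rng.normal(size=10_000) for i in range(4)}

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        features = dict(pool.map(channel_feature, channels.items()))
    print(features)
```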
Landsat surface reflectance quality assurance extraction (version 1.7)
Jones, J.W.; Starbuck, M.J.; Jenkerson, Calli B.
2013-01-01
The U.S. Geological Survey (USGS) Land Remote Sensing Program is developing an operational capability to produce Climate Data Records (CDRs) and Essential Climate Variables (ECVs) from the Landsat Archive to support a wide variety of science and resource management activities from regional to global scale. The USGS Earth Resources Observation and Science (EROS) Center is charged with prototyping systems and software to generate these high-level data products. Various USGS Geographic Science Centers are charged with particular ECV algorithm development and (or) selection as well as the evaluation and application demonstration of various USGS CDRs and ECVs. Because it is a foundation for many other ECVs, the first CDR in development is the Landsat Surface Reflectance Product (LSRP). The LSRP incorporates data quality information in a bit-packed structure that is not readily accessible without postprocessing services performed by the user. This document describes two general methods of LSRP quality-data extraction for use in image processing systems. Helpful hints for the installation and use of software originally developed for manipulation of Hierarchical Data Format (HDF) produced through the National Aeronautics and Space Administration (NASA) Earth Observing System are first provided for users who wish to extract quality data into separate HDF files. Next, steps follow to incorporate these extracted data into an image processing system. Finally, an alternative example is illustrated in which the data are extracted within a particular image processing system.
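As a complement to the HDF extraction steps the document describes, bit-packed quality values can also be unpacked directly with bitwise operations once the QA band is in memory. The bit positions below are illustrative only; the actual LSRP bit assignments are defined in the product documentation.

```python
import numpy as np

def unpack_bit(qa_band, bit):
    """Return a 0/1 mask for a single bit position of a bit-packed QA band."""
    return (qa_band >> bit) & 1

qa = np.array([[0b10, 0b00],
               [0b11, 0b10]], dtype=np.uint16)  # toy 2x2 QA band

fill_mask = unpack_bit(qa, 0)   # assumption: bit 0 flags fill (illustrative)
cloud_mask = unpack_bit(qa, 1)  # assumption: bit 1 flags cloud (illustrative)
```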
NLSI Focus Group on Recovery of Missing ALSEP Data: Status Update for 2012 NLSI Science Forum
NASA Technical Reports Server (NTRS)
Lewis, Lyach R.; Nakamura, Y.; Nagihara, S.; Williams, D. R.; Chi, P.; Taylor, P. T.; Schmidt, G. K.; Hill, H. K.
2012-01-01
On the six Apollo lunar landing missions, the astronauts deployed the Apollo Lunar Surface Experiments Package (ALSEP) science stations, which measured active and passive seismic events, magnetic fields, charged particles, solar wind, heat flow, the diffuse atmosphere, meteorites and their ejecta, lunar dust, etc. Today's investigators are able to extract new information and make new discoveries from the old ALSEP data utilizing recent advances in computer capabilities and new analysis techniques. However, current-day investigators are encountering problems in trying to use the ALSEP data. The data were in formats often not well described in the published reports and contained re-recording anomalies which required tape experts to resolve. To solve these problems, the PDS Lunar Data Node was established at the National Space Science Data Center (NSSDC) at NASA Goddard Space Flight Center (GSFC) in 2008 and is currently in the process of making the existing archived ALSEP data available to current-day investigators in easily usable forms. However, current estimates by NSSDC archivists are that only about 60 percent of the PI-processed ALSEP data and less than 30 percent of the raw experiment ALSEP data of interest to current lunar science investigators are currently in the NSSDC archives.
Enhanced Product Generation at NASA Data Centers Through Grid Technology
NASA Technical Reports Server (NTRS)
Barkstrom, Bruce R.; Hinke, Thomas H.; Gavali, Shradha; Seufzer, William J.
2003-01-01
This paper describes how grid technology can support the ability of NASA data centers to provide customized data products. A combination of grid technology and commodity processors is proposed to provide the bandwidth necessary to perform customized processing of data, with customized data subsetting providing the initial example. This customized subsetting engine can be used to support a new type of subsetting, called phenomena-based subsetting, where data is subsetted based on its association with some phenomenon, such as a mesoscale convective system or a hurricane. This concept is expanded to allow the phenomena to be detected in one type of data, with the subsetting requirements transmitted to the subsetting engine to subset a different type of data. The subsetting requirements are generated by a data mining system and transmitted to the subsetter in the form of an XML feature index that describes the spatial and temporal extent of the phenomena. For this work, a grid-based mining system called the Grid Miner is used to identify the phenomena and generate the feature index. This paper discusses the value of grid technology in facilitating the development of high-performance customized product processing and the coupling of a grid mining system to support phenomena-based subsetting.
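The Grid Miner's actual schema is not given in the abstract; the following is a hypothetical sketch of what an XML feature index carrying the spatial and temporal extent of a detected phenomenon might look like, and how a subsetting engine could read it. All element and attribute names are invented.

```python
import xml.etree.ElementTree as ET

# Invented schema: one <feature> element per detected phenomenon.
index_xml = """
<featureIndex>
  <feature type="mesoscale_convective_system">
    <time start="2002-07-01T00:00Z" end="2002-07-01T06:00Z"/>
    <bbox west="-105.0" south="30.0" east="-95.0" north="40.0"/>
  </feature>
</featureIndex>
"""

root = ET.fromstring(index_xml)
for feature in root.iter("feature"):
    time = feature.find("time").attrib
    bbox = feature.find("bbox").attrib
    # A subsetting engine would clip granules of a *different* data set
    # to this spatial/temporal window, as the abstract describes.
    print(feature.get("type"), time["start"], "to", time["end"], bbox)
```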
NASA Technical Reports Server (NTRS)
Northup, Emily; Benson Early, Amanda; Beach, Aubrey; Wang, Dali; Kusterer, John; Quam, Brandi; Chen, Gao
2015-01-01
The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center is responsible for the ingest, archive, and distribution of NASA Earth Science data in the areas of radiation budget, clouds, aerosols, and tropospheric chemistry. The ASDC specializes in atmospheric data that is important to understanding the causes and processes of global climate change and the consequences of human activities on the climate. The ASDC currently supports more than 44 projects and has over 1,700 archived data sets, which increase daily. ASDC customers include scientists, researchers, federal, state, and local governments, academia, industry, and application users, the remote sensing community, and the general public.
Use-Inspired Data Information Services for NOAA's National Centers for Environmental Information
NASA Astrophysics Data System (ADS)
Owen, T.
2015-12-01
Leveraging environmental data and information to make specific, informed decisions is critical to the Nation's economy, environment, and public safety. The ability to successfully transform past and recent data into environmental intelligence is predicated on the articulation of use-inspired, actionable requirements for product and service development. With the formation of the National Centers for Environmental Information (NCEI), there is a unique opportunity to revolutionize the delivery of information services in support of customer requirements. Such delivery cuts across the disciplines of meteorology, geophysics, and oceanography, as well as regions and sectors of the United States. At NCEI, information services are based on a two-way dialogue that (i) raises awareness of environmental data products and services and (ii) captures user needs for product and service sustainment and development. To this end, NCEI information services has developed a formal process for collecting user needs and translating them into requirements. This process reflects economically prevalent and regionally focused sectors based on Census Bureau classifications.
Venus - Global View Centered at 180 degrees
1996-11-26
This global view of the surface of Venus is centered at 180 degrees east longitude. Magellan synthetic aperture radar mosaics from the first cycle of Magellan mapping, and a 5 degree latitude-longitude grid, are mapped onto a computer-simulated globe to create this image. Data gaps are filled with Pioneer-Venus Orbiter data, or a constant mid-range value. The image was produced by the Solar System Visualization project and the Magellan Science team at the JPL Multimission Image Processing Laboratory. http://photojournal.jpl.nasa.gov/catalog/PIA00478
STS-113 Mission Specialists review data on the P1 Truss
NASA Technical Reports Server (NTRS)
2002-01-01
KENNEDY SPACE CENTER, FLA. - STS-113 Mission Specialists Michael Lopez-Alegria (left) and John Herrington (center) review data on the P1 Integrated Truss Structure with a technician in the Space Station Processing Facility. During the mission, the P1 truss will be attached to the central truss segment, S0 Truss, during spacewalks. The payload also includes the Crew and Equipment Translation Aid (CETA) Cart B that can be used by spacewalkers to move along the truss with equipment. STS-113 is scheduled to launch Oct. 6, 2002.
NASA Astrophysics Data System (ADS)
Coleman, D. F.
2012-12-01
Most research vessels are equipped with satellite Internet services with bandwidths capable of being upgraded to support telepresence technologies and live shore-based participation. This capability can be used for real-time data transmission to shore, where it can be distributed, managed, processed, and archived. The University of Rhode Island Inner Space Center utilizes telepresence technologies and a growing network of command centers on Internet2 to participate live with a variety of research vessels and their ocean observing and sampling systems. High-bandwidth video streaming, voice-over-IP telecommunications, and real-time data feeds and file transfers enable users on shore to take part in the oceanographic expeditions as if they were present on the ship, working in the lab. Telepresence-enabled systematic ocean exploration and similar programs represent a significant and growing paradigm shift that can change the future of seagoing ocean observations using research vessels. The required platform is the ship itself, and users of the technology rely on the ship-based technical teams, but remote and distributed shore-based science users, students, educators, and the general public can now take part by being aboard virtually.
Public Library Automation Report: 1984.
ERIC Educational Resources Information Center
Gotanda, Masae
Data processing was introduced to public libraries in Hawaii in 1973 with a feasibility study which outlined the candidate areas for automation. Since then, the Office of Library Services has automated the order procedures for one of the largest book processing centers for public libraries in the country; created one of the first COM…
Small but Pristine--Lessons for Small Library Automation.
ERIC Educational Resources Information Center
Clement, Russell; Robertson, Dane
1990-01-01
Compares the more positive library automation experiences of a small public library with those of a large research library. Topics addressed include collection size; computer size and the need for outside control of a data processing center; staff size; selection process for hardware and software; and accountability. (LRW)
Computerized Serial Processing System at the University of California, Berkeley
ERIC Educational Resources Information Center
Silberstein, Stephen M.
1975-01-01
The extreme flexibility of the MARC format coupled with the simplicity of a batch-oriented processing system centered around a sequential master file has enabled the University of California, Berkeley, library to gradually build an unusually large serials data base in support of both technical and public services. (Author)
Atmospheric Science Data Center
2014-05-15
The Hayman fire, situated about 65 kilometers southwest of Denver, ... these visualizations were generated as part of operational processing at the Atmospheric Science Data Center at NASA Langley Research ...
Performance enhancement using a balanced scorecard in a Patient-centered Medical Home.
Fields, Scott A; Cohen, Deborah
2011-01-01
Oregon Health & Science University Family Medicine implemented a balanced scorecard within our clinics that embraces the inherent tensions between care quality, financial productivity, and operational efficiency. This data-driven performance improvement process involved: (1) consensus-building around specific indicators to be measured, (2) developing and refining the balanced scorecard, and (3) using the balanced scorecard in the quality improvement process. Developing and implementing the balanced scorecard stimulated an important culture shift among clinics; practice members now actively use data to recognize successes, understand emerging problems, and make changes in response to these problems. Our experience shows how Patient-centered Medical Homes can be enhanced through use of information technology and evidence-based tools that support improved decision making and performance and help practices develop into learning organizations.
NASA Astrophysics Data System (ADS)
Boler, F. M.; Blewitt, G.; Kreemer, C. W.; Bock, Y.; Noll, C. E.; McWhirter, J.; Jamason, P.; Squibb, M. B.
2010-12-01
Space geodetic science and other disciplines using geodetic products have benefited immensely from open sharing of data and metadata from global and regional archives. Ten years ago, Scripps Orbit and Permanent Array Center (SOPAC), the NASA Crustal Dynamics Data Information System (CDDIS), UNAVCO, and other archives collaborated to create the GPS Seamless Archive Centers (GSAC) in an effort to further enable research with the expanding collections of GPS data then becoming available. The GSAC partners share metadata to facilitate data discovery and mining across participating archives and distribution of data to users. This effort was pioneering, but was built on technology that has now been rendered obsolete. As the number of geodetic observing technologies has expanded, the variety of data and data products has grown dramatically, exposing limitations in data product sharing. Through a NASA ROSES project, the three archives (CDDIS, SOPAC and UNAVCO) have been funded to expand the original GSAC capability for multiple geodetic observation types and to simultaneously modernize the underlying technology by implementing web services. The University of Nevada, Reno (UNR) will test the web services implementation by incorporating them into its daily GNSS data processing scheme. The effort will include new methods for quality control of current and legacy data that will be a product of the analysis/testing phase performed by UNR. The quality analysis by UNR will include a report of the stability of station coordinates over time that will enable data users to select sites suitable for their application, for example identifying stations with large seasonal effects. This effort will contribute to an enhanced ability for very large networks to obtain complete data sets for processing.
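The GSAC web service interfaces themselves are not reproduced here; the sketch below only illustrates the federation pattern the abstract describes, issuing one metadata query to several archive endpoints and merging the results. The URLs, query parameters, and JSON response shape are placeholders, not the actual GSAC API.

```python
import json
from urllib.request import urlopen

# Placeholder endpoints; real archives define their own URLs and schemas.
ARCHIVES = [
    "https://archive-a.example.org/gsacapi/site/search",
    "https://archive-b.example.org/gsacapi/site/search",
]

def query_sites(base_url, bbox):
    """Query one archive's (hypothetical) site-metadata service."""
    url = (f"{base_url}?minlat={bbox[0]}&minlon={bbox[1]}"
           f"&maxlat={bbox[2]}&maxlon={bbox[3]}&output=json")
    with urlopen(url) as resp:
        return json.load(resp)

def federated_search(bbox):
    """Merge site metadata from every participating archive."""
    results = []
    for base in ARCHIVES:
        try:
            results.extend(query_sites(base, bbox))
        except OSError:
            pass  # skip archives that are unreachable
    return results
```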
Using Selection Pressure as an Asset to Develop Reusable, Adaptable Software Systems
NASA Technical Reports Server (NTRS)
Berrick, Stephen; Lynnes, Christopher
2007-01-01
The Goddard Earth Sciences Data and Information Services Center (GES DISC) at NASA has over the years developed and honed several reusable architectural components for supporting large-scale data centers with a large customer base. These include a processing system (S4PM) and an archive system (S4PA) based upon a workflow engine called the Simple Scalable Script-based Science Processor (S4P), and an online data visualization and analysis system (Giovanni). These subsystems are currently reused internally in a variety of combinations to implement customized data management on behalf of instrument science teams and other science investigators. Some of these subsystems (S4P and S4PM) have also been reused by other data centers for operational science processing. Our experience has been that development and utilization of robust, interoperable, and reusable software systems can actually flourish in environments defined by heterogeneous commodity hardware systems, an emphasis on value-added customer service, and the continual goal of achieving higher cost efficiencies. The repeated internal reuse that is fostered by such an environment encourages and even forces changes to the software that make it more reusable and adaptable. Allowing and even encouraging such selective pressures on software development has been a key factor in the success of S4P and S4PM, which are now available to the open source community under the NASA Open Source Agreement.
A Technical Survey on Optimization of Processing Geo Distributed Data
NASA Astrophysics Data System (ADS)
Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.
2018-04-01
With the growth of cloud services and technology, geographically distributed data centers are increasingly used to store large amounts of data. Analysis of geo-distributed data is required in various services for data processing, storage of essential information, etc.; processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation, and communication, and the key issues to be dealt with are time efficiency, cost minimization, and utility maximization. This paper describes various optimization methods like end-to-end multiphase, G-MR, etc., using techniques like Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE, and AMP (Ant Colony Optimization) to handle these issues. The various optimization methods and the techniques they use are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving Quality of Service and reducing computation and communication costs; and SAGE achieves performance improvement in processing geo-distributed data sets.
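As a toy illustration of the cost-minimization issue surveyed above, the snippet below picks the data center at which to run a job over inputs spread across sites by minimizing total transfer cost. The cost model and numbers are deliberately simplistic and are not taken from any of the surveyed methods.

```python
# data_gb[site]: input data resident at each site (GB, toy numbers)
data_gb = {"us": 120.0, "eu": 80.0, "asia": 40.0}

# cost[src][dst]: per-GB transfer cost between sites (toy numbers)
cost = {
    "us":   {"us": 0.00, "eu": 0.02, "asia": 0.05},
    "eu":   {"us": 0.02, "eu": 0.00, "asia": 0.04},
    "asia": {"us": 0.05, "eu": 0.04, "asia": 0.00},
}

def placement_cost(dst):
    """Total cost of moving all remote inputs to candidate data center dst."""
    return sum(size * cost[src][dst] for src, size in data_gb.items())

best = min(data_gb, key=placement_cost)
print(best, placement_cost(best))  # -> "us" is cheapest under these numbers
```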
Cloud services on an astronomy data center
NASA Astrophysics Data System (ADS)
Solar, Mauricio; Araya, Mauricio; Farias, Humberto; Mardones, Diego; Wang, Zhong
2016-08-01
The research on computational methods for astronomy performed in the first phase of the Chilean Virtual Observatory (ChiVO) led to the development of functional prototypes, implementing state-of-the-art computational methods and proposing new algorithms and techniques. The ChiVO software architecture is based on the use of the IVOA protocols and standards. These protocols and standards are grouped in layers, with emphasis on the application and data layers, because their basic standards define the minimum operation that a VO should conduct. As a preliminary verification, the current implementation works with a 1 TB data set coming from the reduction of ALMA cycle 0 data. This research was mainly focused on spectroscopic data cubes from ALMA's cycle 0 public data. As the data set grows every month with ALMA's cycle 1 public data, data processing is becoming a major bottleneck for scientific research in astronomy. When designing the ChiVO, we focused on improving both computation and I/O costs, and this led us to configure a data center with 424 high-speed 2.6 GHz cores, 1 PB of storage (distributed across hard disk drives (HDD) and solid-state drives (SSD)), and high-speed InfiniBand interconnect. We are developing a cloud-based e-infrastructure for ChiVO services, in order to have a coherent framework for developing novel web services for on-line data processing in the ChiVO. We are currently parallelizing these new algorithms and techniques using HPC tools to speed up big data processing, and we will report our results in terms of data size, data distribution, number of cores, and response time, in order to compare different processing and storage configurations.
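As a small illustration of one common operation on the spectroscopic cubes mentioned above, the sketch below collapses a (channel, y, x) cube along its spectral axis into a moment-0 (integrated intensity) map. A synthetic array stands in for real ALMA data, which would normally be read from FITS files (e.g., with astropy.io.fits).

```python
import numpy as np

# Synthetic stand-in for a (channel, y, x) spectral cube.
rng = np.random.default_rng(3)
cube = rng.normal(size=(128, 64, 64))

channel_width_kms = 0.5  # illustrative channel width (km/s)
moment0 = cube.sum(axis=0) * channel_width_kms  # integrated intensity map
peak_map = cube.max(axis=0)                     # peak intensity per pixel
```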
The results of an agricultural analysis of the ERTS-1 MSS data at the Johnson Space Center
NASA Technical Reports Server (NTRS)
Bizzell, R. M.; Wade, L. C.; Prior, H. L.; Spiers, B.
1973-01-01
The initial analysis of the ERTS-1 multispectral scanner (MSS) data at the Johnson Space Center (JSC), Houston, Texas, is discussed. The primary data set utilized was the scene over Monterey Bay, California, on July 25, 1972, NASA ERTS ID No. 1002-18134. It was submitted to both computerized and image-interpretative processing. An area in the San Joaquin Valley was subjected to an intensive evaluation of the ability of the data to (1) discriminate between crop types and (2) provide a reasonably accurate area measurement of agricultural features of interest. The results indicate that the ERTS-1 MSS data is capable of providing the identification and area extent of agricultural lands and field crop types.
NASA Astrophysics Data System (ADS)
Agram, P. S.; Gurrola, E. M.; Lavalle, M.; Sacco, G. F.; Rosen, P. A.
2016-12-01
The InSAR Scientific Computing Environment (ISCE) provides both a modular, flexible, and extensible framework for building software components and applications that work together seamlessly, and a toolbox for processing InSAR data into higher-level geodetic image products from a diverse array of radar satellites and aircraft. ISCE easily scales to serve as the SAR processing engine at the core of the NASA JPL Advanced Rapid Imaging and Analysis (ARIA) Center for Natural Hazards, as well as a software toolbox for individual scientists working with SAR data. ISCE is planned as the foundational element in processing NISAR data, enabling a new class of analyses that take greater advantage of the long time and large spatial scales of these data. ISCE in ARIA is also a SAR Foundry for development of new processing components and workflows to meet the needs of both large processing centers and individual users. The ISCE framework contains object-oriented Python components layered to construct Python InSAR components that manage legacy Fortran/C InSAR programs. The Python user interface enables both command-line deployment of workflows and an interactive "sand box" (the Python interpreter) where scientists can "play" with the data. Recent developments in ISCE include the addition of components to ingest Sentinel-1A SAR data (both stripmap and TOPS-mode) and a new workflow for processing the TOPS-mode data. New components are being developed to exploit polarimetric SAR data to provide the ecosystem and land-cover/land-use change communities with rigorous and efficient tools to perform multi-temporal, polarimetric, and tomographic analyses in order to generate calibrated, geocoded, and mosaicked Level-2 and Level-3 products (e.g., maps of above-ground biomass or forest disturbance). ISCE has been downloaded by over 200 users under a license for WinSAR members through the Unavco.org website. Others may apply directly to JPL for a license at download.jpl.nasa.gov.
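This is not ISCE's actual API; the sketch below only illustrates the layered pattern the abstract describes, in which configurable Python components drive legacy compiled routines (faked here with a plain function).

```python
class Component:
    """Generic configurable component in the style described above."""
    defaults = {}

    def __init__(self):
        self.parameters = dict(self.defaults)

    def configure(self, **kwargs):
        self.parameters.update(kwargs)
        return self

def legacy_interfere(reference, secondary, range_looks, azimuth_looks):
    """Stand-in for a wrapped Fortran/C routine."""
    return f"ifg({reference}, {secondary}, {range_looks}x{azimuth_looks})"

class Interferogram(Component):
    defaults = {"range_looks": 1, "azimuth_looks": 1}

    def run(self, reference, secondary):
        # A real framework would invoke the wrapped legacy code here.
        return legacy_interfere(reference, secondary, **self.parameters)

ifg = Interferogram().configure(range_looks=9, azimuth_looks=3).run(
    "scene_1.slc", "scene_2.slc")  # placeholder file names
```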
CINTEX: International Interoperability Extensions to EOSDIS
NASA Technical Reports Server (NTRS)
Graves, Sara J.
1997-01-01
A large part of the research under this cooperative agreement involved working with representatives of the DLR, NASDA, EDC, and NOAA-SAA data centers to propose a set of enhancements and additions to the EOSDIS Version 0 Information Management System (V0 IMS) Client/Server Message Protocol. Helen Conover of ITSL led this effort to provide for an additional geographic search specification (WRS Path/Row), data set- and data center-specific search criteria, search by granule ID, specification of data granule subsetting requests, data set-based ordering, and the addition of URLs to result messages. The V0 IMS Server Cookbook is an evolving document, providing resources and information to data centers setting up a V0 IMS Server. Under this Cooperative Agreement, Helen Conover revised, reorganized, and expanded this document, and converted it to HTML. Ms. Conover has also worked extensively with the IRE RAS data center, CPSSI, in Russia. She served as the primary IMS contact for IRE-CPSSI and as IRE-CPSSI's liaison to other members of the IMS and Web Gateway (WG) development teams. Her documentation of IMS problems in the IRE environment (Sun servers and low network bandwidth) led to a general restructuring of the V0 IMS Client message polling system, to the benefit of all IMS participants. In addition to the IMS server software and documentation, which are generally available to CINTEX sites, Ms. Conover also provided database design documentation and consulting, order tracking software, and hands-on testing and debug assistance to IRE. In the final pre-operational phase of IRE-CPSSI development, she also supplied information on configuration management, including ideas and processes in place at the Global Hydrology Resource Center (GHRC), an EOSDIS data center operated by ITSL.
Global Change Data Center: Mission, Organization, Major Activities, and 2003 Highlights
NASA Technical Reports Server (NTRS)
2004-01-01
Rapid, efficient access to Earth sciences data from satellites and ground validation stations is fundamental to the nation's efforts to understand the effects of global environmental changes and their implications for public policy. It becomes a bigger challenge in the future when data volumes increase from current levels to terabytes per day. Demands on data storage, data access, network throughput, processing power, and database and information management are increased by orders of magnitude, while budgets remain constant and even shrink. The Global Change Data Center's (GCDC) mission is to develop and operate data systems, generate science products, and provide archival and distribution services for Earth science data in support of the U.S. Global Change Program and NASA's Earth Sciences Enterprise. The ultimate product of the GCDC activities is access to data to support research, education, and public policy.
FTOOLS: A FITS Data Processing and Analysis Software Package
NASA Astrophysics Data System (ADS)
Blackburn, J. Kent; Greene, Emily A.; Pence, William
1993-05-01
FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities common to high energy astrophysics data sets. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
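To make the chaining idea concrete, here is a small script driving two classic FTOOLS tasks (fselect to filter rows, fdump to list columns) from Python. The file names, column names, and filter expression are placeholders, and the exact task arguments should be checked against the FTOOLS documentation; this assumes the tasks are installed and on PATH.

```python
import subprocess

# Filter rows of a FITS table with a boolean expression (placeholder names).
subprocess.run(
    ["fselect", "events.fits", "bright.fits", "PHA > 50"],
    check=True,
)

# Dump selected columns of the filtered table to the terminal.
subprocess.run(
    ["fdump", "bright.fits", "STDOUT", "TIME PHA", "-"],
    check=True,
)
```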
Overview of the land analysis system (LAS)
Quirk, Bruce K.; Olseson, Lyndon R.
1987-01-01
The Land Analysis System (LAS) is a fully integrated digital analysis system designed to support remote sensing, image processing, and geographic information systems research. LAS is being developed through a cooperative effort between the National Aeronautics and Space Administration Goddard Space Flight Center and the U.S. Geological Survey Earth Resources Observation Systems (EROS) Data Center. LAS has over 275 analysis modules capable of performing input and output, radiometric correction, geometric registration, signal processing, logical operations, data transformation, classification, spatial analysis, nominal filtering, conversion between raster and vector data types, and display manipulation of image and ancillary data. LAS is currently implemented using the Transportable Applications Executive (TAE). While TAE was designed primarily to be transportable, it still provides the necessary components for a standard user interface, terminal handling, input and output services, display management, and intersystem communications. With TAE the analyst uses the same interface to the processing modules regardless of the host computer or operating system. LAS was originally implemented at EROS on a Digital Equipment Corporation computer system under the Virtual Memory System (VMS) operating system with DeAnza displays and is presently being converted to run on a Gould Power Node and Sun workstation under the Berkeley Software Distribution (BSD) UNIX operating system.
High-temperature Strain Sensor and Mounting Development
NASA Technical Reports Server (NTRS)
Williams, W. Dan; Lei, Jih-Fen; Reardon, Lawrence F.; Krake, Keith; Lemcoe, M. M.; Holmes, Harlan K.; Moore, Thomas C., Sr.
1996-01-01
This report describes Government Work Package Task 29 (GWP29), whose purpose was to develop advanced strain gage technology in support of the National Aerospace Plane (NASP) Program. The focus was on advanced resistance strain gages with a temperature range from room temperature to 2000 F (1095 C) and on methods for reliably attaching these gages to the various materials anticipated for use in the NASP program. Because the NASP program required first-cycle data, the installed gages were not prestabilized or heat treated on the test coupons before first-cycle data were recorded. NASA Lewis Research Center, the lead center for GWP29, continued its development of the palladium-chromium gage; NASA Langley Research Center investigated a new concept gage using Kanthal A1; and the NASA Dryden Flight Research Center chose the well-known BCL-3 iron-chromium-aluminum gage. Each center then tested all three gages. The parameters investigated were apparent strain, drift strain, and gage factor as a function of temperature, plus gage size and survival rate over the test period. Although a significant effort was made to minimize the differences in test equipment between the three test sites (e.g., the same hardware and software were used for final data processing), the centers employed different data acquisition systems and furnace configurations, so some inherent differences may be evident in the final results.
Terrestrial-based lidar beach topography of Fire Island, New York, June 2014
Brenner, Owen T.; Hapke, Cheryl J.; Lee, Kathryn G.; Kimbrow, Dustin R.
2016-02-19
The U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center (SPCMSC) in Florida and the USGS Lower Mississippi-Gulf Water Science Center (LMG WSC) in Montgomery, Alabama, collaborated to gather alongshore terrestrial-based lidar beach elevation data at Fire Island, New York. This high-resolution elevation dataset was collected on June 11, 2014, to characterize beach topography and document ongoing beach evolution and recovery, and is part of the ongoing beach monitoring within the Hurricane Sandy Supplemental Project GS2-2B. This USGS data series includes the resulting processed elevation point data (xyz) and an interpolated digital elevation model (DEM).
NASA Astrophysics Data System (ADS)
Alcott, G.; Kempler, S.; Lynnes, C.; Leptoukh, G.; Vollmer, B.; Berrick, S.
2008-12-01
NASA Earth Sciences Division (ESD), and its preceding Earth science organizations, has made great investments in the development and maintenance of data management systems, as well as information technologies, for the purpose of maximizing the use and usefulness of NASA-generated Earth science data. Earth science information systems, evolving with the maturation and implementation of advancing technologies, reside at NASA data centers, known as Distributed Active Archive Centers (DAACs). With information management system infrastructure in place, and system data and user services already developed and operational, only very small delta costs are required to fully support the data archival, processing, and data support services required by the recommended Decadal Survey missions. This presentation describes the services and capabilities of the Goddard Space Flight Center (GSFC) Earth Sciences Data and Information Services Center (GES DISC) (one of NASA's DAACs) and their potential reuse for these future missions. After 14 years of working with instrument teams and the broader science community, GES DISC personnel, with expertise in atmospheric, water cycle, and atmospheric modeling data and information services, as well as in Earth science missions, information system engineering, operations, and user services, have developed a series of modular, reusable data management components currently in use in several projects. The knowledge and experience gained at the GES DISC lend themselves to providing science-driven information systems in the areas of aerosols, clouds, and atmospheric chemicals to be measured by recommended Decadal Survey missions. Available reusable capabilities include data archive and distribution (Simple, Scalable, Script-based, Science [S4] Product Archive, aka S4PA), data processing (S4 Processor for Measurements, aka S4PM), data search (Mirador), data browse, visualization, and analysis (Giovanni), and data mining services. In addition, recent enhancements, such as Open Geospatial Consortium (OGC), Inc. interoperability implementations and data fusion prototypes, will be described. As a result of the information management systems developed by NASA's GES DISC, not only are large cost savings realized through system reuse, but maintenance costs are also minimized due to the simplicity of their implementations.
NASA Technical Reports Server (NTRS)
1992-01-01
To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.
ERIC Educational Resources Information Center
Anderson, Marcia
2014-01-01
Many people assume that schools and childcare centers are environmentally safe places for children to learn. However, adverse health effects from pest allergy related illnesses or pesticide exposure incidents can demonstrate the need for safer and more effective pest management strategies. The goal of this research is to measure the efficacy of…
Consolidated Environmental Resource Database Information Process (CERDIP)
2015-11-19
Author: Keysar, Elizabeth J. Sponsoring office: Office of the Assistant Secretary of the Army for Installations, Energy and Environment [OASA(IE&E)], 5850 21st Street, Bldg 211, Fort Belvoir, VA 22060-5938. Performing organization: National Defense Center for Energy and Environment (NDCEE), operated by Concurrent Technologies Corporation. Acronyms defined in the report include NDCEE (National Defense Center for Energy and Environment) and NFDD (National Geospatial-Intelligence Agency Feature Data Dictionary).
Ward, Stéphanie; Chow, Amanda Froehlich; Humbert, M Louise; Bélanger, Mathieu; Muhajarine, Nazeem; Vatanparast, Hassan; Leis, Anne
2018-06-01
The Healthy Start-Départ Santé intervention was developed to promote physical activity, gross motor skills, and healthy eating among preschoolers attending childcare centers. This process evaluation aimed to report the reach, effectiveness, adoption, implementation, and maintenance of the Healthy Start-Départ Santé intervention. The RE-AIM framework was used to guide this process evaluation. Data were collected across 140 childcare centers that received the Healthy Start-Départ Santé intervention in the provinces of Saskatchewan and New Brunswick, Canada. Quantitative data were collected through director questionnaires at 10 months and 2 years after the initial training and analyzed using descriptive statistics. Qualitative data were collected throughout the intervention. The intervention was successful in reaching a large number of childcare centers and engaging both rural and urban communities across Saskatchewan and New Brunswick. Centers reported increasing opportunities for physical activity and healthy eating, which were generally low-cost, easy, and quick to implement. However, these changes were rarely transformed into formal written policies. A total of 87% of centers reported using the physical activity resource and 68% using the nutrition resource on a weekly basis. Implementation fidelity of the initial training was high. Of those centers who received the initial training, 75% participated in the mid-point booster session training. Two-year post-implementation questionnaires indicated that 47% of centers were still using the Active Play Equipment kit, while 42% were still using the physical activity resource and 37% were still using the nutrition resource. Key challenges to implementation and sustainability identified during the evaluation were consistent among all of the RE-AIM elements. These challenges included lack of time, lack of support from childcare staff, and low parental engagement. Findings from this study suggest the implementation of Healthy Start-Départ Santé may be improved further by addressing resistance to change and varied levels of engagement among childcare staff. In addition, further work is needed to provide parents with opportunities to engage in HSDS with their children. Copyright © 2018 Elsevier Ltd. All rights reserved.
A phenomenological investigation of science center exhibition developers' expertise development
NASA Astrophysics Data System (ADS)
Young, Denise L.
The purpose of this study was to examine the exhibition developer role in the context of United States (U.S.) science centers, and more specifically, to investigate the way science center exhibition developers build their professional expertise. This research investigated how successfully practicing exhibition developers described their current practices, how they learned to be exhibition developers, and what factors were the most important to the developers in building their professional expertise. Qualitative data was gathered from 10 currently practicing exhibition developers from three science centers: the Exploratorium, San Francisco, California; the Field Museum, Chicago, Illinois; and the Science Museum of Minnesota, St. Paul, Minnesota. In-depth, semistructured interviews were used to collect the data. The study embraced aspects of the phenomenological tradition and sought to derive a holistic understanding of the position and how expertise was built for it. The data were methodically coded and organized into themes prior to analysis. The data analysis found that the position consisted of numerous and varied activities, but the developers' primary roles were advocating for the visitor, storytelling, and mediating information and ideas. They conducted these activities in the context of a team and relied on an established exhibition planning process to guide their work. Developers described a process of learning exhibition development that was experiential in nature. Learning through daily practice was key, though they also consulted with mentors and relied on visitor studies to gauge the effectiveness of their work. They were adept at integrating prior knowledge gained from many aspects of their lives into their practice. The developers described several internal factors that contributed to their expertise development including the desire to help others, a natural curiosity about the world, a commitment to learning, and the ability to accept critique. They expressed high levels of job satisfaction and a desire to continue in the position. The study findings have several implications for the practice of exhibition development, including grounding it in a defined exhibition planning process, providing mentors and other resources for learning, and improving upon museum studies programs by providing avenues for exhibition development practice in the science center context.
Lingard, E A; Berven, S; Katz, J N
2000-06-01
To examine variation in the process of care for total knee arthroplasty (TKA) and to highlight the need for rigorous research into the ideal management of TKA. We hypothesize that variation in the process of care for TKA across and within health care systems is associated with identifiable financial and historical factors. We compared access to TKA and typical postoperative rehabilitation management in 12 orthopedic centers in the United States (4 centers), United Kingdom (6 centers), and Australia (2 centers). We collected data from two sources: 1) Empirical data on length of stay and discharge management were collected as part of a prospective study of the outcomes of primary TKA for patients with a diagnosis of osteoarthritis; 2) Structured qualitative interviews were conducted at each of the participating centers to collect data on academic status and reimbursement structure, as well as waiting times for orthopedic consultation and TKA surgery once it had been scheduled. We demonstrated differences in length of acute hospital stay, use of extended care facilities, home physical therapy, and outpatient physical therapy within our cohort of hospitals. The publicly funded hospitals had a significantly longer acute hospital length of stay (mean 11.8 days, SD 7.1) than the private hospitals (mean 6.6 days, SD 4.1; P < 0.0001). Variation in waiting times was associated with the method of surgeon reimbursement and whether the hospital is publicly funded or private. Patients attending private hospitals waited 1-8 weeks for the first consultation and 2-12 weeks for a surgical date after scheduling. In contrast, patients attending publicly funded hospitals waited 4-12 months for a first consultation and 12-18 months for a surgical date after scheduling. Our observations are consistent with the hypothesis that financial reimbursement schemes influence the management of TKA. Further research needs to be done to quantify effects of varying processes of care on the outcome of TKA surgery across different health care settings. This data would elucidate the optimal management of TKA using objective evidence rather than relying on financial incentives or the preservation of historical practices.
The TESS Science Processing Operations Center
NASA Technical Reports Server (NTRS)
Jenkins, Jon; Twicken, Joseph D.; McCauliff, Sean; Campbell, Jennifer; Sanderfer, Dwight; Lung, David; Mansouri-Samani, Masoud; Girouard, Forrest; Tenenbaum, Peter; Klaus, Todd;
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will conduct a search for Earth’s closest cousins starting in late 2017. TESS will discover approximately 1,000 small planets and measure the masses of at least 50 of these small worlds. The Science Processing Operations Center (SPOC) is being developed based on the Kepler science pipeline and will generate calibrated pixels and light curves on the NAS Pleiades supercomputer. The SPOC will search for periodic transit events and generate validation products for the transit-like features in the light curves. All TESS SPOC data products will be archived to the Mikulski Archive for Space Telescopes (MAST).
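The SPOC's transit search is far more elaborate (a wavelet-based matched filter inherited from the Kepler pipeline); the toy sketch below only demonstrates the underlying idea of phase-folding a light curve over trial periods, using a synthetic signal with an injected transit.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(0.0, 27.0, 2.0 / 60.0 / 24.0)  # ~27 days at 2-minute cadence
flux = 1.0 + 1e-4 * rng.normal(size=t.size)
period_true, depth, duration = 3.7, 5e-4, 0.1  # injected transit (days)
flux[np.abs((t % period_true) - period_true / 2) < duration / 2] -= depth

def folded_depth(period, n_bins=200):
    """Depth statistic: deepest binned point of the phase-folded curve."""
    phase = (t % period) / period
    idx = np.digitize(phase, np.linspace(0.0, 1.0, n_bins + 1)) - 1
    means = np.array([flux[idx == i].mean() for i in range(n_bins)])
    return np.nanmax(1.0 - means)

trial_periods = np.linspace(2.0, 6.0, 800)
best = trial_periods[np.argmax([folded_depth(p) for p in trial_periods])]
print(best)  # recovers a period near the injected 3.7 days
```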
Computerizing an integrated clinical and financial record system in a CMHC: a pilot project.
Newkham, J; Bawcom, L
1981-01-01
The authors describe the three-year experience of a mid-sized community mental health center in designing and installing an automated Staff/Management Information System (S/MIS). The purpose of the project, piloted at the Heart of Texas Region Mental Health Mental Retardation Center (HOTRMHMR) in Waco, Texas, was to examine the feasibility of a comprehensive data system operating at a local level which would create an effective audit trail for services and reimbursement and serve as a viable mechanism for the transmission of center data to a state system via computer tapes. Included in the discussion are agency philosophy, costs, management attitudes, the design and implementation process, and special features which evolved from the fully integrated system.
Optoelectronic scanning system upgrade by energy center localization methods
NASA Astrophysics Data System (ADS)
Flores-Fuentes, W.; Sergiyenko, O.; Rodriguez-Quiñonez, J. C.; Rivas-López, M.; Hernández-Balbuena, D.; Básaca-Preciado, L. C.; Lindner, L.; González-Navarro, F. F.
2016-11-01
A problem of upgrading an optoelectronic scanning system with digital post-processing of the signal based on adequate methods of energy center localization is considered. An improved dynamic triangulation analysis technique is proposed by an example of industrial infrastructure damage detection. A modification of our previously published method aimed at searching for the energy center of an optoelectronic signal is described. Application of the artificial intelligence algorithm of compensation for the error of determining the angular coordinate in calculating the spatial coordinate through dynamic triangulation is demonstrated. Five energy center localization methods are developed and tested to select the best method. After implementation of these methods, digital compensation for the measurement error, and statistical data analysis, a non-parametric behavior of the data is identified. The Wilcoxon signed rank test is applied to improve the result further. For optical scanning systems, it is necessary to detect a light emitter mounted on the infrastructure being investigated to calculate its spatial coordinate by the energy center localization method.
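As background for the methods being compared, here is a minimal sketch of one generic energy-center estimator: the power-weighted temporal centroid of the digitized pulse. This is one textbook-style method, not a reproduction of the five methods or the error-compensation algorithm from the paper.

```python
import numpy as np

def energy_center(signal, dt):
    """Power-weighted temporal centroid of a sampled optoelectronic pulse."""
    t = np.arange(signal.size) * dt
    power = signal.astype(float) ** 2
    return float((t * power).sum() / power.sum())

# Synthetic noisy pulse with true center at 5.0 microseconds.
dt = 0.01e-6
t = np.arange(0.0, 10e-6, dt)
rng = np.random.default_rng(5)
pulse = (np.exp(-0.5 * ((t - 5.0e-6) / 0.4e-6) ** 2)
         + 0.01 * rng.normal(size=t.size))
print(energy_center(pulse, dt))  # close to 5.0e-6
```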
NASA Astrophysics Data System (ADS)
Moreaux, Guilhem; Lemoine, Frank G.; Capdeville, Hugues; Kuzin, Sergey; Otten, Michiel; Štěpánek, Petr; Willis, Pascal; Ferrage, Pascale
2016-12-01
In preparation for the 2014 realization of the International Terrestrial Reference Frame (ITRF2014), the International DORIS Service delivered to the International Earth Rotation and Reference Systems Service a set of 1140 weekly solution files including station coordinates and Earth orientation parameters, covering the time period from 1993.0 to 2015.0. The data come from eleven DORIS satellites: TOPEX/Poseidon, SPOT2, SPOT3, SPOT4, SPOT5, Envisat, Jason-1, Jason-2, Cryosat-2, Saral and HY-2A. In their processing, the six analysis centers which contributed to the DORIS combined solution used the latest time variable gravity models and estimated DORIS ground beacon frequency variations. Furthermore, all the analysis centers but one included phase center variations for ground antennas in their processing. The main objective of this study is to present the combination process and to analyze the impact of the new modeling on the performance of the new combined solution. Comparisons with the IDS contribution to ITRF2008 show that (i) the application of the DORIS ground phase center variations in the data processing shifts the combined scale upward by nearly 7-11 mm and (ii) thanks to the estimation of DORIS ground beacon frequency variations, the new combined solution no longer shows any scale discontinuity in early 2002 and does not present unexplained vertical discontinuities in any station position time series. However, analysis of the new series with respect to ITRF2008 exhibits a scale increase in late 2011 which is not yet explained. A new DORIS Terrestrial Reference Frame was computed to evaluate the intrinsic quality of the new combined solution. That evaluation shows that the addition of data from the new missions equipped with the latest generation of DORIS receiver (Jason-2, Cryosat-2, HY-2A, Saral) results in an internal position consistency of 10 mm or better after mid-2008.
Gareen, Ilana F; Sicks, JoRean D; Jain, Amanda Adams; Moline, Denise; Coffman-Kadish, Nancy
2013-01-01
In clinical trials and epidemiologic studies, information on medical care utilization and health outcomes is often obtained from medical records. For multi-center studies, this information may be gathered by personnel at individual sites or by staff at a central coordinating center. We describe the process used to develop a HIPAA-compliant centralized process to collect medical record information for a large multi-center cancer screening trial. The framework used to select, request, and track medical records incorporated a participant questionnaire with unique identifiers for each medical provider. De-identified information from the questionnaires was sent to the coordinating center indexed by these identifiers. The central coordinating center selected specific medical providers for abstraction and notified sites using these identifiers. The site personnel then linked the identifiers with medical provider information. Staff at the sites collected medical records and provided them for central abstraction. Medical records were successfully obtained and abstracted to ascertain information on outcomes and health care utilization in a study with over 18,000 study participants. Collection of records required for outcomes related to positive screening examinations and lung cancer diagnosis exceeded 90%. Collection of records for all aims was 87.32%. We designed a successful centralized medical record abstraction process that may be generalized to other research settings, including observational studies. The coordinating center received no identifying data. The process satisfied requirements imposed by the Health Insurance Portability and Accountability Act and concerns of site institutional review boards with respect to protected health information.
Jayapandian, Catherine P.; Chen, Chien-Hung; Bozorgi, Alireza; Lhatoo, Samden D.; Zhang, Guo-Qiang; Sahoo, Satya S.
2013-01-01
Epilepsy is the most common serious neurological disorder, affecting 50–60 million persons worldwide. Multi-modal electrophysiological data, such as electroencephalography (EEG) and electrocardiography (EKG), are central to effective patient care and clinical research in epilepsy. Electrophysiological data is an example of clinical “big data”, consisting of more than 100 multi-channel signals with recordings from each patient generating 5–10 GB of data. Current approaches that store and analyze signal data using standalone tools, such as Nihon Kohden neurology software, are inadequate to meet the growing volume of data and the need to support multi-center collaborative studies with real-time and interactive access. In this paper we introduce the Cloudwave platform, which features a Web-based intuitive signal analysis interface integrated with a Hadoop-based data processing module implemented on clinical data stored in a “private cloud”. Cloudwave has been developed as part of the National Institute of Neurological Disorders and Stroke (NINDS) funded multi-center Prevention and Risk Identification of SUDEP Mortality (PRISM) project. The Cloudwave visualization interface provides real-time rendering of multi-modal signals with “montages” for EEG feature characterization over 2 TB of patient data generated at the Case University Hospital Epilepsy Monitoring Unit. Results from performance evaluation of the Cloudwave Hadoop data processing module demonstrate an order of magnitude improvement in performance over 77 GB of patient data. (Cloudwave project: http://prism.case.edu/prism/index.php/Cloudwave) PMID:24551370
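The abstract's key architectural point is that channels and recording segments are independent, so a map-style job can process them in parallel. A minimal local sketch of that idea using a Python process pool; the channel naming, feature choices, and synthetic data are assumptions, and this stands in for, rather than reproduces, Cloudwave's actual Hadoop module.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def channel_features(item):
    # Map-style task: independent summary features for one channel.
    name, samples = item
    x = np.asarray(samples, dtype=float)
    return name, {"rms": float(np.sqrt(np.mean(x ** 2))),
                  "peak_to_peak": float(np.ptp(x))}

if __name__ == "__main__":
    # Synthetic stand-in for one patient's 100-channel recording
    # (1 minute at 256 Hz per channel).
    rng = np.random.default_rng(0)
    record = {f"ch{i:02d}": rng.normal(size=256 * 60) for i in range(100)}
    with ProcessPoolExecutor() as pool:
        features = dict(pool.map(channel_features, record.items()))
    print(features["ch00"])
```

On a cluster, the same per-channel tasks would be distributed across nodes instead of local processes, which is the scaling property the paper's evaluation measures.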
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Lezberg, E. A.
1976-01-01
Atmospheric trace constituents in the upper troposphere and lower stratosphere are now being measured as part of the NASA Global Atmospheric Sampling Program (GASP), using fully automated air sampling systems on board commercial 747 aircraft in routine airline service. Measurements of atmospheric ozone and related meteorological and flight information obtained during several GASP flights in March 1975 are now available from the National Climatic Center, Asheville, North Carolina. In addition to the data from the aircraft, tropopause pressure data obtained from the National Meteorological Center (NMC) archives for the dates of the flights are included. This report is the first in a series describing the data currently available from GASP, including flight routes and dates, instrumentation, the data processing procedures used, and data tape specifications.
Using Selection Pressure as an Asset to Develop Reusable, Adaptable Software Systems
NASA Astrophysics Data System (ADS)
Berrick, S. W.; Lynnes, C.
2007-12-01
The Goddard Earth Sciences Data and Information Services Center (GES DISC) at NASA has over the years developed and honed a number of reusable architectural components for supporting large-scale data centers with a large customer base. These include a processing system (S4PM) and an archive system (S4PA) based upon a workflow engine called the Simple, Scalable, Script-based Science Processor (S4P); an online data visualization and analysis system (Giovanni); and the radically simple and fast data search tool, Mirador. These subsystems are currently reused internally in a variety of combinations to implement customized data management on behalf of instrument science teams and other science investigators. Some of these subsystems (S4P and S4PM) have also been reused by other data centers for operational science processing. Our experience has been that development and utilization of robust, interoperable, and reusable software systems can actually flourish in environments defined by heterogeneous commodity hardware systems, the emphasis on value-added customer service, and continual cost reduction pressures. The repeated internal reuse that is fostered by such an environment encourages and even forces changes to the software that make it more reusable and adaptable. Allowing and even encouraging such selective pressures to software development has been a key factor in the success of S4P and S4PM, which are now available to the open source community under the NASA Open Source Agreement.
Modified Polar-Format Software for Processing SAR Data
NASA Technical Reports Server (NTRS)
Chen, Curtis
2003-01-01
HMPF is a computer program that implements a modified polar-format algorithm for processing data from spaceborne synthetic-aperture radar (SAR) systems. Unlike prior polar-format processing algorithms, this algorithm is based on the assumption that the radar signal wavefronts are spherical rather than planar. The algorithm provides for resampling of SAR pulse data from slant range to radial distance from the center of a reference sphere that is nominally the local Earth surface. Then, invoking the projection-slice theorem, the resampled pulse data are Fourier-transformed over radial distance, arranged in the wavenumber domain according to the acquisition geometry, resampled to a Cartesian grid, and inverse-Fourier-transformed. The result of this process is the focused SAR image. HMPF, and perhaps other programs that implement variants of the algorithm, may give better accuracy than do prior algorithms for processing strip-map SAR data from high altitudes and may give better phase preservation relative to prior polar-format algorithms for processing spotlight-mode SAR data.
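For orientation, the classical planar-wavefront polar-format chain that HMPF modifies can be sketched as follows; the spherical-wavefront resampling described in the abstract is the refinement HMPF adds on top of this. Assumed here: input samples already Fourier-transformed over radial distance, toy grid sizes, and SciPy's generic scattered-data interpolation in place of an optimized resampler.

```python
import numpy as np
from scipy.interpolate import griddata

def polar_format_image(spectra, k_r, angles, n=256):
    # spectra: (n_pulses, n_range) complex samples, already
    #          Fourier-transformed over radial distance, so each sample
    #          sits at polar wavenumber coordinates (k_r, pulse angle).
    # k_r:     (n_range,) radial wavenumbers; angles: (n_pulses,) radians.
    kx = np.outer(np.cos(angles), k_r).ravel()
    ky = np.outer(np.sin(angles), k_r).ravel()

    # Resample the polar-raster samples onto a Cartesian wavenumber grid
    # (the "arranged in the wavenumber domain ... resampled" steps).
    gx, gy = np.meshgrid(np.linspace(kx.min(), kx.max(), n),
                         np.linspace(ky.min(), ky.max(), n))
    grid = griddata((kx, ky), spectra.ravel(), (gx, gy), fill_value=0.0)

    # Inverse 2-D FFT of the gridded wavenumber data yields the image.
    return np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(grid)))
```

HMPF's contribution is upstream of this sketch: resampling each pulse from slant range to radial distance from a reference sphere, so that the spherical wavefront assumption holds before the projection-slice step.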
Data sets for snow cover monitoring and modelling from the National Snow and Ice Data Center
NASA Astrophysics Data System (ADS)
Holm, M.; Daniels, K.; Scott, D.; McLean, B.; Weaver, R.
2003-04-01
A wide range of snow cover monitoring and modelling data sets are pending or are currently available from the National Snow and Ice Data Center (NSIDC). In-situ observations support validation experiments that enhance the accuracy of remote sensing data. In addition, remote sensing data are available in near-real time, providing coarse-resolution snow monitoring capability. Time series data beginning in 1966 are valuable for modelling efforts. NSIDC holdings include SMMR and SSM/I snow cover data, MODIS snow cover extent products, in-situ and satellite data collected for NASA's recent Cold Land Processes Experiment, and soon-to-be-released AMSR-E passive microwave products. The AMSR-E and MODIS sensors are part of NASA's Earth Observing System flying on the Terra and Aqua satellites. Characteristics of these NSIDC-held data sets, appropriateness of products for specific applications, and data set access and availability will be presented.
NASA Astrophysics Data System (ADS)
Early, A. B.; Chen, G.; Beach, A. L., III; Northup, E. A.
2016-12-01
NASA has conducted airborne tropospheric chemistry studies for over three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gases and aerosol properties. The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center in Hampton, Virginia originally developed the Toolsets for Airborne Data (TAD) web application in September 2013 to meet the user community's needs for manipulating aircraft data for scientific research on issues relevant to climate change and air quality. The analysis of airborne data typically requires data subsetting, which can be challenging and resource intensive for end users. In an effort to streamline this process, the TAD toolset enhancements will include new data subsetting features and updates to the current database model. These will include two subsetters: temporal and spatial, and vertical profile. The temporal and spatial subsetter will allow users to focus on data from a specific location and/or time period. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. This effort will automate the typically labor-intensive manual data subsetting process, providing users with data tailored to their specific research interests. The development of these enhancements will be discussed in this presentation.
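Conceptually, both subsetters reduce to row predicates over the observation table. A pandas sketch of what they might look like; the column names ('time', 'lat', 'lon', 'alt') and the climb threshold are assumed for illustration, not TAD's actual schema or API.

```python
import pandas as pd

def temporal_spatial_subset(df, t0, t1, lat_bounds, lon_bounds):
    # Keep observations inside a time window and a lat/lon bounding box.
    mask = (df["time"].between(t0, t1)
            & df["lat"].between(*lat_bounds)
            & df["lon"].between(*lon_bounds))
    return df[mask]

def vertical_profile_subset(df, min_climb_m=1000.0):
    # Keep legs where the aircraft ascends or descends through at least
    # min_climb_m, approximating one profile spiral.
    climbing = df["alt"].diff().fillna(0.0) >= 0
    leg_id = (climbing != climbing.shift()).cumsum()  # new leg on sign change
    span = df.groupby(leg_id)["alt"].transform(lambda a: a.max() - a.min())
    return df[span >= min_climb_m]
```

Applied to a campaign file read into a DataFrame, the first function yields the region/time subset and the second keeps only substantial ascent or descent legs.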
GOME and Sciamachy data access using the Netherlands Sciamachy Data Center
NASA Astrophysics Data System (ADS)
Som de Cerff, Wim; de Vreede, Ernst; van de Vegte, John; van Hees, Ricard; van der Neut, Ian; Stammes, Piet; Pieters, Ankie; van der A, Ronald
2010-05-01
The Netherlands Sciamachy Data Center (NL-SCIA-DC) has provided access to satellite data from the GOME and Sciamachy instruments for over 10 years. GOME and Sciamachy both measure trace gases such as ozone, methane, NO2 and aerosols, which are important for climate and air quality monitoring. A new release of the NL-SCIA-DC (February 2010) provides an improved processing and archiving structure and an improved user interface. This Java Webstart application allows the user to browse, query and download GOME and Sciamachy data products, including KNMI and SRON GOME and Sciamachy products (cloud products, CH4, NO2, CO). Data can be searched at file and pixel level, and can be graphically displayed. The huge database containing all pixel information of GOME and Sciamachy is unique and allows specific selections, e.g., selecting cloud-free pixels. Ordered data is delivered by FTP or email. The data available span the mission times of GOME and Sciamachy and are constantly updated as new data become available. Future upgrades to the data services will offer additional functionality to end-users of Sciamachy data. One of the functionalities provided will be the possibility to select and process Sciamachy products using different data processors, using Grid technology. This technology has been successfully researched and will be made operationally available in the near future.
DOT National Transportation Integrated Search
2008-01-01
Designing and implementing effective traffic safety policies : requires data-driven analysis of traffic collisions. To help in the : policy-making process, the Indiana University Public Policy : Institute, Center for Criminal Justice Research (CCJR o...
DOT National Transportation Integrated Search
2006-01-01
Designing and implementing effective traffic safety policies : requires data-driven analysis of traffic collisions. To help in the : policy-making process, the Indiana University Public Policy : Institute, Center for Criminal Justice Research (CCJR o...
76 FR 7868 - Center for Scientific Review; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
... Special Emphasis Panel, Small Business: Computational Biology, Image Processing and Data Mining. Date... for Scientific Review Special Emphasis Panel, Quick Trial on Imaging and Image-Guided Intervention...
DOT National Transportation Integrated Search
1998-01-01
Designing and implementing effective traffic safety policies : requires data-driven analysis of traffic collisions. To help in the : policy-making process, the Indiana University Public Policy : Institute, Center for Criminal Justice Research (CCJR o...
DOT National Transportation Integrated Search
2007-01-01
Designing and implementing effective traffic safety policies : requires data-driven analysis of traffic collisions. To help in the : policy-making process, the Indiana University Public Policy : Institute, Center for Criminal Justice Research (CCJR o...
1978 bibliography of atomic and molecular processes. [Bibliography]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This annotated bibliography lists 2557 works on atomic and molecular processes reported in publications dated 1978. Sources include scientific journals, conference proceedings, and books. Each entry is designated by one or more of the 114 categories of atomic and molecular processes used by the Controlled Fusion Atomic Data Center to classify data. Also indicated is whether the work was experimental or theoretical, what energy range was covered, what reactants were investigated, and the country of origin of the first author. Following the bibliographical listing are indexes of reactants and authors.
1979 bibliography of atomic and molecular processes. [Bibliography]
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1980-08-01
This annotated bibliography lists 2146 works on atomic and molecular processes reported in publications dated 1979. Sources include scientific journals, conference proceedings, and books. Each entry is designated by one or more of the 114 categories of atomic and molecular processes used by the Controlled Fusion Atomic Data Center, Oak Ridge National Laboratory, to classify data. Also indicated is whether the work was experimental or theoretical, what energy range was covered, what reactants were investigated, and the country of origin of the first author. Following the bibliographical listing are indexes of reactants and authors.
ERIC Educational Resources Information Center
National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.
This report is a part of the program of the National Center for Health Statistics to provide current statistics as baseline data for the evaluation, planning, and administration of health programs. Part I presents data concerning the occupational fields: (1) administration, (2) anthropology and sociology, (3) data processing, (4) basic sciences,…
ERIC Educational Resources Information Center
Association for Educational Data Systems, Washington, DC.
The theme of the 1976 convention of the Association for Educational Data Systems (AEDS) was educational data processing and information systems. Special attention was focused on educational management information systems, computer centers and networks, computer assisted instruction, computerized testing, guidance, and higher education. This…
Operating tool for a distributed data and information management system
NASA Astrophysics Data System (ADS)
Reck, C.; Mikusch, E.; Kiemle, S.; Wolfmüller, M.; Böttcher, M.
2002-07-01
The German Remote Sensing Data Center has developed the Data Information and Management System (DIMS), which provides multi-mission ground system services for earth observation product processing, archiving, ordering and delivery. DIMS makes successful use of the newest technologies within its services. This paper presents the solution taken to simplify operations tasks for this large, distributed system.
Nimbus/TOMS Science Data Operations Support
NASA Technical Reports Server (NTRS)
Childs, Jeff
1998-01-01
1. Participate in and provide analysis of laboratory and in-flight calibration of UV sensors used for space observations of backscattered UV radiation. 2. Provide support to the TOMS Science Operations Center, including generating instrument command lists and analysis of TOMS health and safety data. 3. Develop and maintain software and algorithms designed to capture and process raw spacecraft and instrument data, convert the instrument output into measured radiance and irradiances, and produce scientifically valid products. 4. Process the TOMS data into Level 1, Level 2, and Level 3 data products. 5. Provide analysis of the science data products in support of NASA GSFC Code 916's research.
Nimbus/TOMS Science Data Operations Support
NASA Technical Reports Server (NTRS)
1998-01-01
Projected goals include the following: (1) Participate in and provide analysis of laboratory and in-flight calibration of UV sensors used for space observations of backscattered UV radiation; (2) Provide support to the TOMS Science Operations Center, including generating instrument command lists and analysis of TOMS health and safety data; (3) Develop and maintain software and algorithms designed to capture and process raw spacecraft and instrument data, convert the instrument output into measured radiance and irradiances, and produce scientifically valid products; (4) Process the TOMS data into Level 1, Level 2, and Level 3 data products; (5) Provide analysis of the science data products in support of NASA GSFC Code 916's research.
User's guide to the Nimbus-4 backscatter ultraviolet experiment data sets
NASA Technical Reports Server (NTRS)
Lowrey, B. E.
1978-01-01
The first year's data from the Nimbus 4 backscatter ultraviolet (BUV) experiment have been archived in the National Space Science Data Center (NSSDC). Backscattered ultraviolet radiances measured by the satellite were used to compute global total ozone for the period April 1970 - April 1971. The data sets now in the NSSDC are the results obtained by the Ozone Processing Team, which processed the data to obtain the best possible data quality. Four basic sets of data are available in the NSSDC, representing various stages of processing. The primary data base contains organized and cleaned data in telemetry units. The radiance data have had most of the engineering calibrations applied. The detailed total ozone data are the result of computations to obtain the total ozone; the Compressed Total Ozone data set is a convenient condensation of the detailed total ozone. Product data sets are also included.
Staudacher, Ingo; Nalpathamkalam, Asha Roy; Uhlmann, Lorenz; Illg, Claudius; Seehausen, Sebastian; Akhavanpoor, Mohammadreza; Buchauer, Anke; Geis, Nicolas; Lugenbiel, Patrick; Schweizer, Patrick A; Xynogalos, Panagiotis; Zylla, Maura M; Scholz, Eberhard; Zitron, Edgar; Katus, Hugo A; Thomas, Dierk
2017-10-11
Increasing numbers of patients with cardiovascular implantable electronic devices (CIEDs) and limited follow-up capacities highlight unmet challenges in clinical electrophysiology. Integrated software (MediConnect®) enabling fully digital processing of device interrogation data has been commercially developed to facilitate follow-up visits. We sought to assess the feasibility of fully digital data processing (FDDP) during ambulatory device follow-up in a high-volume tertiary hospital to provide guidance for future users of FDDP software. A total of 391 patients (mean age, 70 years) presenting to the outpatient department for routine device follow-up were analyzed (pacemaker, 44%; implantable cardioverter defibrillator, 39%; cardiac resynchronization therapy device, 16%). Quality of data transfer and follow-up duration were compared between digital (n = 265) and manual (n = 126) processing of device data. Digital data import was successful, complete and correct in 82% of cases when early software versions were used. When using the most recent software version, the rate of successful digital data import increased to 100%. Software-based import of interrogation data was complete and without failure in 97% of cases. The mean duration of a follow-up visit did not differ between the two groups (digital, 18.7 min vs. manual data transfer, 18.2 min). FDDP software was successfully implemented into the ambulatory follow-up of patients with implanted pacemakers and defibrillators. Digital data import into electronic patient management software was feasible and supported the physician's workflow. The total duration of follow-up visits comprising technical device interrogation and clinical actions was not affected in the present tertiary center outpatient cohort.
NASA Astrophysics Data System (ADS)
Hartkorn, O. A.; Ritter, B.; Meskers, A. J. H.; Miles, O.; Russwurm, M.; Scully, S.; Roldan, A.; Juestel, P.; Reville, V.; Lupu, S.; Ruffenach, A.
2014-12-01
The Earth's magnetosphere is formed as a consequence of the interaction between the planet's magnetic field and the solar wind, a continuous plasma stream from the Sun. A number of different solar wind phenomena have been studied over the past forty years with the intention of understanding and forecasting solar behavior and space weather. In particular, Earth-bound interplanetary coronal mass ejections (CMEs) can significantly disturb the Earth's magnetosphere for a short time and cause geomagnetic storms. We present a mission concept consisting of six spacecraft that are equally spaced in a heliocentric orbit at 0.72 AU. These spacecraft will monitor the plasma properties, the magnetic field's orientation and magnitude, and the 3D propagation trajectory of CMEs heading for Earth. The primary objective of this mission is to increase space weather forecasting time by means of a near real-time information service based upon in-situ and remote measurements of the CME properties. The mission's secondary objective is the improvement of scientific space weather models. In-situ measurements are performed using a Solar Wind Analyzer instrumentation package and fluxgate magnetometers. For remote measurements, coronagraphs are employed. The proposed instruments originate from other space missions, with the intention of reducing mission costs and streamlining the mission design process. Communication with the six identical spacecraft is realized via a deep space network consisting of six ground stations. This network provides an information service that is in uninterrupted contact with the spacecraft, allowing for continuous space weather monitoring. A dedicated data processing center will handle all the data and forward the processed data to the SSA Space Weather Coordination Center. This organization will inform the general public through a space weather forecast. The data processing center will additionally archive the data for the scientific community. This mission concept allows for major advances in space weather forecasting and the scientific modeling of space weather.
NASA Astrophysics Data System (ADS)
Doyle, Paul; Mtenzi, Fred; Smith, Niall; Collins, Adrian; O'Shea, Brendan
2012-09-01
The scientific community is in the midst of a data analysis crisis. The increasing capacity of scientific CCD instrumentation and its falling cost are contributing to an explosive generation of raw photometric data. This data must go through a process of cleaning and reduction before it can be used for high precision photometric analysis. Many existing data processing pipelines either assume a relatively small dataset or are batch processed by a High Performance Computing centre. A radical overhaul of these processing pipelines is required to allow reduction and cleaning of terabyte-sized datasets at near capture rates using an elastic processing architecture. The ability to access computing resources and to allow them to grow and shrink as demand fluctuates is essential, as is exploiting the parallel nature of the datasets. A distributed data processing pipeline is required. It should incorporate lossless data compression, allow for data segmentation, and support processing of data segments in parallel. Academic institutes can collaborate and provide an elastic computing model without the requirement for large centralized high performance computing data centers. This paper demonstrates how an order of magnitude improvement in overall processing time has been achieved using the "ACN pipeline", a distributed pipeline spanning multiple academic institutes.
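A toy illustration of the three ingredients the abstract calls for (lossless compression, data segmentation, and parallel processing of independent segments); the pickle wire format, tile sizes, and the placeholder bias subtraction are assumptions, not the ACN pipeline's actual design.

```python
import bz2
import pickle
from multiprocessing import Pool

def reduce_segment(blob):
    # Decompress one image segment, apply a reduction step, recompress.
    # The "reduction" here is a toy bias subtraction standing in for the
    # dark/flat calibration a real photometric pipeline would perform.
    segment = pickle.loads(bz2.decompress(blob))
    reduced = [[px - 100 for px in row] for row in segment]
    return bz2.compress(pickle.dumps(reduced))

if __name__ == "__main__":
    # Hypothetical stand-in for tiles of one large CCD frame.
    tiles = [[[100 + (i + j) % 5 for j in range(64)] for i in range(64)]
             for _ in range(16)]
    blobs = [bz2.compress(pickle.dumps(t)) for t in tiles]  # lossless storage
    with Pool() as pool:  # segments are independent, so they reduce in parallel
        out = pool.map(reduce_segment, blobs)
    print(len(out), "segments reduced")
```

In an elastic deployment, the worker pool would span machines at several institutes and grow or shrink with demand, which is the property the paper exploits.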
Monitoring Disasters by Use of Instrumented Robotic Aircraft
NASA Technical Reports Server (NTRS)
Wegener, Steven S.; Sullivan, Donald V.; Dunagan, Steven E.; Brass, James A.; Ambrosia, Vincent G.; Buechel, Sally W.; Stoneburner, Jay; Schoenung, Susan M.
2009-01-01
Efforts are under way to develop data-acquisition, data-processing, and data-communication systems for monitoring disasters over large geographic areas by use of uninhabited aerial systems (UAS) robotic aircraft that are typically piloted by remote control. As integral parts of advanced, comprehensive disaster-management programs, these systems would provide (1) real-time data that would be used to coordinate responses to current disasters and (2) recorded data that would be used to model disasters for the purpose of mitigating the effects of future disasters and planning responses to them. The basic idea is to equip UAS with sensors (e.g., conventional video cameras and/or multispectral imaging instruments) and to fly them over disaster areas, where they could transmit data by radio to command centers. Transmission could occur along direct line-of-sight paths and/or along over-the-horizon paths by relay via spacecraft in orbit around the Earth. The initial focus is on demonstrating systems for monitoring wildfires; other disasters to which these developments are expected to be applicable include floods, hurricanes, tornadoes, earthquakes, volcanic eruptions, leaks of toxic chemicals, and military attacks. The figure depicts a typical system for monitoring a wildfire. In this case, instruments aboard a UAS would generate calibrated thermal-infrared digital image data of terrain affected by a wildfire. The data would be sent by radio via satellite to a data-archive server and image-processing computers. In the image-processing computers, the data would be rapidly geo-rectified for processing by one or more of a large variety of geographic-information-system (GIS) and/or image-analysis software packages. After processing by this software, the data would be both stored in the archive and distributed through standard Internet connections to a disaster-mitigation center, an investigator, and/or command center at the scene of the fire. Ground assets (in this case, firefighters and/or firefighting equipment) would also be monitored in real time by use of Global Positioning System (GPS) units and radio communication links between the assets and the UAS. In this scenario, the UAS would serve as a data-relay station in the sky, sending packets of information concerning the locations of assets to the image-processing computer, wherein this information would be incorporated into the geo-rectified images and maps. Hence, the images and maps would enable command-center personnel to monitor locations of assets in real time and in relation to locations affected by the disaster. Optionally, in case of a disaster that disrupted communications, the UAS could be used as an airborne communication relay station to partly restore communications to the affected area. A prototype of a system of this type was demonstrated in a project denoted the First Response Experiment (Project FiRE). In this project, a controlled outdoor fire was observed by use of a thermal multispectral scanning imager on a UAS that delivered image data to a ground station via a satellite uplink/downlink telemetry system. At the ground station, the image data were geo-rectified in nearly real time for distribution via the Internet to firefighting managers. Project FiRE was deemed a success in demonstrating several advances essential to the eventual success of the continuing development effort.
1983-03-01
...System are: order processing coordinators, order processing management, credit and collections, accounts receivable, support management, admin management... or sales secretary, then by order processing (OP). Phone-in orders go directly to OP. The information is next transcribed onto an order entry... ORDER PROCESSING: The central systems validate the order items and codes by processing them against the customer file, the product or parts file, and...
NASA Technical Reports Server (NTRS)
Mahmot, Ron; Koslosky, John T.; Beach, Edward; Schwarz, Barbara
1994-01-01
The Mission Operations Division (MOD) at Goddard Space Flight Center builds Mission Operations Centers which are used by Flight Operations Teams to monitor and control satellites. Reducing system life cycle costs through software reuse has always been a priority of the MOD. The MOD's Transportable Payload Operations Control Center (TPOCC) development team established an extensive library of 14 subsystems with over 100,000 delivered source instructions of reusable, generic software components. To date, nine TPOCC-based control centers support 11 satellites and have achieved an average software reuse level of more than 75 percent. This paper shares experiences of how the TPOCC building blocks were developed and how building block developers, mission development teams, and users are all part of the process.
Drew, L.J.; Grunsky, E.C.; Sutphin, D.M.; Woodruff, L.G.
2010-01-01
Soils collected in 2004 along two North American continental-scale transects were subjected to geochemical and mineralogical analyses. In previous interpretations of these analyses, data were expressed in weight percent and parts per million, and thus were subject to the effect of the constant-sum phenomenon. In a new approach to the data, this effect was removed by using centered log-ratio transformations to 'open' the mineralogical and geochemical arrays. Multivariate analyses, including principal component and linear discriminant analyses, of the centered log-ratio data reveal effects of soil-forming processes, including soil parent material, weathering, and soil age, at the continental scale of the data arrays that were not readily apparent in the more conventionally presented data. Linear discriminant analysis of the data arrays indicates that the majority of the soil samples collected along the transects can be more successfully classified into Level 1 ecological regional-scale classes by soil geochemistry than by soil mineralogy. A primary objective of this study is to discover and describe, in a parsimonious way, geochemical processes that are both independent and inter-dependent and manifested through compositional data, including estimates of the elements and corresponding mineralogy.
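For reference, the centered log-ratio (clr) transform that 'opens' a D-part composition x is clr(x)_i = ln(x_i / g(x)), where g(x) is the geometric mean of the parts. A minimal numpy sketch; the example soil composition is invented.

```python
import numpy as np

def clr(x):
    # clr(x)_i = ln(x_i / g(x)), with g(x) the geometric mean of the parts;
    # subtracting the mean of the logs is equivalent and numerically simpler.
    # All parts must be strictly positive.
    logx = np.log(np.asarray(x, dtype=float))
    return logx - logx.mean(axis=-1, keepdims=True)

# Invented 4-part composition in weight percent (sums to 100)
print(clr([60.0, 25.0, 10.0, 5.0]))  # transformed components sum to ~0
```

Because each clr row sums to zero, the constant-sum constraint of weight-percent data is removed before principal component or discriminant analysis is applied.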
Science with High Spatial Resolution Far-Infrared Data
NASA Technical Reports Server (NTRS)
Terebey, Susan (Editor); Mazzarella, Joseph M. (Editor)
1994-01-01
The goal of this workshop was to discuss new science and techniques relevant to high spatial resolution processing of far-infrared data, with particular focus on high resolution processing of IRAS data. Users of the maximum correlation method, maximum entropy, and other resolution enhancement algorithms applicable to far-infrared data gathered at the Infrared Processing and Analysis Center (IPAC) for two days in June 1993 to compare techniques and discuss new results. During a special session on the third day, interested astronomers were introduced to IRAS HIRES processing, which is IPAC's implementation of the maximum correlation method to the IRAS data. Topics discussed during the workshop included: (1) image reconstruction; (2) random noise; (3) imagery; (4) interacting galaxies; (5) spiral galaxies; (6) galactic dust and elliptical galaxies; (7) star formation in Seyfert galaxies; (8) wavelet analysis; and (9) supernova remnants.
NASA Technical Reports Server (NTRS)
2001-01-01
Howmet Research Corporation was the first to commercialize an innovative cast metal technology developed at Auburn University, Auburn, Alabama. With funding assistance from NASA's Marshall Space Flight Center, Auburn University's Solidification Design Center (a NASA Commercial Space Center) developed accurate nickel-based superalloy data for casting molten metals. Through a contract agreement, Howmet used the data to develop computer model predictions of molten metals and molding materials in cast metal manufacturing. Howmet Metal Mold (HMM), part of Howmet Corporation Specialty Products, of Whitehall, Michigan, utilizes metal molds to manufacture net shape castings in various alloys and amorphous metal (metallic glass). By implementing the thermophysical property data developed by the Auburn researchers, Howmet employs its newly developed computer model predictions to offer customers high-quality, low-cost products with significantly improved mechanical properties. Components fabricated with this new process replace components originally made from forgings or billet. Compared with products manufactured through traditional casting methods, Howmet's computer-modeled castings come out on top.
Implementing Personalized Medicine in a Cancer Center
Fenstermacher, David A.; Wenham, Robert M.; Rollison, Dana E.; Dalton, William S.
2011-01-01
In 2006, the Moffitt Cancer Center partnered with patients, community clinicians, industry, academia, and seventeen hospitals in the United States to begin a personalized cancer care initiative called Total Cancer Care™. Total Cancer Care was designed to collect tumor specimens and clinical data throughout a patient’s lifetime with the goal of finding “the right treatment, for the right patient, at the right time.” Because Total Cancer Care is a partnership with the patient and involves collection of clinical data and tumor specimens for research purposes, a formal protocol and patient consent process was developed and an information technology platform was constructed to provide a robust “warehouse” for clinical and molecular profiling data. To date, over 76,000 cancer patients from Moffitt and consortium medical centers have been enrolled in the protocol. The TCC initiative has developed many of the capabilities and resources that are building the foundation of personalized medicine. PMID:22157297
DOT National Transportation Integrated Search
2018-01-09
As required by Federal Aviation Administration Order 8110.4C, Type Certification Process, the Volpe Center Acoustics Facility (Volpe), in support of the Federal Aviation Administration Office of Environment and Energy (AEE), has completed valid...
DOT National Transportation Integrated Search
2017-08-18
As required by Federal Aviation Administration (FAA) Order 8110.4C: Type Certification Process (most recently revised as Change 5, 20 December, 2011), the Volpe Center Acoustics Facility (Volpe), in support of the FAA Office of Environmen...
Quality Space and Launch Requirements, Addendum to AS9100C
2015-05-08
8.9.1 Statistical Process Control (SPC). Acronyms: SMC, Space and Missile Systems Center; SME, Subject Matter Expert; SOW, Statement of Work; SPC, Statistical Process Control; SPO, System Program Office; SRP... "...occur without any individual data point exceeding the control limits. Control limits are developed using standard statistical methods or other approved..."
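As background to the SPC clause, one common way control limits are developed is the 3-sigma X-bar chart. A minimal sketch under that assumption; the addendum itself does not prescribe this particular estimator, and classical charts often use range-based constants instead.

```python
import numpy as np

def xbar_limits(subgroups):
    # 3-sigma limits for an X-bar chart. 'subgroups' is a 2-D array with
    # one row per subgroup of repeated measurements. Using the standard
    # deviation of subgroup means is one simple estimator; classical
    # charts often use range-based (R-bar / d2) estimators instead.
    means = np.asarray(subgroups, dtype=float).mean(axis=1)
    center = means.mean()
    sigma = means.std(ddof=1)
    return center - 3 * sigma, center, center + 3 * sigma

rng = np.random.default_rng(1)
lcl, center, ucl = xbar_limits(rng.normal(10.0, 0.2, size=(25, 5)))
print(lcl, center, ucl)  # lower limit, center line, upper limit
```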
Alternative Fuels Data Center: Propane Production and Distribution
Propane is a by-product of natural gas processing, produced from liquid components recovered during natural gas processing; these components include ethane... A diagram of propane production and distribution shows propane originating from three sources: 1) gas well and gas plant, 2) oil well and...
Sodium content of popular commercially processed and restaurant foods in the United States
USDA-ARS?s Scientific Manuscript database
Nutrient Data Laboratory (NDL) of the U.S. Department of Agriculture (USDA), in close collaboration with the U.S. Centers for Disease Control and Prevention, is monitoring the sodium content of commercially processed and restaurant foods in the United States. The main purpose of this manuscript is to prov...
USDA-ARS?s Scientific Manuscript database
Precipitation and temperature are two primary drivers that significantly affect hydrologic processes in a watershed. A network of land-based National Climatic Data Center (NCDC) weather stations has been typically used as a primary source of climate input for agro-ecosystem models. However, the ne...
NASA Technical Reports Server (NTRS)
1988-01-01
This report presents the on-going research activities at the NASA Marshall Space Flight Center for the year 1988. The subjects presented are space transportation systems, shuttle cargo vehicle, materials processing in space, environmental data base management, microgravity science, astronomy, astrophysics, solar physics, magnetospheric physics, aeronomy, atomic physics, rocket propulsion, materials and processes, telerobotics, and space systems.
Viirs Land Science Investigator-Led Processing System
NASA Astrophysics Data System (ADS)
Devadiga, S.; Mauoka, E.; Roman, M. O.; Wolfe, R. E.; Kalb, V.; Davidson, C. C.; Ye, G.
2015-12-01
The objective of the NASA's Suomi National Polar Orbiting Partnership (S-NPP) Land Science Investigator-led Processing System (Land SIPS), housed at the NASA Goddard Space Flight Center (GSFC), is to produce high quality land products from the Visible Infrared Imaging Radiometer Suite (VIIRS) to extend the Earth System Data Records (ESDRs) developed from NASA's heritage Earth Observing System (EOS) Moderate Resolution Imaging Spectroradiometer (MODIS) onboard the EOS Terra and Aqua satellites. In this paper we will present the functional description and capabilities of the S-NPP Land SIPS, including system development phases and production schedules, timeline for processing, and delivery of land science products based on coordination with the S-NPP Land science team members. The Land SIPS processing stream is expected to be operational by December 2016, generating land products either using the NASA science team delivered algorithms, or the "best-of" science algorithms currently in operation at NASA's Land Product Evaluation and Algorithm Testing Element (PEATE). In addition to generating the standard land science products through processing of the NASA's VIIRS Level 0 data record, the Land SIPS processing system is also used to produce a suite of near-real time products for NASA's application community. Land SIPS will also deliver the standard products, ancillary data sets, software and supporting documentation (ATBDs) to the assigned Distributed Active Archive Centers (DAACs) for archival and distribution. Quality assessment and validation will be an integral part of the Land SIPS processing system; the former being performed at Land Data Operational Product Evaluation (LDOPE) facility, while the latter under the auspices of the CEOS Working Group on Calibration & Validation (WGCV) Land Product Validation (LPV) Subgroup; adopting the best-practices and tools used to assess the quality of heritage EOS-MODIS products generated at the MODIS Adaptive Processing System (MODAPS).
Process Improvement Tools, Commitment to Change Lead to Serious Turnaround.
Birznieks, Derek; Zane, Richard
2017-05-01
The ED at the University of Colorado Hospital (UCH) has undergone a dramatic transformation in recent years, doubling in size while also using process improvement methods to dramatically reduce wait times, eliminate ambulance diversion, and boost patient satisfaction. Throughout this period, volume has continued to increase while the cost per patient and avoidable hospital admissions have experienced steady declines. Guiding the effort has been a series of core principles, with a particular focus on making sure that all processes are patient-centered. To begin the improvement effort, ED leaders established a leadership team and hired a process improvement chief with no previous experience in healthcare to provide a fresh, outside perspective on processes. In addition to mandating that all processes be patient-centered, the other guiding principles included a commitment to use and track data, to speak with one voice, to value everyone's perspective, to deliver high-quality care to all patients, and to set a standard for other academic medical centers. To get points on the board early and win approval from staff, one of the first changes administrators implemented was to hire scribes for every physician so they wouldn't be bogged down with data input; the approach has essentially paid for itself. Among the biggest changes was the elimination of triage, a process that improvement teams found no longer added value or quality to the patient experience. Leadership also has moved to equilibrate the size and staffing of the various zones in the ED so that they are more generic and less specialized. The move has facilitated patient flow, enabling patients in zones with resuscitation bays to connect with providers quickly.
NASA Technical Reports Server (NTRS)
Papthakos, L. C.; Briehl, D.
1981-01-01
This is the twelfth in a series of reports describing the data currently available from GASP, including flight routes and dates, instrumentation, data processing procedures, and data tape specifications. In-situ measurements of atmospheric ozone, cabin ozone, carbon monoxide, water vapor, particles, clouds, condensation nuclei, filter samples and related meteorological and flight information obtained during 1732 flights of aircraft N533PA, N4711U, N655PA, and VH-EBE from January 5, 1978 through October 9, 1978 are reported. These data are now available from the National Climatic Center, Asheville, NC, 22801. In addition to the GASP data, tropopause pressures obtained from time and space interpolation of National Meteorological Center archived data for the dates of the flights are included.